Safiya Umoja Noble
I started the book several years ago by doing collective searches on keywords around different community identities. I did searches on “black girls,” “Asian girls,” and “Latina girls” online and found that pornography was the primary way they were represented on the first page of search results.
That doesn’t seem to be a very fair or credible representation of women of color in the United States. It reduces them to sexualized objects.
Google has suppressed a lot of that porn, in part because we’ve been speaking out about this for six or seven years. But if you go to Google today and search for “Asian girls” or “Latina girls,” you’ll still find the hypersexualized content.
I think what you see there is the gaze of people of color looking at white women and girls and naming whiteness as an identity, which is something that you don’t typically see white women doing themselves.
These search algorithms aren’t merely selecting what information we’re exposed to; they’re cementing assumptions about what information is worth knowing in the first place. That might be the most insidious part of this.
Safiya Umoja Noble
There is a dominant male, Western-centric point of view that gets encoded into the organization of information. You have to remember that an algorithm is just an automated decision tree: if certain keywords are present, then a series of assumptions gets made about which of the trillions of pages on the web to point to.
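The "automated decision tree" idea can be sketched in a few lines. This is a hypothetical illustration, not Google's actual ranking code; the scoring rules, field names, and example pages are all invented. The point is that every rule in a ranker like this encodes an assumption its author made about what relevance means.

```python
# A minimal, hypothetical sketch of keyword-driven ranking.
# Each rule encodes an assumption made by whoever wrote it.

def rank_pages(query, pages):
    """Score pages by crude keyword overlap plus popularity."""
    terms = set(query.lower().split())
    scored = []
    for page in pages:
        words = set(page["text"].lower().split())
        overlap = len(terms & words)                  # assumption: word overlap == relevance
        score = overlap + page.get("popularity", 0)   # assumption: popular == worth surfacing
        scored.append((score, page["url"]))
    return [url for score, url in sorted(scored, reverse=True)]

results = rank_pages(
    "black girls",
    [
        {"url": "a.example", "text": "community resources for black girls", "popularity": 1},
        {"url": "b.example", "text": "black girls", "popularity": 5},
    ],
)
print(results)  # the more "popular" page outranks the more informative one
```

Even in this toy version, the page with the bigger popularity signal wins regardless of what it actually contains, which is the kind of encoded judgment the text is describing.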
And those decisions always correlate to the relationship of advertisers to the platform. Google has a huge empire called AdWords, and people bid in a real-time auction to optimize their content.
That model — of information going to the highest bidder — will always privilege people who have the most resources. And that means that people who don’t have a lot of resources, like children, will never be able to fully control the ways in which they’re represented, given the logic and mechanisms of how search engines work.
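The "highest bidder" dynamic can be made concrete with a toy auction. This is a deliberately simplified sketch, not how Google's ad auction actually works (real keyword auctions, such as AdWords, also weigh ad quality and use second-price rules); the bidders and amounts are invented.

```python
# A toy, hypothetical "highest bidder wins placement" auction.
# Real ad auctions are far more complex; this isolates one dynamic:
# placement follows resources.

def run_auction(bids):
    """Order bidders by bid amount, highest first."""
    return sorted(bids, key=lambda b: b["bid"], reverse=True)

slots = run_auction([
    {"bidder": "small-nonprofit.example", "bid": 0.10},
    {"bidder": "big-brand.example", "bid": 4.50},
    {"bidder": "individual.example", "bid": 0.02},
])
print([s["bidder"] for s in slots])
```

Under this logic, the deepest pockets always take the top slot, and parties with no resources at all, like the children mentioned above, never control their own placement.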
In the book, you talk about how racist websites gamed search engines to control the narrative around Martin Luther King Jr. so that if you searched for MLK, you’d find links to white supremacist propaganda.
You also talk about the stakes involved here, and point to Dylann Roof as an example.
Safiya Umoja Noble
In his manifesto, Dylann Roof has a diatribe against people of color, and he says that the first event that truly awakened him was the Trayvon Martin story. He says he went to Google and did a search on “black-on-white crime.” Now, most of us know that black-on-white crime is not an American epidemic — that, in fact, most crime happens within a community. But that’s a separate discussion.
So Roof goes to Google and puts in a white nationalist red herring (“black-on-white crime”). And of course, it immediately takes him to white supremacist websites, which in turn take him down a racist rabbit hole of conspiracy and misinformation. Often, these racist websites are designed to appear credible and benign, in part because that helps them game the algorithms, but also because it convinces a lot of people that the information is truthful.
This is how Roof gets radicalized. He says he learns about the “true history of America,” and about the “race problem” and the “Jewish problem.” He learns that everything he’s ever been taught in school is a lie. And then he says, in his own words, that this makes him research more and more, which we can only imagine is online, and this leads to his “racial awareness.”
And now we know that shortly thereafter, he steps into the “Mother” Emanuel AME Church in Charleston, South Carolina, and murders nine African-American worshippers in cold blood, in order to start a race war.
So the ideas that people are encountering online really matter. It matters that Dylann Roof didn’t see the FBI statistics that tell the truth about how crime works in America. It matters that he didn’t get any counterpoints. It matters that people like him are pushed in these directions without resistance or context.
People who are a numerical minority in society will never be able to use this kind of “majority rules” logic to their benefit. The majority will always be able to control the notions of what’s important, or what’s important to click on, and that’s not how the information landscape ought to work.
The platform exists because it’s made by people. It didn’t come down from an alien spacecraft. It’s made by human beings, and the people who make it are biased, and they code their biases into search. How can those biases not inform their judgment?
So it’s disingenuous to suggest that the platform just exists unto itself, and that the only people who can manipulate it or influence it are the people who use it, when actually, the makers of the platform bear primary responsibility. A platform has makers as well as users, and the makers have to take responsibility for their creations.