Description
A revealing look at how negative biases against women of color are embedded in search engine results and algorithms.

Run a Google search for "black girls": what will you find? "Big Booty" and other sexually explicit terms are likely to come up as top search terms. But if you type in "white girls," the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society.

In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.

Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance, operating as a source for email, a major vehicle for primary and secondary school learning, and beyond, understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.

An original, surprising, and at times disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
User reviews
The internet is not the neutral, unbiased warehouse of all things. Search, for example, is loaded with ingrained prejudices from our culture and history. Safiya Noble, a black feminist, was incensed when she searched "black girls" on Google and came up with nothing but porn.
If it weren’t bad enough that only porn resulted, look at Google’s autocomplete suggestions for a search on “women”:
-Women cannot: drive, be bishops, be trusted, speak in church
-Women should not: have rights, work, vote, box
-Women should: stay at home, be slaves, be in the kitchen, not speak in church
-Women need to: be put in their places, know their place, be controlled, be disciplined
The result is Algorithms of Oppression, a six-year project to determine the extent of this poison, how it came to be, and what should be done. Noble found western society itself at the heart of it.
Google, incredibly, denies any responsibility. It says its algorithm operates on its own and can't be trained. This is what we call a lie, as Google has managed to abide by all kinds of European directives against hate, against Nazi products for sale, and for the right to be forgotten. And magically, the "black girls" search results have been evolving too, as Noble shows in her many screenshots.
We like to believe that what rises to the top in search is whatever is most popular and most relevant. But we fool ourselves. There are classification systems at work, and Noble says blacks have been "contained and constrained" by them. The search "beautiful" results in an endless page of photos of white women. Not Starry Night, Niagara Falls, or the Taj Mahal, but white women. A search for "professor" brings photos of only white men. And a search for "unprofessional hairstyles for work" shows only women of color. As you might guess, "professional hairstyles for work" shows only white women.
And it’s not as if Google has customized the results according to Noble’s search history. She has spent several years using Google in her pursuit of a doctorate in black feminist studies. And this is how Google profiles her.
Basically, Google’s search algorithm represents the white male view of the world, she says, and brings up results to fulfill that need. Black community or society is simply not part of the equation, and therefore not part of the algorithm. Same goes for women.
Noble has a chapter on libraries, because librarians classify everything. They must, of course, in order for anyone to do any sort of in-depth research. Yet the very act of classification is discriminatory. Irish Catholic, Korean American, black feminist: all are problems looking for homes. Everyone becomes an "objectified symbol" to someone else. Leo Buscaglia spent his life ranting against this. Because of these labels, we think we know something about a person, he used to say, but we don't at all. This same built-in bias shows up in online search. It is not in any way neutral.
As exhaustive as she has tried to be, Noble made no effort to test the obvious workarounds. Her screenshots do not show results with Google's Family Filter on, and she never tried searching with negative terms to exclude the sex listings ("black girls -sex"). It's almost certainly true that most people can't be bothered with either of these tactics, but Noble should have included their results.
Unfortunately, she concludes that Google search should be federally regulated, despite her entire book demonstrating the embedded, if not innate, bias throughout every aspect of western society. It's not an especially hopeful ending, and it really just skirts the core issue.
We are nowhere near being postracial.
David Wineberg
In her conclusion Noble tells us that she wrote an article about these observations in 2012 for a national women's magazine, B*tch, and within six weeks the Google search for "black girls" turned up an entire page of results like "Black Girls Code," "Black Girls Rock," and "7-Year-Old Writes Book to Show Black Girls They Are Princesses." While Noble declines to take credit for these changes, she continued her research into the way non-white communities are sidelined in the digital universe.
We must keep several things in mind at once if the digital environment is to work for all of us. We must recognize the way the digital universe reflects and perpetuates the white male patriarchy from which it was developed. In order for the internet to live up to the promise of allowing unheard and disenfranchised populations some voice and access to information they can use to enhance their world, we must monitor the creation and use of the algorithms that control the processes by which we add to and search the internet. This is one reason it is so critical to have diversity in tech. Below find just a few of Noble's more salient points:
※ We are the product that Google sells to advertisers.
※ The digital interface is a material reality structuring a discourse, embedded with historical relations... Search does not merely present pages but structures knowledge...
※ Google & other search engines have been enlisted to make decisions about the proper balance between personal privacy and access to information. The vast majority of these decisions face no public scrutiny, though they shape public discourse.
※ Those who have the power to design systems--classification or technical [like library, museum, & information professionals]--hold the ability to prioritize hierarchical schemes that privilege certain types of information over others.
※ The search arena is consolidated under the control of only a few companies.
※ Algorithms that rank & prioritize for profits compromise our ability to engage with complicated ideas. There is no counterposition, nor is there a disclaimer or framework for contextualizing what we get.
※ Access to high quality information, from journalism to research, is essential to a healthy and viable democracy...In some cases, journalists are facing screens that deliver real-time analytics about the virality of their stories. Under these circumstances, journalists are encouraged to modify headlines and keywords within a news story to promote greater traction and sharing among readers.
The early e-version of this manuscript obtained through Netgalley had formatting and linking issues that were a hindrance to understanding. Noble writes here for an academic audience I presume, and as such her jargon and complicated sentences are appropriate for communicating the most precise information in the least space. However, for a general audience this book would be a slog, something not true if one listens to Noble (as in the attached TED talk linked below). Surely one of the best things this book offers is a collection of references to others who are working on these problems around the country.
The other best thing about this book is an affecting story Noble includes in the final pages of her Epilogue about Kandis, a long-established black hairdresser in a college town trying to keep her business going by registering online with the ratings site Yelp. Noble writes in the woman's voice, simply and forthrightly, without jargon, and the clarity and moral force of the story are so hard-hitting that it is worth picking up the book for this alone. At the very least I would recommend a TED talk on it, and I suggest placing the story closer to the front of the book in subsequent editions.
Basically, the story is as follows: Kandis's shop became an established business in the 1980s, before the fall-off of black scholars attending the university, "when the campus stopped admitting so many Blacks." To keep the smaller pool of students aware of her business, which provides an exclusive and necessary service, she spent many hours finding a way to have her shop come up when "black hair" was typed in as a search term within a specified radius of the school. The difficulties she experienced illustrate the algorithm problems clearly. "To be a Black woman and to need hair care can be an isolating experience. The quality of service I provide touches more than just the external part of someone. It's not just about their hair."

I do not want to get off the subject Noble has concentrated on with such eloquence in her treatise, but I can't resist noting that we are talking about black women's hair again… Readers of my reviews will know I am concerned that black women have experienced violence in their attitudes about their hair. If I am misinterpreting what I perceive to be hatred of something so integral to their beings, I would be happy to know it. If black hair were perceived instead as an extension of one's personality and sexuality, without the almost universal animus for it when undressed, I would not worry about this obsession as much. But I think we need also to work on making black women feel their hair is beautiful. Period.
By the time we get to Noble’s Epilogue, she has raised a huge number of discussion points and questions which grew from her legitimate concerns that Google Search seemed to perpetuate the status quo or service a select group rather than break new ground for enabling the previously disenfranchised. This is critically important, urgent, and complicated work and Noble has the energy and intellectual fortitude needed to work with others to address these issues. This book would be especially useful for those looking for an area in the digital arena to piggyback her work to try and make a difference.
The problem is: what to do? Noble complains that Google directs searches to conglomerate news sources, but on YouTube that doesn't happen and the results seem to be worse, leaning towards extremism and conspiracies, with a lot of racism. Past forms of information sorting were really bad too; Noble notes the racist history of the Dewey Decimal System and Library of Congress classifications (not just history, though more contested now). She also discusses how Dylann Roof was radicalized by reading online, starting from Wikipedia and going from there: searches for "black on white crime" lead to white supremacist sites rather than to neutral crime statistics that would reveal that most crime is intraracial. Could anything other than human moderation stop this pattern? I just don't know; Noble suggests developing public search engines so that corporate motivations wouldn't control the data collection/surveillance, but (1) they'd still confront the problems of dealing with a racist corpus, and (2) I'm not so hot on government surveillance either. Another suggestion is a black-friendly search engine, and there are some moves towards that, but I don't think that solves the problem for people who don't know to seek it out in the first place, or for people like Roof.
The last chapter of the book focuses on a small business owner who cares for black hair, and whose business was harmed by two neoliberal blows: a decrease in the number of African-American students because of anti-diversity policies, and the rise of Yelp, which represented an increased cost. Yelp would only give her prominence, and keep other hairdressers off her page, if she paid, even if the other places didn't specialize in black hair. It also presented particular difficulties for her reviews: she perceived that her customers were less likely than white people to use Yelp in the first place, so their reviews of her shop might be the only reviews those customers ever left, and were thus more likely to look fake to Yelp. "Black people don't 'check in' and let people know where they're at when they sit in my chair. They already feel like they are being hunted; they aren't going to tell The Man where they are. I have reviews from real clients that they put into a filter because it doesn't meet their requirements of how they think someone should review." Not that she was all that fond of all her customers; she also complained about people who came into her business to photograph the products she used, then order them online for less. Again, search engines aren't the only problem she's facing; it's a constellation of economic and social changes of which search engines are only a part, perhaps a minor part, though it's certainly worth pointing out that the small producers are the ones from whom wealth can still be extracted by larger companies like Yelp.
Noble unpacks the trouble with corporations that have no public accountability except to shareholders dominating our information landscape and, in particular, how problematic their systems are for women and people of color. The design of our most dominant information gateway poaches unpaid labor, imagines the world to be just like those who write the code to sell attention and ads, and gives us back a reflection of ourselves that is warped by jumbling information together without context. Its dominance means journalists now have to make their stories more sensational to be found in the din, and whole communities lose their connections as their own histories are crowded out. (There's an excellent interview that shows how Yelp has affected one black woman's business and how the system we may use casually to check out options actually demands constant payments from businesses to make their online profiles more visible while making networks of word-of-mouth less vital.)
This is a book librarians and anyone else who worries about the state of our information systems should check out. I’ll share a few of the many quotes I noted down to give you some flavor.
Algorithmic oppression is not just a glitch in the system but, rather, is fundamental to the operating system of the web (10).
Google’s enviable position as the monopoly leader in the provision of information has allowed its organization of information and customization to be driven by its economic imperatives and has influenced broad swathes of society to see it as the creator and keeper of information culture online, which I am arguing is another form of American imperialism that manifests itself as ‘gatekeeper’ on the web (86).
Algorithms are, and will continue to be, loaded with power (171).
In an epilogue written after Trump's election, Noble admits her solutions (strengthening social institutions that are unlikely to get anything but decreased budgets, and creating public options) aren't in the cards, but she believes we need to change our information systems fundamentally.
"Without public funding and adequate information policy that protects the rights to fair representation online, an escalation in the erosion of quality information to inform the public will continue . . . My hope is that the public will reclaim its institutions and direct our resources in service of a multiracial democracy. Now, more than ever, we need libraries, universities, schools, and information resources that will help bolster and further expand democracy for all, rather than shrink the landscape of participation along racial, religious, and gendered lines" (181, 186).
Still, there's some very good analysis here, on categorization and assumptions and the way systems that were created thoughtlessly can create and reinforce great harm. There's a lot to build on from here.
It's a scholarly work (for all of those limits). Readers will also like Bowker and Star's Sorting Things Out: Classification and Its Consequences.
I had hoped for some actual technical discussion of how page results are generated, and what aspects are under the control of Google and which are merely aggregating the preferences of actual users, but that never comes. It does not help that, for a book published in 2018, the bulk of her examples are ancient, often from 2011. The Internet, and Google with it, has changed drastically in the last seven years, but her discussion barely recognizes the fact. I suspect the first section of the book, with the very outdated examples, was the core of her 2012 dissertation, which has here been included with very little change or update. Later chapters attempt to bring in more recent issues, but they feel like quick glosses meant to fill out the need to expand the dissertation to book length, and lack the more serious consideration that went into the earlier section. In all honesty, she should have written an entirely new, current book rather than attempt to update the older material. She has the background to do that, but she is trapped by the confines of trying to publish the dissertation as is.
With more technical expertise, and data more contemporary to the date of publication, this could have been a more exciting contribution. Failing that, it is still interesting on the general points, if dusty and outmoded in the details.
The Publisher Says: Run a Google search for "black girls" - what will you find? "Big Booty" and other sexually explicit terms are likely to come up as top search terms. But, if you type in "white girls," the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society.
In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.
Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance - operating as a source for email, a major vehicle for primary and secondary school learning, and beyond - understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.
An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
I RECEIVED A DRC FROM THE PUBLISHER VIA EDELWEISS+. THANK YOU.
My Review: The world, as the Internet has shaped it, took a promise of information access and educational opportunity unparalleled in human history and screwed it up to the point it reinforces the evils and stupidities it could so easily have alleviated.
The problem, it transpires, is both blindness..."*I* am no racist, or a sexist! Why, some of my best friends..." is not new, nor is it uncommon in any society...and hubristic malevolence (Cambridge Analytica, for example). We're two decades into a giant, uncontrolled social experiment. Voices like Author Noble's remain notable for how rarely they gain prominence in the rarefied world of Congressional hearings and the European Union's creation of the GDPR.
The issues that Author Noble raises in this book need your attention. You, the searcher, are the product that Google and the other search engines are selling to earn their absurd, unconscionable, inadequately taxed profits. Every time you log on to the internet, Google knows...use other search engines, never click on any links, and Google still knows you're there. That's the Orwellian nightmare of it...like East Germany's Stasi, they're everywhere, in every website you visit. Unlike the Stasi, they are possessed of the capacity to quantify and analyze all the information you generate, and sell it to anyone who can use it. For you or against you, as long as the check clears, Google and its brethren couldn't care less.
(There are links to information sources in the blogged version of this review at Expendable Mudge Muses Aloud.)
Noble's thesis is that we port our biases into our algorithms.
I do feel this could've dug deeper: there were a lot of "my work demonstrates..." sentences and I expected qualitative data, but I recognize that's my own STEMy expectation. Noble does highlight further work that can build on this preliminary look at SEO and systemic biases (in particular, the influence of pornography on various web things like streaming and e-commerce), but I'm left wanting more.