Description
"A former Wall Street quantitative analyst sounds an alarm on mathematical modeling, a pervasive new force in society that threatens to undermine democracy and widen inequality,"--NoveList. "We live in the age of the algorithm. Increasingly, the decisions that affect our lives-- where we go to school, whether we get a car loan, how much we pay for health insurance--are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: Everyone is judged according to the same rules, and bias is eliminated. But as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination: If a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he's then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a 'toxic cocktail for democracy.' Welcome to the dark side of Big Data. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These 'weapons of math destruction' score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set parole, and monitor our health. O'Neil calls on modelers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change."--Dust jacket.… (more)
User reviews
We model everything now. Teacher evaluations, job applicants, credit applications, online purchasing, voting patterns, crime – pretty much anything you can think of is modeled in some opaque black box of unaccountable algorithms. And they are inherently faulty.
Something as simple as a zip code can tell a system what kind of neighborhood you live in, and make assumptions. Search history, social media activity, purchase records – all contribute to an instant decision that you are worthy or not. These values are plugged into school applications, job applications, and personal evaluations such as HR records, personality tests, and even dating sites. Even purged, forgiven, and expired details remain active. Police model neighborhoods. They harass residents for every little thing in poorer neighborhoods, while giving a free pass to wealthier ones, where the crimes are far bigger but mostly white collar. Only ten states have outlawed the use of credit checks on job applications. Cardholders who shopped at downscale stores had their credit limits slashed, making them poorer and making them poorer credit risks – which means higher interest rates. It is computer models that schedule shifts, without concern for the employee's needs in terms of child care, time off between shifts, or advance notice. Managers are paid to optimize revenue per hour worked, so memos from above go unheeded.
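The zip-code mechanism the reviewer describes is easy to demonstrate. Here is a minimal, hypothetical sketch (the zip codes, weights, and default rates are all invented for illustration) of how a lending score that never looks at race or wealth directly can still penalize two identical applicants differently, purely by neighborhood:

```python
# Hypothetical illustration: a "neutral" lending score that uses zip code
# as a feature. All data and weights below are invented for this sketch.

# Historical default rate per zip code (made-up numbers).
ZIP_RISK = {"10001": 0.02, "60637": 0.12}

def loan_score(income: float, zip_code: str) -> float:
    """Higher is better. The model never sees race or wealth directly,
    but the zip-code term imports the neighborhood's history wholesale."""
    base = min(income / 100_000, 1.0)               # income contributes up to 1.0
    neighborhood_penalty = ZIP_RISK.get(zip_code, 0.05) * 5
    return base - neighborhood_penalty

# Two applicants with identical incomes get different scores
# solely because of where they live.
a = loan_score(50_000, "10001")   # wealthier zip -> 0.4
b = loan_score(50_000, "60637")   # poorer zip -> about -0.1
print(a > b)                      # True: same person, worse neighborhood, worse score
```

The applicant's own conduct never enters the penalty term; the neighborhood's past does, which is exactly the proxy problem the review describes.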
That models are often incorrect, badly designed, misinformed and misconstrued, means that people are denied service, or not hired, or outright fired. But there’s always someone else behind them, so it’s just the cost of doing business. “Unfairness is the black stuff belching out of the smokestacks. It’s an emission, a toxic one,” O’Neil says. We are all just collateral damage.
One insurance company instantly evaluates whether a customer is likely to shop around. If it judges not, it charges them more. It actually has 100,000 microsegments (buckets) depending on instant customer scores. In Florida, a driver with a clean record but a poor credit score pays $1,552 more for insurance than a driver with a high credit score and a drunk-driving conviction. Shopping sites won't offer you a discount if you are already logged in. Payday loans and for-profit schools prey on the disadvantaged and the desperate, extracting billions from them. The games are endless.
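The insurance example turns on "price optimization": the quote reflects not the customer's risk but a prediction of whether they will comparison-shop. A minimal sketch, with an invented threshold and markup, of how such a bucket-based quote might work:

```python
# Hypothetical sketch of price optimization: the quote depends on a
# predicted probability that the customer will shop around, not on risk.
# The 0.3 threshold and 25% markup are invented for illustration.

def quote(base_premium: float, p_shops_around: float) -> float:
    """Customers the model judges unlikely to compare prices get a markup."""
    if p_shops_around < 0.3:          # model says: this customer won't leave
        return base_premium * 1.25
    return base_premium

loyal = quote(1000.0, 0.1)    # predicted loyal -> 1250.0
shopper = quote(1000.0, 0.8)  # predicted comparison-shopper -> 1000.0
```

A real insurer's version would have 100,000 buckets instead of two, but the structure is the same: the less likely you are to leave, the more you pay.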
WMD is extremely fast-paced, fact-packed, and depressing. It has come to the point that machines dictate who may have a successful life, right out of the gate. Initiative, courage, creativity, drive, human kindness – they don't enter into it. We are all typecast by Big Data – assigned values mathematically that can stymie a life. There is no appeal. There isn't even any knowing. The poor get poorer. The rich find the new era refreshing.
And of course, none of this is transparent. Customers cannot arrive at these prices, these decisions or these scores themselves. It’s all in the math, manipulating us. And yet, 73% of Americans believe search engine results are “accurate and impartial”. 62% believe Facebook posts their submissions to everyone. Nothing could be farther from the truth. Worse, data banks draw on each other, multiplying their errors, sometimes creating completely false profiles of a person, who then cannot get a job, rent an apartment or buy a car.
O’Neil says she is outraged by her own profession. You will be too.
David Wineberg
In particular, as the author points out in many convincing ways, applications of Big Data punish the poor.
The author calls the mathematical models employing Big Data and used to such harmful effect “Weapons of Math Destruction” or WMDs.
In WMDs, she explains, "poisonous assumptions . . . camouflaged by math go largely untested and unquestioned." They create their own toxic feedback loops and, to an extent that shocked me, guide decisions in a wide variety of areas, from advertising to prisons to healthcare to hiring and firing. Most importantly, because they rely on esoteric mathematical models (no matter that these are often built on toxic, biased, or erroneous assumptions):
“They’re opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or ‘optimize’ millions of people.”
The goal is always profit, but what is lost is fairness, the recognition of individual exceptions, and simple compassion and humanity. The author demonstrates conclusively how the uses to which Big Data is put add to the growing dystopia and the inequality gap.
I am not at all versed in math, but the author manages to explain how all this works without requiring that one understand specific algorithms. She provides specific examples from the worlds of teacher evaluations, hiring decisions generally, advertising, insurance, policing, college admissions, lending and credit evaluation, and political targeting.
One of the saddest chapters (and they are all sad, unfortunately) is about the many for-profit universities (Trump University comes to mind) that specifically target people in great need, selling them overpriced promises of success. Her quotes from the marketing materials of these places are horrifying. They look for individuals who are "isolated," with "low self-esteem," who have "few people in their lives who care about them" and feel "stuck." She shows how they use Google searches, residential data, and Facebook posts, inter alia, to find "the most desperate among us at enormous scale":
“In education, they promise what’s usually a false road to prosperity, while also calculating how to maximize the dollars they draw from each prospect. Their operations cause immense and nefarious feedback loops and leave their customers buried under mountains of debt.”
The chapter on the way the “stop and frisk” policing operates is also very depressing; and in truth we have seen the tragic results in city after city.
The fact is, the whole book is rather a downer, albeit an important one. Although O’Neil cites a few programs that have used Big Data to help people rather than to enrich a few and oppress the rest, can one really think that “moral imagination” can take precedence over prejudice and greed? Personally, I’m not so sure. The author provides ideas about how to change (and importantly, regulate) uses of Big Data, but she is more optimistic than I am, ending on a positive note:
“We must come together to police these WMDs, to tame and disarm them. My hope is that they’ll be remembered, like the deadly coal mines of a century ago, as relics of the early days of this new revolution, before we learned how to bring fairness and accountability to the age of data. Math deserves much better than WMDs, and democracy does too.”
Evaluation: I hope this important book gets a lot of attention. On privacy concerns, my husband always argues: why should we care if we've done nothing wrong? This book shows how, astoundingly, even that isn't enough to stop Big Data from hurting us in many aspects of our lives. It is a critical lesson for today's world, and the world of our children.
This book is important, and I think it should be read by anyone concerned about how Big Data can be used to harm us all. As someone whose future career depends upon algorithmic learning, statistics, and mathematics, I can say this book was eye opening. I'm used to hearing about the power of algorithms and modeling, but really, a model is not the thing that it models (as every mathematician knows).
This book is a lot more accessible than Derman's Models.Behaving.Badly, even if it is in the same vein. It has a much clearer focus, and it very clearly explains the traps mathematical modeling has created. I highly recommend this book to everyone. It doesn't require an understanding of math (there are no models or equations in this book). Just an understanding of how algorithms can contain bias through the use of proxies. Read it and share it.
Given her deep and varied background in data science, O'Neil has the expertise to address the algorithms at a constructive level; however, the reader may be left a bit wanting for a deeper explanation of exactly how the algorithms in question are flawed. I think readers will find this either relieving or frustrating. As a scientist, there were times when I wanted a deeper explanation of the nature of a particular algorithm's flaw, but this is a book of social commentary at its heart, and many readers will welcome the absence of the particulars. Due in part to O'Neil's clear writing, you do not need any mathematical background to read and understand the book. Essentially, O'Neil argues that people are not pieces of data, and if they are treated as such (as they increasingly will be), there will be serious human consequences. While I don't agree with every argument, it is an important book and one I am glad to have read, and I recommend it.
Highly recommended!
"Data over people" is the message of this book, the author puts forth many examples of how we've truly lost touch with reality. Even though these data models are known to be wrong, there is no feedback loop or agenda to fix them. We accept everything and question nothing, as a result our future looks bleak and terrible. Individuals that are out-of-touch with the world will be dictating the path the world goes.
O'Neil spends most of the book going into detail about many different models and their unfairness. Part of the problem is that in our society's effort to create models that predict behavior, we use proxies so that we can manipulate the numbers, but proxies can never represent the real thing, because life is complicated and cannot be boiled down to a set of numbers. Often these proxies misrepresent the truth, and in the case of most of these models, the ones who suffer are the poor, because the data that can be gathered on them doesn't read well: credit scores, zip codes, education, and so on. None of these data points can tell you whether someone is trying to put their life back together after circumstances have been cruel, and so a large population of underserved people is now further underserved by software. As she says on page 204, "Big Data processes codify the past. They do not invent the future." When an algorithm reinforces the past, it reinforces everything that comes along with it, including the racism and the classism, and spits that back out.
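The feedback loop behind "Big Data processes codify the past" can be sketched in a few lines. In this invented, deterministic simulation, patrols are allocated in proportion to past recorded arrests; more patrols produce more recorded arrests, so an initially small gap between two neighborhoods with identical true crime rates keeps widening:

```python
# Hypothetical simulation of a predictive-policing feedback loop.
# All numbers are invented; the point is the self-reinforcing structure.

arrests = {"A": 10.0, "B": 12.0}   # historical arrest counts (the "training data")
TRUE_RATE = 0.5                    # true crime rate, identical in both neighborhoods
PATROLS_PER_YEAR = 100

for year in range(5):
    # The model allocates patrols in proportion to past recorded arrests.
    total = sum(arrests.values())
    patrols = {n: PATROLS_PER_YEAR * c / total for n, c in arrests.items()}
    # More patrols -> more crimes observed, even though true rates are equal.
    for n in arrests:
        arrests[n] += patrols[n] * TRUE_RATE

# The absolute gap between A and B grows every year, "confirming"
# the model's belief that B is the more criminal neighborhood.
print(arrests["B"] - arrests["A"])
```

Nothing in the loop ever measures the true crime rate; the model only ever sees its own output, which is what makes the stereotype self-sustaining.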
O'Neil only stops to tell us how to fix this problem in the conclusion. After promising for the whole book to talk about fixes, it's only in the last twenty pages that she gets down to her proposed solutions. While her solutions seem good, they don't seem as well researched as the rest of the book, and I felt rushed from one solution to the next. The only solution that seemed like it might actually work was the one the EU implemented, if only because it has actually been done before. If O'Neil had spent even a fourth of the book researching and writing about her solutions, it would be a much better book; instead it feels a little one-sided, without much of an idea of how to change things and move forward, much like some of the models she discusses.
This review was written for LibraryThing Early Reviewers.
She calls for careful ethical consideration and assessment of these toxic algorithms, but I was left feeling that the political will to regulate these tools is unlikely to be there for us, especially as these tools are exceptionally helpful for businesses to maximize their profits at the expense of employees and customers alike. Add in her observations that many of the worst offenders either actively help the richest Americans or can be avoided by the application of a bit of money, and it's hard to see any hope of change until such time as some sufficiently horrible event occurs that affects a broad enough range of people to force change. (Just kidding, obviously, as the 2008 financial crisis would have seemed like just the sort of thing to make the government punish offenders and regulate the financial industry, but that certainly didn't happen.)
As much commentary as explanation, O'Neil is particularly interested in how these algorithms create and sustain feedback loops which perpetuate the very stereotypes and discriminatory practices they were meant to alleviate.
Weapons of Math Destruction does not require an understanding of advanced math, and O'Neil does a good job of explaining the underlying principles without relying on jargon. I would recommend it to anyone interested in how technological systems are playing an increasing -- and invisible -- role in shaping our society.
N.B.: I received a free copy of this book from LibraryThing's Early Reviewer program.
For me, the book is reminiscent of Freakonomics. It covers thought-provoking, seemingly hidden aspects of everyday life.
Glad someone is highlighting these important issues, for the economy, for math, for science, and for people's livelihoods.
I received this book from the Blogging for Books program in exchange for this review.
DDC/MDS: 005.7