Big Data and WMDs – A Cautionary Tale

Cathy O’Neil has penned perhaps my favorite book title of the past few years – Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Better still, her work is substantive and worthy of its clever name. Read it and you will never think of “systems” and data in quite the same way.

O’Neil is a serious mathematician, earning a Ph.D. from Harvard, where she wrote about arithmetic algebraic geometry. After working in academia, she moved to Wall Street as a “quant.” The position was lucrative but did not align with her values or interests. O’Neil started writing – she has a blog – and has emerged as a trenchant critic of big data analysis, one whose writing is accessible and understandable to a lay public. In Weapons of Math Destruction, she develops that critique persuasively and at length.

O’Neil’s book explores one misuse of data analysis after another. One of her key targets, unsurprisingly, is how WMDs (Weapons of Math Destruction) can wreak havoc in education. I believe that education is especially sensitive to big data modeling. It is extraordinarily challenging to demonstrate the rigor, consistency, and value added of an assignment, a course, a program, a degree, or an institution. These are recurring and frustratingly elusive goals. Quantification offers a solution, but it brings with it an array of responsibilities and costs.

O’Neil details the misguided attempt in Washington, D.C., to hold teachers accountable for students’ progress. A data consultancy developed a rating system for teaching quality that lacked any effective process for adjusting the data or the underlying algorithm. The system defined a particular reality and simply could not address data that did not fit within that modeled worldview. Once the system identified teachers as failures, the school system fired them.

One of those “failures,” a teacher with a good track record, stubbornly pushed to understand what had happened to her. It took great persistence because no one in the school system could explain the algorithm. The teacher looked closely at the data and realized that several of her students had performed surprisingly well in prior years. However, these same students had difficulty reading in her class. After eliminating other possible causes, the teacher determined that there had been cheating or some form of grade alteration in students’ records in an earlier year. The teacher was disadvantaged because of inaccuracies – cheating – in the system. Unfortunately, there was no remedy. It mattered little that she had done a good job educating her students over the years. The system could not accommodate inaccurate data. Administrators found themselves bound to be consistent. The teacher was let go even though she was the honest professional. Luckily for her, she found a better-paying job teaching in a suburb. The algorithm and rules remained in place.

O’Neil argues convincingly that many of the data analysts who craft models do not understand, or give much thought to, the behavior they are modeling. Along similar lines, those who implement and use a system often do not understand how or why it works. It is rare for someone “downstream” in an organization committed to a big data analytical model to know or question the system’s underlying assumptions.

Data models, O’Neil explains, reflect goals and ideology – whether planned or not. Data models are designed to do something. Without critical thinking and careful analysis, there is no inherent accuracy in a model. Models have to be managed, adjusted, and re-adjusted to obtain accuracy. On their own, they can be helpful or destructive. O’Neil makes a compelling argument that racism can function like a predictive model: it seeks out data and outcomes that support its underlying unethical aim.

Another model that O’Neil critiques is found in the criminal justice system. Recidivism models are used to estimate the likelihood that a prisoner will commit a future crime, and they guide parole, sentencing, and other decisions. One would hope that recidivism models would guarantee greater consistency and fairness in sentencing. Instead, the algorithms in these systems rely on a host of assumptions, make preliminary judgments, and increasingly recommend steps to separate those judged likely to commit crimes from those judged unlikely to. We all know that prisoners of color or those without wealth are more likely to hail from communities with higher crime rates. The model’s feedback loop labels poor prisoners of color “high risk.” It keeps those individuals in situations and environments, like prison, that increase the probability that they will have future problems with the law.
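To see how such a feedback loop feeds on itself, here is a minimal sketch in Python. Everything in it – the weights, the threshold, the variable names – is invented for illustration; it is not any actual recidivism model, only a toy version of the dynamic O’Neil describes.

# Toy sketch of a self-reinforcing "risk" feedback loop.
# All weights and numbers below are hypothetical.

def risk_score(neighborhood_crime_rate: float, prior_contacts: int) -> float:
    # The score is driven by proxies (where someone lives, past
    # contact with the system), not by recent personal conduct.
    return 0.5 * neighborhood_crime_rate + 0.1 * prior_contacts

crime_rate = 0.6   # community-level statistic, not individual behavior
contacts = 1       # prior contacts with the justice system

for year in range(5):
    score = risk_score(crime_rate, contacts)
    high_risk = score > 0.35
    print(f"Year {year}: score={score:.2f}, labeled high risk: {high_risk}")
    if high_risk:
        # A "high risk" label means denied parole or heavier policing,
        # which produces more recorded contacts, which raises next
        # year's score.
        contacts += 1

Nothing in the loop measures what the person actually did; the proxies do all the work, and each “high risk” label manufactures the data that justifies the next one.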

If we want good systems, O’Neil stresses, they have to be transparent and continuously updated, and everyone involved in or affected by the model should understand what is being measured and modeled. Those who design the systems must be attentive to problems of scale. WMDs are not transparent. In fact, most companies and organizations that use these large systems go to great lengths to hide their data analytics as intellectual property. Joining opacity and scaling, the third problem O’Neil identifies is damage. These systems can cause all manner of problems for many people.

Her chapter on US News and World Report’s ranking of colleges and universities is particularly relevant to higher education. When US News started its ranking, it relied solely on the opinions of university presidents. The ranking did not generate much attention and the magazine received complaints about its accuracy. In response, editors started building models based on what they could measure. By default, these factors served as proxies for academic quality. Factors such as SAT scores, acceptance rates, graduation rates, alumni giving, and a host of other data points were added to the mix. The ranking became more “scientific” and received more attention. A feedback loop quickly emerged as colleges started to manipulate their data to conform to the magazine’s algorithm. The rankings became self-reinforcing. They have become an unexpected menace to higher education, O’Neil writes. When proxies are used in a model, those affected will work to game or manipulate the model.
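The gaming dynamic is easy to see in miniature. Below is a toy Python sketch; the weights and figures are invented and are not the magazine’s actual formula. Two hypothetical schools of identical quality start with the same score, and one climbs simply by manipulating a proxy.

# Toy illustration of proxy gaming in a ranking. Weights and data
# are hypothetical, not the actual US News methodology.

def ranking_score(sat: float, accept_rate: float, grad_rate: float) -> float:
    # A lower acceptance rate is rewarded, so it enters inverted.
    return 0.4 * sat + 0.3 * (1 - accept_rate) + 0.3 * grad_rate

# Two schools with identical underlying quality.
honest = {"sat": 0.70, "accept_rate": 0.50, "grad_rate": 0.80}
gamer = dict(honest)

# The second school games a proxy without improving education,
# e.g., by soliciting applications it intends to reject.
gamer["accept_rate"] = 0.25

print("honest:", round(ranking_score(**honest), 3))   # 0.67
print("gamer: ", round(ranking_score(**gamer), 3))    # 0.745

Once the formula is known, every proxy in it becomes a target. Nothing about teaching or learning changed, yet the ranking moved.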

Underlying the use of big data analytics are competing tendencies, O’Neil explains. Sometimes we seek fairness. Other times we want efficiency. WMDs tend to promote efficiency. Fairness requires thought, exceptions, and a willingness to be flexible. Facebook “friends” are not the same as real-life friends. A statistical model that predicts the likelihood of someone committing a crime should not have the same impact as arresting a person for committing a crime.

O’Neil also looks at personality tests, particularly those used to help large corporations make hiring decisions, and highlights the unfairness that can result from these large HR systems. The same unfairness extends to financial measures, like FICO scores, which are used to assess a borrower’s creditworthiness even though researchers have shown that better and more equitable models exist. O’Neil is greatly concerned about the ways that WMDs have insinuated themselves into advertising, social media, and a host of targeted marketing campaigns. All of these, she argues, undermine our collective social sense and responsibilities. They disadvantage the poor, the weak, and minorities. They reduce human agency. Popping up through the narrative are the angry and frustrated voices of people whose lives have been harmed by a WMD. “The computer tells me to ___________” is a poor excuse for a decision with consequences.

O’Neil makes a compelling case that WMDs lead to inequality and unfairness, and make us less human and humane. Big data and analytics are powerful tools – and like any other large and complicated machine, they should not be used without training and supervision.

David Potash
