Human rights data used the wrong way can be misleading

In a world of evidence-based policy and data-driven decision-making, it’s time for the human rights advocacy world to more fully embrace new methods. Of course, the construction, use, and reliance on quantitative indicators in human rights settings are all rife with danger. But the promise of using data to understand rights problems, their causes and solutions, is too great to pass up. The key—as I discussed with Todd Landman in a recent episode of The Rights Track—is critical engagement.

In this context, the Data Visualization for Human Rights project, based at the Center for Human Rights and Global Justice (CHRGJ), grew out of a desire to understand how human rights work could more effectively harness the power of visualizing data and data stories to advance rights fulfilment. Through a series of randomized user experiments, we have learned that data visualization is a powerful persuasive tool that needs to be carefully tailored to its audience and context. In one study, we found that readers who had a strongly negative opinion about a rights issue were less likely to find charts and graphs persuasive and more likely to be persuaded by rights data presented in a table. Those readers who came to the topic without strong opinions were, on the other hand, more persuaded by data displayed through graphs and charts.

These findings suggest that human rights advocates would be wise to analyze their various audiences and tailor their presentation of data accordingly. In another randomized user experiment we conducted, we found that it is strikingly easy to mislead readers when using deceptive visualization techniques. By inverting the y-axis, for instance, we could successfully reverse the message a reader perceives even when the data was reported accurately. Similarly, common techniques like starting the y-axis at a value other than zero can mislead the reader into seeing much greater values than the data support.
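
To make these axis tricks concrete, here is a minimal matplotlib sketch, using invented numbers rather than any data from our experiments, that contrasts an honest zero-based chart with a truncated-axis version and an inverted-axis version of the same figures.

```python
import matplotlib.pyplot as plt

# Hypothetical, illustrative figures only (not data from our experiments):
# reported violations per year in an imaginary country.
years = [2011, 2012, 2013, 2014]
violations = [102, 104, 107, 109]

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))

# 1. Honest chart: the y-axis starts at zero, so a roughly 7% rise looks modest.
axes[0].bar(years, violations, color="steelblue")
axes[0].set_ylim(0, 120)
axes[0].set_title("y-axis starts at 0")

# 2. Truncated y-axis: the same numbers now look like a dramatic surge.
axes[1].bar(years, violations, color="firebrick")
axes[1].set_ylim(100, 110)
axes[1].set_title("Truncated y-axis")

# 3. Inverted y-axis: larger values plot lower, so an increase reads as a decline.
axes[2].plot(years, violations, marker="o", color="darkorange")
axes[2].invert_yaxis()
axes[2].set_title("Inverted y-axis")

for ax in axes:
    ax.set_xticks(years)
    ax.set_xlabel("Year")
    ax.set_ylabel("Reported violations")

fig.tight_layout()
plt.show()
```

Plotted side by side, the identical numbers read as a modest change, a dramatic surge, or a decline.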


Image: geralt/Pixabay (Some rights reserved). Data is important for human rights, but advocates must tread carefully to avoid misleading people.

These findings are increasingly relevant for the human rights community, where the use of data visualization and other visual features in human rights communication and advocacy is growing. For a recent article, we reviewed all Human Rights Watch and Amnesty International reports published in 2006, 2010, and 2014 to classify visual features. We found an increase in the use of photographs, satellite imagery, maps, charts and graphs in these reports. Some of the visualizations were used to amplify traditional human rights findings, while others involved the use of new methods, such as geospatial analysis or quantitative surveys.

The research director at the Human Rights Data Analysis Group (HRDAG), Patrick Ball, explains in a recent episode of The Rights Track that quantitative analysts must be careful in their use of the incomplete and statistically biased data sets that rights groups create in the course of their work to document and denounce abuses. After all, these groups traditionally collect case information to halt abuses and seek accountability, not to identify trends or prevalence within a population. Using a technique called Multiple Systems Estimation (MSE), human rights statisticians can estimate the degree to which case reports overestimate—or quite often, underestimate—the prevalence and frequency of violations. Without such processing, the use of many datasets collected by advocates could yield misleading results.
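
The intuition behind MSE can be seen in its simplest two-list form, the Lincoln-Petersen estimator. The sketch below uses invented victim identifiers; real MSE work such as HRDAG's combines more than two lists and models the dependence between them far more carefully.

```python
# Two-list capture-recapture (Lincoln-Petersen), the simplest relative of MSE.
# The victim identifiers below are invented for illustration only.

list_a = {"v01", "v02", "v03", "v05", "v08", "v09", "v12", "v15"}          # e.g. NGO case files
list_b = {"v02", "v03", "v04", "v08", "v10", "v12", "v13", "v14", "v15"}   # e.g. press reports

n_a = len(list_a)          # victims documented by source A
n_b = len(list_b)          # victims documented by source B
m = len(list_a & list_b)   # victims appearing on both lists

# If the lists were independent random "captures" of the same population,
# then m / n_b approximates n_a / N, so the total N can be estimated as:
n_hat = n_a * n_b / m

documented = len(list_a | list_b)
print(f"Documented in at least one list: {documented}")
print(f"Estimated total victims:         {n_hat:.0f}")
print(f"Estimated undocumented:          {n_hat - documented:.0f}")
```

Even this toy version shows why raw counts can mislead: the two lists together document only part of the estimated total, and real applications must also resolve record linkage, that is, decide when two reports describe the same person, and test the independence assumption.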

This does not mean that human rights work should not embrace quantitative methods. Instead, it means that methods should only be used when they are well-suited to the objectives of the research and when data is valid, reliable, and either unbiased or capable of statistical adjustment. Some rights—or elements of rights—including many economic and social rights, are indeed amenable to quantitative study. Groups like the Center for Economic and Social Rights regularly use such quantitative data and visualizations in their work. At the other end of the spectrum, certain elements of civil and political rights such as the right to be free from torture could also be studied in the same way (for example, through data about the training of police officers and prison personnel). However, those data would be more distant from the underlying right, serving as proxy indicators instead of robust compliance measures.

Other elements of rights are almost never accurately assessed through numbers. Most governments that use torture do not admit to doing so, and they rarely keep records of those they subject to coercive techniques. (The US CIA’s torture program is an exception: the techniques used on its victims were documented in excruciating detail.) This is why researchers seeking data for human rights must analyze human rights reports carefully: because they draw on a variety of important but partial sources, such as victim testimony, case reports, hospital records, and information about complaints filed, these reports are necessarily incomplete and statistically biased.

Many human rights organizations are actively using or seeking opportunities to use methods beyond the usual qualitative and legal methods favored by the large, mainstream NGOs, but they may not have the in-house expertise needed to operationalize new methods. The Human Rights Methodology Lab, which I co-founded alongside Professor Sarah Knuckey of the Columbia Human Rights Institute and Amanda Klasing and Brian Root of Human Rights Watch, provides a formal space for activists and scholars to join forces on real-life research projects with advocacy objectives. Through the Lab, academic researchers and NGO staff come together at the early stages of a project to design a methodological approach that is both well suited to the researcher’s advocacy objectives and as rigorous as possible. Other important initiatives include networks like the AAAS On-Call Scientists network, organizations such as HRDAG and DataKind, university centers like Carnegie Mellon’s Center for Human Rights Science, and guides from Data & Society and The Engine Room.

Nikki Reisch, Ellie Happel, and I recently presented a Global Justice Clinic project to the Lab that examines the right to water in the context of the emerging gold mining sector in Haiti. Gold mining poses well-known risks to water, with the potential to deplete water sources and contaminate rivers and springs. Understanding and demonstrating these impacts with scientific evidence is an important part of advocacy for communities living in mining-affected areas. In partnership with the Kolektif Jistis Min, a collective of Haitian social movement groups, the Clinic is designing and implementing a household survey on the right to water. Built on a randomized sample, the survey will be the backbone of a community-owned, scientifically valid baseline study of water in an area slated for gold mining.

To contribute to evidence-based policy and decision-making, human rights workers need to access a broader set of tools for their research. Making choices about which methods are best suited to a given issue and context requires a sense of what different techniques can and cannot do. As we prepare students to enter the human rights field, we must ensure they have basic quantitative literacy and a sense of what different disciplines can contribute to our field. They must also have the ability to access expertise that crosses traditional boundaries while remaining committed to the advocacy objectives of those most affected by abuse.