The fine print: seeing beyond the hype in technology for human rights
With all the hype about new technologies for human rights, activists must think critically and strategically.
The challenges around technology implementation carry higher risks in the human rights sector than in other areas. Technology’s “fail fast” mantra isn’t appropriate here, nor are attempts to innovate, iterate and try again. The stakes are much higher, and reliability makes all the difference. Despite rapid changes in the technology environment, the core work of human rights documentation has not changed at the same rate. Violations must be documented, evidence must be gathered, and communities must be organised. All of this can happen with or without new technologies—how do human rights workers decide on a best course of action?
In 2016, the Oak Foundation supported us at The Engine Room to carry out a scoping study on the use of technology tools for human rights documentation. We spoke to human rights defenders (HRDs), technology tools providers, and intermediaries—people who play something of a bridging role between HRDs and their technology tools—to understand the challenges human rights documentation initiatives face when adopting new tools and practices.
The pressure to use these new technologies in human rights is very real. Funding applications detailing innovative new technologies garner more attention, and the possibilities are appealing. The hype around these tools often skips over the more difficult and less exciting realities, and for less tech-literate decision-makers, this misinformation can lead to strategic errors. These somewhat conflicting motivations mean that technologies are sometimes mentioned or adopted not because they are the most strategic or necessarily useful tools for the job, but due to external or uninformed internal pressures.
But due to the sensitivity of their work, human rights defenders need to be able to trust their tools. Documentation methodologies must be reliable and secure, and new, relatively untested technologies often are not. Software bugs are not unusual: updates sometimes break things, applications crash, or a new feature does not quite do what it was intended to do. For most of us, these bugs are usually little more than inconveniences. For human rights defenders working in precarious or risky situations, they can be devastating. Many of the human rights workers we spoke to told us of software updates that confused them, operating system upgrades that broke their tools, and the frustration of trying a new tool only to have it fail at a key moment. As a result, many still turn to “old” (reliable) technologies, such as paper and pen.
In addition, all of the people we interviewed said that the sustainability and lifespan of tools was a major concern. Here, a tool being open-source and well-documented can make all the difference. Being open-source doesn't guarantee sustainability, but if a tool is well-documented and developed with a community around it, other organisations or communities who care about the tool have the option to pick it up. Sometimes that doesn't happen, though: one land rights organisation, for example, invested time and resources in a tool to help them document land rights violations, only to discover that the tool was no longer supported due to a lack of available funds. Similarly, if a new tool is “owned” within an organisation by an individual who then leaves suddenly without documentation for someone else to pick up and learn from, that institutional learning can be lost. To counter this, the work of organisations whose main mission is to create, maintain and update open-source tools specifically for the human rights sector, like HURIDOCS, is ever more essential.
Due to these concerns and others, many people admitted to us that they preferred proprietary systems over open-source equivalents. Indeed, using open-source, self-hosted tools requires technical skills that small organisations often lack. Though adopting them can be a good opportunity to build a skillset, such tools are often resource-intensive to run and, perhaps more crucially, leave room for beginner mistakes that can introduce security vulnerabilities. But proprietary tools bring a third party (or more) into the equation. The terms and conditions of cloud-based services and many proprietary tools make it very difficult for human rights organisations, which collect sensitive data that must be protected, to understand where that data is going or who might be able to access it.
Using these services is also a political statement. Relying upon behemoth corporations for essential infrastructure introduces a structural vulnerability. Technology companies get acquired, motivations and strategic aims change, and the needs of human rights defenders are rarely (if ever) a priority. Tech companies can change or even discontinue core features or products without being accountable to their users. Corporations also have much closer relationships with governments than human rights organisations do, and for those organisations, governments may well be their main adversary.
The use of proprietary systems also means relinquishing long-term control over the possibilities available. Human rights organisations that use social media platforms to gather their information rely on algorithms to show them what they need to see. There’s no way of knowing how those algorithms change, or what information they are hiding or making more visible. Organisations who share information by uploading it to commercial platforms trust that the data will stay available to all of their key communities, when in reality it might not.
Multiple human rights defenders we spoke to had tales of being given ineffective technology advice for their situation by people unaware of their particular contexts. Human rights defenders in Pakistan, for example, have seen digital security trainers come to Pakistan and teach encryption software, totally unaware that encryption is illegal in the country. This lack of awareness can at best be a waste of resources, and at worst, put people in real danger.
As with any technology, it is important that the developers have a deep understanding of the needs of their users. In our research we found that some organisations have achieved this by hiring developers from the same region as their user base, like HURIDOCS. Other organisations have struggled to address the needs of their global human rights user base. The potential dissimilarities came up most clearly around the limited availability of tools and associated documentation in scripts other than the Latin script. That said, some tools providers have invested significant resources into providing at least initial versions of tools and documentation in Arabic and Cyrillic script, such as Martus and OpenEvsys.
Overall, it is essential for human rights workers to stay critical and see past the hype. Though a certain tool might seem like the easiest option now, what about in two or five years’ time? What will you want to do with the data, and who owns it? Communities like the Responsible Data Forum provide easy-to-access resources on critical aspects of using technology via their mailing list.
Sometimes short-term and long-term priorities will clash, and it’s critical that activists understand that no technology tool is going to “solve” any particular human rights issue. When it comes down to it, implementing and using technology successfully and strategically in this field is much trickier than it seems.
***This article is based upon work done for The Engine Room's Technology Tools for Human Rights report, released in November 2016, available in full here, and for which interviews were carried out by Nisha Thompson and Tom Walker, with assistance from Kara Kaminski-Killany. The report was financially made possible thanks to the support of the Oak Foundation.
Zara Rahman leads research, documentation, and storytelling projects at The Engine Room. She has worked in over twenty countries in the field of information accessibility and data use among civil society. This year, she is a fellow at the Data & Society Research Institute in New York City.