If technology fails to design for the most vulnerable, it fails us all


What do Russian protesters have in common with Twitter users panicked by Elon Musk reading their DMs and people worried about the criminalization of abortion? They would all benefit from technology companies adopting a more robust set of design practices to protect them.

Let’s back up. Last month, Russian police coerced protesters into unlocking their phones to search for evidence of dissent, leading to arrests and fines. Worse still, Telegram, one of the main chat apps used in Russia, is vulnerable to these searches. Even just having the Telegram app on a personal device can imply that its owner does not support the Kremlin’s war. But the builders of Telegram did not design the app with personal safety in high-risk environments in mind, and not only in the Russian context. Telegram can thus be weaponized against its users.

Similarly, amid the back and forth over Elon Musk’s $44 billion plan to buy Twitter, many who use the platform have expressed concerns about his push to overhaul algorithmic content moderation and make other design changes on a whim. Taking recommendations from someone with no framework for the risks and harms facing highly marginalized people leads to proclamations like “authenticate all humans.” This appears to be a push to remove online anonymity, something I have written about very personally. It is poorly thought out, harmful to those most at risk, and not backed by any methodology or real evidence. Beyond his vague bursts of proposed change, Musk’s previous actions, combined with the existing harms built into Twitter’s current structures, have made it clear that we are headed for further impacts on marginalized groups, such as Black and POC Twitter users and trans people.

Meanwhile, the lack of security infrastructure has hit home in the United States since the leak of the Supreme Court’s draft opinion in Dobbs v. Jackson, which shows that the protections provided under Roe v. Wade are mortally threatened. With the planned criminalization of those who seek or provide abortion services, it has become increasingly clear that the tools and technologies most used to access vital healthcare data are insecure and dangerous.

The same steps could protect users in all of these contexts. If the designers of these tools had built their applications with a focus on safety in high-risk environments, for the people often considered the most “extreme” or “edge” cases and therefore ignored, the weaponization that users fear would not be possible, or at the very least users would have tools to manage their risk.

The reality is that making better, safer and less harmful technologies requires design grounded in the lived realities of those who are most marginalized. These “edge cases” are often overlooked because they fall outside the likely experiences of a typical user. Yet they are powerful indicators of the flaws in our technologies. That is why I call these cases, those of the most impacted and least supported people, groups and communities, “decentered.” The decentered are the most marginalized and often the most criminalized. By establishing who is most affected by distinct social, political and legal frameworks, we can understand who would be most likely to fall victim to the weaponization of certain technologies. And, as an added bonus, technology refocused on these extremes will still be generalizable to all users.

From 2016 to the beginning of this year, I led a research project at the human rights organization Article 19, in collaboration with local organizations in Iran, Lebanon and Egypt and with the support of international experts. We explored the lived experiences of queer people persecuted by the police because of specific personal technologies. Take the experience of a queer Syrian refugee in Lebanon who was stopped at a police or military checkpoint and asked for papers. Their phone was arbitrarily searched. The icon for a queer dating app, Grindr, was visible, and the officer concluded the person was queer. Other areas of the refugee’s phone were then searched, revealing what was considered “queer content.” The refugee was taken for further interrogation and subjected to verbal and physical abuse. They now face conviction under Article 534 of the Lebanese Penal Code, with the possibility of imprisonment, fines and/or the revocation of their immigration status in Lebanon. This is one case among many.

But what if that logo were hidden, so that an app indicating an individual’s sexuality was not easily visible, while still letting the individual keep the app and their connection with other queer people? Based on this research and collaboration with the Guardian Project, Grindr worked to implement a stealth mode in its product.

The company has also implemented our other recommendations with similar success. Changes such as the discreet app icon allow users to make the app appear as a common utility, such as a calendar or a calculator. Thus, during an initial police search, users can avoid being outed by the content or icons of the apps on their phone. Although this feature was created solely based on findings from extreme cases, such as that of the queer Syrian refugee, it has proven popular with users around the world. Indeed, it became so popular that it went from being available only in “high-risk” countries to being available internationally for free in 2020, along with the popular PIN feature that was also introduced as part of this project. It was the first time a dating app had taken such drastic security measures for its users; many of Grindr’s competitors have since followed suit.
