With the fight for LGBTQ rights increasingly being waged online, Document speaks to four individuals on the frontlines.
The physical and digital worlds are increasingly intertwined when it comes to safety; prejudices that manifest online can lead to physical harm, and marginalized communities, particularly LGBT people, are increasingly vulnerable to security threats like surveillance and censorship. Dating apps like Grindr are being used to locate LGBT hotspots and target human rights activists who pose a threat to authoritarian regimes. It’s therefore vital that the LGBT community become aware of the intersection of human rights and technology. I spoke to four queer digital rights defenders helping to build a more equal and rights-respecting world. We discussed the importance of their work, the potential dangers for LGBT people in the digital environment, and what we can do about it now and in the future.
“Access to the internet could save the lives of LGBT people in some countries, as the internet allows us to connect to human rights organizations and get lifesaving information.”
Pablo Aguilera, 33, Mexico City
Board member at AllOut and member of r3d.
Bo Hanna—What does protecting LGBT rights in a digital context mean?
Pablo Aguilera—I’m a board member for AllOut, a global movement for equality and love, and I work for a Mexican non-profit called r3d, which defends human rights in the digital environment. As a digital defender, I work to protect human rights in the digital environment. In my case, it means that I connect LGBT organizations to stakeholders that work on digital rights. I [also strive to] influence policies and practices that impact the lives of marginalized communities. My work mostly focuses on pillars like freedom of expression, privacy, surveillance, and access to the internet. I think LGBT people all over the world should have access to information and tools that can protect them from potential threats online.
Bo—What are the potential dangers for the LGBT community online?
Pablo—The LGBT community is affected by censorship, and our data is being used for state-sponsored surveillance, or for big tech companies to profit from our personal information. There’s evidence that the Egyptian police used Grindr to arrest LGBT people during the massive crackdown in 2017… A lot of dating apps are not encrypted, and hackers and trolls can easily access apps and target [the LGBT community]. We know that the Indonesian government uses spyware to track and profile LGBT activists, and it’s very likely that the same technology has been used in Chechnya. The use of data for marketing purposes is also problematic; we know that, for example, Grindr shares its data with companies for marketing profiling, which can also be considered a form of surveillance. The unethical use of artificial intelligence is a big threat to marginalized communities around the world. We have seen attempts to detect gay men through facial recognition software in the past.
Bo—What can we do about these potential dangers now and in the future?
Pablo—As technology keeps developing, new threats emerge, so it’s hard to keep up with prevention strategies. The first step is to educate LGBT people, and other vulnerable communities, about how technology could possibly harm them, and take collective action… Access to the internet could save the lives of LGBT people in some countries, as the internet allows us to connect to human rights organizations and get lifesaving information. In some countries, LGBT people unfortunately don’t have access to helpful sources online. The internet can help us fight government disinformation and propaganda.
We also need more accountability for people who develop technology that harms LGBT people. At the moment, there isn’t an international framework, nor a global policy, protecting LGBT people from dangerous technology. There are only a few declarations, and most of them don’t mention the LGBT community specifically because some countries wouldn’t ratify them. Privacy regulations should restrict how companies use our data. People tend to think data issues don’t impact our daily lives, and say things like, ‘I’m not that important, so I don’t care if Facebook uses my data.’ We need to be careful with the information we share online, because it can be easily used against us.
“We really encourage anonymity, fake names, and avatars. It’s sad we have to do this but it’s necessary.”
Esra’a Al Shafei, 32, Manama
Founder of Mideast Tunes, Migrant Rights, CrowdVoice, and Majal; board member at the Wikimedia Foundation.
Bo—How do you fight for LGBT rights in the digital environment?
Esra’a Al Shafei—I run Majal, a digital platform that amplifies marginalized voices—ethnic minorities, migrant workers, independent musicians, and members of the LGBT community—in the Middle East and North Africa (MENA). LGBT individuals are severely underrepresented as a community in this region and face systematic oppression, discrimination, marginalization, and persecution. For obvious reasons, LGBT youth in the MENA region feel increasingly isolated, distant, and depressed. To reduce this isolation, we put ourselves at risk by resorting to insecure communication tools.
My colleagues and I tried to bring members of the Arab LGBT community together in a safe way via our discussion platform, Ahwaa, which we launched in 2011. We used game mechanics to protect our users from trolls and people with malicious intent. Each user is associated with a number of points that indicate their level of participation and supportiveness. By earning more points, they get more perks on the platform, including being able to create their own chat rooms… People can talk in the chat rooms about their personal concerns—anything from reconciling their faith and identity, to questions and advice about safe sex and health.
Bo—What are the potential dangers for the LGBT community online?
Esra’a—Being outed as gay or trans can lead to consequences like being abandoned by family, losing your job, or being deported if you’re a migrant in this region. Dating apps like Grindr never truly considered what actors outside the Western world, like militant groups or state police, could do with [the technology]. Even our platform could be used against us by catfishers—nothing is bulletproof. That’s why people need to be careful with the personal information they share online, use VPNs and anonymizing techniques on dating apps, and use encrypted messaging. We really encourage anonymity, fake names, and avatars. It’s sad we have to do this but it’s necessary, and safer than spaces like Facebook, where fake names and anonymity are strongly discouraged and get people suspended.
Bo—How can we improve online safety for LGBT people now and in the future?
Esra’a—We need to break the silence. In Lebanon, for example, there are organizations that fight for LGBT rights, but these are incredibly rare in the Gulf region. Today, with all the platforms we have and the ones we are actively building, queer people can use the internet and organize internally to remove the isolation that we face. Working underground helps; it’s where people feel comfortable sharing and seeking support, from LGBTQ+ friendly lawyers to counselors and doctors. Many of our members are more concerned about the social stigma associated with being queer than about imprisonment. That also takes a huge emotional and even physical toll on a person.
We don’t have a lot of funding, and our platform is mostly volunteer-run, so we really appreciate it when people make a donation.
“We have to see how we can make these platforms accountable; they need to block, minimize, and shut down actively hateful disinformation about the LGBTQ community.”
Sasha Costanza-Chock, 43, Boston
Associate Professor of Civic Media at the Massachusetts Institute of Technology, Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, and creator of the MIT Codesign Studio.
Bo—How do you fight for LGBT rights in the digital environment?
Sasha Costanza-Chock—I’m fighting for queer liberation through any media necessary. In the Transformative Media Organizing Project, we tried to create a space where LGBTQ+ activists and organizers could share strategy about how to use the media for queer liberation. We interviewed and surveyed organizations about what’s working, and we ran a series of skill-shares that took place both virtually and in real life. For example, we worked with organizations to set up workshops about digital security and social media strategy. Those are all documented and freely available online.
Bo—What are the potential dangers for the LGBT community online?
Sasha—One area of struggle is hetero- and cis-normative technology. For example, some companies and governments are trying to develop facial analysis software to ‘detect’ who is gay or trans. The software doesn’t really analyze ‘gay faces’; the algorithm actually classifies people based on things like the angle of selfies and cues in self-representation online, and assumes, for example, that ‘male’ faces with make-up must be gay, or that ‘female’ faces without makeup are more likely to be lesbian. Another company is building software that is supposed to control access to a bathroom by gender; in this case, many trans women, gender nonconforming women, non-binary people, and in fact anyone whose face is misgendered by the algorithm, such as many Black women, will be denied access to the women’s bathroom.
Bo—What can we do about these potential dangers now and in the future?
Sasha—We need to have more discussions about data ownership, privacy, and consent. We should [regulate] facial analysis software, and law enforcement should not be allowed to use facial analysis systems to discriminate against LGBT people and people of color. There is also the problem of far-right propaganda systems on social media being used to undermine human rights. We have to see how we can make these platforms accountable; they need to block, minimize, and shut down actively hateful disinformation about the LGBTQ community, and deal with the rampant harassment, hate speech, trolling, and propaganda that infests the networked public sphere.
“There exists a paradox: technology is very necessary for LGBT people to connect in repressive environments, but doing so presents some very acute risks.”
Miles Kenyon, 33, Toronto
Communications specialist at the Citizen Lab.
Bo—How do you fight for LGBT rights in the digital environment?
Miles Kenyon—Technology has presented the LGBT community with countless opportunities to connect, but it also presents complex threats. One fundamental issue is that we don’t see a lot of diversity when it comes to building technology, and this results in the diverse needs of users not being met. For example, some automatic soap dispensers only recognize white skin. In the LGBT context, Google even created an A.I. bot that thought gay people were inherently bad. This illustrates the point that automated algorithms don’t think anything ‘new’ but replicate datasets that already exist, which means that biases can be built in unconsciously and reproduced with discriminatory consequences. We need to think about how to implement artificial intelligence in ways that factor in and respect human rights.
Bo—What are the potential dangers for the LGBT community online?
Miles—There exists a paradox: technology is very necessary for LGBT people to connect in repressive environments, but doing so presents some very acute risks. This means it’s necessary to have conversations with each other about safety… Censorship is a real struggle for the community. For instance, a Canadian company, Netsweeper, sells software that allows clients to filter certain content like hate speech or porn. However, if you sell the same technology to an authoritarian regime that seeks to control information, it could be misused [in a way that violates] human rights. Until recently, Netsweeper easily facilitated this by creating a filter called ‘Alternative Lifestyles,’ which existed only to filter out LGBT content.
Bo—What can we do about these potential dangers now and in the future?
Miles—As we move forward with technology, we must bring digital components into our community efforts. We should also increase diversity in building technology, and build policies around technology that respect human rights. The United Nations has a set of guidelines outlining how businesses can sell technology without violating human rights, which all businesses should ensure they comply with. At an individual level, know how to protect yourself online, and use products like Security Planner to understand your personal risks.