The tech industry’s privileging of ‘safety over privacy’ could get the most vulnerable killed
High above the homeless camps of Seattle, in September 2022, Amazon hosted the first Tech Against Trafficking Summit. It was an elite affair. Project managers and executives from Amazon, Google, Facebook (Meta), Instagram, and Microsoft were present, as were ministers of labour from around the globe. Panellists included government leaders, law enforcement, tech executives, and NGO directors. Only two trafficking survivors made the speakers' list.
The summit was above all a show of force. Most of the tools presented were built for law enforcement, and safety over privacy appeared to be the mantra. Only the two survivors highlighted the dangers of haphazardly collecting any and all data, a view that was generally scoffed at. Stopping traffickers by any means necessary, the non-survivors said, was more important.
Tech: a tool or a timebomb?
As any survivor of human trafficking will tell you, privacy is safety. They cannot be separated when you are being hunted by the predator you've escaped. At odds with this is the fundamental ethos of the tech giants: collect as much data as possible, manipulate it, trade it, and find a way to profit from it.
Consider one of the tools presented: the Amazon Ring. Essentially camera-enabled doorbells, Rings allow occupants to see who's there before they answer and who came by in their absence. They seem innocuous enough. Yet, after finding their way onto doorframes across the US, they've also become "the largest civilian surveillance network the US has ever seen". As of May 2021, Amazon had brokered partnerships with "around one in 10 police departments across the US" and given them "the ability to access recorded content from millions of privately owned home security cameras".
Preventing trafficking is one of Amazon's justifications for doing this. The presenter praised Amazon's partnership with the National Center for Missing and Exploited Children, and boasted how easy it is for law enforcement to obtain video footage without a warrant from any neighbourhood where a kidnapping is suspected. Safety first, privacy second. There was no discussion of how easily this kind of warrantless access could be abused in the absence of proper oversight. Nor was the assumption underlying all these tools - that law enforcement is always part of the solution rather than the problem - ever questioned. This is despite data showing that law enforcement officers are 40% more likely to be domestic violence abusers than the general public, and that many officers have themselves been accused or convicted of human trafficking.
Artificial intelligence and machine learning were also celebrated as potential tools for catching suspected traffickers earlier. That these systems would eventually be able to do this was treated as a given: it was only a matter of time - and enough data. Yet human trafficking data is both limited and notoriously inaccurate. Bad data means bad learning, and although this went undiscussed at the summit, it leaves only two possibilities. Either you end up with a system that is unhelpful at best and dangerous at worst because it was trained on inaccurate data, or you use the problem of bad data as an excuse to collect still more of it. The pursuit of 'safety' trumps privacy once again.
The false messiah of Big Data
The tech industry is enamoured with large datasets. Preferring quantity of data over quality is in vogue, and the dangers of this trend are systematically downplayed by the world's largest companies. This is why Google forced out Timnit Gebru, a renowned AI expert, after she criticised large, inscrutable datasets as having - among other problems - an ingrained potential for racist and sexist bias.
Similarly, nobody in this space seems to want to acknowledge that training AI on human trafficking data risks collecting data on survivors that can later be breached, misinterpreted, or sold for a profit. The most visible example of something like this happening is LexisNexis's sale of immigrants' information to the US's Immigration and Customs Enforcement (ICE). At Amazon's summit, one of the survivor panellists pointed out how data given to law enforcement had made their own family members more vulnerable to traffickers. In response, a director from one of the largest anti-trafficking NGOs in the world glibly said, "shit happens".
While tech has the potential to greatly help trafficking survivors, it was obvious that few survivors were involved in developing the tools on display at Amazon's Tech Against Trafficking Summit. They were not what trafficking survivors need - they were what tech companies wanted to build. The tech industry is effectively co-opting human trafficking as a means to justify eroding privacy rights for everyone.
Wolf in sheep's clothing
It is counterintuitive to expect companies to ensure trafficking survivors' safety when their business models profit from the sale of users' data - especially when they're developing tools for law enforcement. There are also good reasons why most trafficking survivors do not trust the police to effectively protect them. Apart from concerns around immigration status, some survivors I know were trafficked by law enforcement officers. Others' traffickers were protected by police. From this vantage point, these tools are like giving a serpent a second set of fangs while telling you to trust the snake with your life.
At its core, human trafficking is the exploitation of the most vulnerable people. Creating tools that give police more power does not support trafficking survivors. It does not connect survivors with housing, healthcare, funding or access to education. In nearly every country, anti-trafficking legislation funnels more money and power to law enforcement than to any department or organisation providing for survivors' long-term needs.
Surviving the physical abuses of being trafficked is only the first step in not dying. Every trafficking survivor I know lives with chronic illness or disability. If the tech industry actually cared about trafficking survivors, it would build apps that make life easier for people living with disabilities. It would build apps that help survivors communicate more easily. It would build apps that protect me from my trafficker, and that make it easier for me to whistleblow when he enjoys impunity from law enforcement.
If we want tech companies to build in the public interest, especially in this area, we need to convince them to hire and listen to trafficking survivors. And not just one. Survivors are not a monolith - their experiences, perspectives and areas of expertise can differ greatly - and their views in all their variety must be heard. For if the Tech Against Trafficking Summit is any indication, tech leaders obviously do not understand survivors' needs or safety concerns.
Instead of building anti-trafficking tools for police, tech companies should hire trafficking survivors as project managers. They should teach us to code and to analyse data, so that we can directly participate in building more robust, accurate and secure datasets. They should empower us to veto dangerous ideas, and let us choose the direction in which their anti-trafficking work heads.
Because we don't need to be saved; we need to be listened to.
From openDemocracy