Fighting Surveillance Tech with Trademark Transparency
Recapping Decoding Stigma's first event with Hacking/Hustling
On Friday, November 20th, Decoding Stigma co-hosted its first event, Fighting Surveillance Tech with Trademark Transparency, alongside Hacking//Hustling. The workshop (full video at the end of this post) explored how activists and investigative journalists can use the federal trademark database to uncover surveillance technologies and other potentially invasive goods and services currently in development or already deployed in the marketplace.
(above: flyer image for Trademark Transparency workshop, linking to event page)
The panel featured Amanda Levendowski (Director, Intellectual Property and Information Policy (iPIP) Clinic, Georgetown Law) and Kendra Albert (Cyberlaw Clinic, Harvard Berkman Klein Center). Albert notes that “often the first and best tool to get information on surveillance is talking to the folks who are being surveilled about the things that they experienced… because that is often the first and best clue as to what’s happening more broadly behind the scenes.”
This speaks directly to the work that many organizations are doing, such as Hacking//Hustling’s community reports about account shutdowns and shadowbanning:
(above: a screenshot from Hacking//Hustling’s “Posting into the Void: Studying the Impact of Shadowbanning on Sex Workers and Activists.” Image links to the full report)
As well as the Algorithmic Justice League’s cross-industry mapping of racial bias in AI, investigative reporting on “walking while trans” laws, and research into government surveillance of Indigenous pipeline activists.
(above: screen capture of community-led analysis of drone flight paths, documenting drone surveillance of Indigenous pipeline protesters’ homes, linking to a report released by Gizmodo. Graphic by Jim Cooke)
(above: full video of November 20th’s trademark transparency workshop, full transcripts available upon request)
Sex work stigma has been used to justify research funding for—and deployment of—surveillance technology that ultimately profiles and further harms at-risk communities. Consider the nonconsensual scraping of images and data from online sex work platforms to train AI “victim-tracing” technology: Stanford, for instance, bragged about scraping more than 30 million sex work advertisements from online sources for its DARPA-funded DeepDive search engine. There are no accountability measures governing how these images are actually stored or used; given the quasi-legal status of sex workers and the lack of nuance in machine learning algorithms, these images likely now sit in a database tagged as evidence of “criminal” or “victim,” with no oversight of how they are classified or deployed.
Algorithmically informed account shutdowns strip laborers of their sources of income, yet we don’t know how algorithms codify sex workers (what, literally, does a “sex worker” look like to a machine?) or who has defined what counts as “high-risk” behavior. To those developing these technologies, the sex worker is at once indispensable and disposable.
By learning how to access the Trademark Electronic Search System, or TESS, sex workers and their allies can use trademark disclosures to take action against surveillance technology development that threatens the privacy, safety, and livelihoods of the sexually marginalized and other at-risk communities.
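As one illustration of what community-led research on top of trademark disclosures might look like, here is a minimal sketch that flags filings whose goods-and-services descriptions mention surveillance-related terms. The keyword list, the `flag_filings` helper, and the sample records are all invented for illustration; real input would come from exported search results or the USPTO’s bulk trademark data, not from this hard-coded list.

```python
# Sketch: triage exported trademark records for surveillance-related language.
# All names and sample data below are hypothetical, for illustration only.

SURVEILLANCE_KEYWORDS = [
    "facial recognition", "biometric", "license plate",
    "geolocation tracking", "predictive policing", "video analytics",
]

def flag_filings(records, keywords=SURVEILLANCE_KEYWORDS):
    """Return records whose goods/services description mentions any keyword."""
    flagged = []
    for record in records:
        description = record.get("goods_and_services", "").lower()
        hits = [kw for kw in keywords if kw in description]
        if hits:
            # Keep the original record plus which keywords matched.
            flagged.append({**record, "matched": hits})
    return flagged

# Invented example records standing in for a real export.
sample = [
    {"mark": "NEIGHBORWATCH", "goods_and_services":
        "Software for facial recognition and video analytics"},
    {"mark": "BAKERY BOT", "goods_and_services":
        "Automated dough-mixing machines"},
]

for record in flag_filings(sample):
    print(record["mark"], "->", ", ".join(record["matched"]))
    # prints: NEIGHBORWATCH -> facial recognition, video analytics
```

A real toolkit would pair a filter like this with human review, since keyword matching alone will both miss euphemistic descriptions and flag harmless ones.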
Questions from workshop attendees included:
How is data collected by tech platforms shared with law enforcement?
How are algorithms trained to detect nudity and how does this target sex workers?
What technology is being used to analyze external links that would fall under new community guideline bans on social platforms such as Instagram?
Who can be held accountable for the deplatforming of marginalized sex workers, especially in the time of COVID-19, when much of sex work (and all work) has been pushed online, where people can be even more pervasively surveilled?
Levendowski notes that surveillance transparency is particularly difficult for a number of reasons, including that law enforcement agencies generally aren’t developing their own surveillance technologies (a process that would be subject to investigative tools such as Freedom of Information requests) but are procuring them from largely invisible private vendors. Further, private technology companies shield themselves from public scrutiny via trade secrecy and other intellectual property laws.
But the trademark database is especially valuable because trademark filings must include specimens (proof or documentation) that the mark is being used, or is intended to be used, in commerce (unlike patents, which can exist indefinitely in the realm of the imaginary). Levendowski, for instance, discovered the trademark registration for Amazon’s Ring doorbell, now a nightmare “civilian surveillance dragnet” technology, a full year before Ring was revealed to be partnering with publicly funded law enforcement. “In other words, if we had been watching the trademark registrar more carefully, we could have found out about Amazon Ring a year before it was on the public consciousness radar and may have been able to mount more of a defense and resistance.”
Decoding Stigma will be following up this workshop with a shareable toolkit so you can easily learn how to do your own community-led investigative research into these surveillance technologies. Please stay tuned! And as always, if you want to be involved or help us in any way with similar actions or events, please write to us at email@example.com.