A foray into Trust and Safety (come with me)
How will Design meet the sociotechnical challenges of our time? We will need designers with expertise in the Trust and Safety space. + a brief interview with Wikimedia's Global Head of T&S.
In August of 2022 I joined the Trust and Safety Tools team at the Wikimedia Foundation and picked up the work of Amin Al Hazwani, the designer before me. Our team was tasked with designing an Incident Reporting System for Wikipedia — a way for users to report incidents, such as harassment, illegal content, and illegal behavior, to volunteers and Wikimedia staff. Reporting systems are a fundamental component of safe platforms, but they are a relatively new feature. So, how should they be designed? What are best practices? Who are the users? What are their needs? What are the needs of the staff and volunteers monitoring the system? What about the organization responsible for the system?
Since joining the team I have conducted research on harassment on Wikipedia, reviewed and synthesized existing research, designed possible solutions, and collaborated with community members, my product manager, engineers, and Trust & Safety professionals to think through the complexity of designing a system that will work at scale. To beef up my knowledge of the T&S space, I created a “T&S Reading List for Product People” and helped start a reading club with other T&S product folks. My (our?) journey has just begun.
Why do I care about any of this?
While I am new to working in T&S, the thinking it requires is similar to the disciplines that raised me: feminist and gender studies, critical race theory, decolonial theory, and what we find at the intersection of technology and justice. I was part of a generation of American women introduced to feminism between 2012 and 2016, during social movements like Black Lives Matter and #MeToo. These movements created an intense cultural shift towards abolition and reconstruction, rather than reformation. I came out on the other side interested in both. I wanted to help build new institutions, fix the ones that are redeemable, and dismantle the ones that no longer serve us.
Technology as a sector, a concept, a cultural phenomenon, and a tool is too pervasive and powerful to be exclusively shaped by white, capitalist, and patriarchal imagination. We have seen this happen again and again. We saw it with the early web, personal computers, mobile technology, social media, and now AI. To quote AI researcher Timnit Gebru from her conversation with Sean Illing on The Gray Area, “What I really worry about [regarding the rapid progress of AI] is the centralization of power and the deepening of inequity and oppression…”
Over the last seven years I have tried to step into my power as a woman and decolonial thinker in technology. This has looked like building community with like-minded women and deepening my expertise in decoloniality, equity-based thinking, and systems thinking. This process has been undoubtedly difficult, with many a night spent questioning my own intelligence and judgment. The self-doubt, however, is a blessing and a curse. It has compelled me to continuously learn and sharpen my critiques. In times of doubt, my sustenance has been studying and discussing feminist theory, and specifically black feminist theory, with peers. I have cultivated a community of technofeminists who keep me honest and focused. A few of these powerful women and non-binary folks are Sierra Kaplan-Nelson, Divya Budihal, Simone Delaney, Julia Kieserman, Carolyn Li-Madeo, Jazmin Tanner, Moriel Schottlender, Selene Yang, Natalia Rodriguez, and Genoveva Galarza Heredro. We have reflected on our lived experiences, honed our politics, and found ways to bring our perspectives to decision-making in Product, Technology, Engineering, Architecture, Research, and, now, Trust and Safety.
When I joined the Trust and Safety Tools team at Wikimedia, Elon Musk had just eliminated the Trust and Safety team at Twitter, along with teams that are critical for T&S, such as tweet curation, ethical AI, social good, and accessibility. I felt it was incumbent on me to understand my own commitment to T&S as a political decision, while also arguing for its indispensability in apolitical terms.
For example, I wanted to understand harassment on Wikipedia through an equity lens. How does harassment affect female, queer, nonwhite, and non-Western Wikipedians? We already know how sexist, homophobic, and racist the internet can be; surely an Incident Reporting System will be a critical part of protecting marginalized communities on the platform.
To document this journey, I wrote an initial blog post on using a feminist framework to build the Incident Reporting System. After proposing a possible way forward and thinking of what to do next, I came across the work of researcher and activist Caroline Sinders. I am not an expert in gender-based online harassment, but Caroline is, and lucky for me she was teaching a “10-week responsible technology design intensive” at the Gray Area in San Francisco, a hub for social transformation! I took this course and grew my toolkit (more on this later). I created this living list of essays, articles, and reports on gender-based online harassment, and have continued to read, write, and learn about the subject.
T&S Product will be shaped by the people who contend with some of the biggest, most alarming T&S questions of our time. What role will software play in our mission to detect, dismantle, and prevent CSAM and misinformation campaigns on the internet? How will the tools we build prevent racial supremacist organizations, terrorist networks, and human exploitation operations from scaling their efforts using technology and digital networks? And how might we build inclusive, accessible, and equitable products that examine, address, and mitigate these problems?
Coming up next
I am in a working group at Wikimedia developing a DEI-focused software development framework called the “inclusive product development playbook”. In my next post I’ll step through the following six prompts and examine how they can shape the Incident Reporting System to further gender equity.
Establish a baseline of current users, use cases, and opportunities for equity, based on your context. Include existing research (internal and external) on the problem space you want to address.
Be intentional and clear about who you will empower and engage with (have a clear why), and what that engagement will look like (target audience). If your answer is “everyone,” consider who you will engage first for learning, and then scale.
Ask “Who are we leaving out?” and be clear whether you will include those being left out in this iteration or in a future one, based on your baseline understanding of “established” audiences and “growth” audiences.
When establishing objectives and key results, be specific about who you are serving, what assumptions you are making, and how you will validate or challenge those assumptions.
Clearly define what barriers and/or knowledge gaps you are aiming to address, as well as what new opportunities you plan to create for your target audience and why your team is uniquely positioned to do so.
Establish what partners (internal and external) need to be brought into each phase and give them advance notice of the goal you intend to accomplish. Advance notice should occur as soon as you are aware you will need the team’s support, ideally at least a quarter in advance.
A conversation with Wikimedia’s Head of Trust and Safety
I briefly interviewed our Head of T&S, Jan Eissfeldt. I admire how Jan thinks and speaks of Trust and Safety. When I first met Jan he told me to think of T&S as a frame. So long as people’s behavior stays within the frame we have no problems. The platform is okay. It’s when people start to push on the boundaries of the frame that we start to require meaningful T&S policy and enforcement. According to Jan, on Wikipedia “the communities govern themselves reasonably effectively as long as the problems are within the framework.” Our new Universal Code of Conduct, a community-led initiative, is guidance for when people break the frame.
(Please note: I did not record my conversation with Jan, so I’ve done my best to summarize his responses.)
Q: “Cornell professor Tarleton Gillespie calls teams like ours the ‘custodians of the internet.’ Does the internet really need custodians? Why can’t it be self-cleaning?”
Jan: “The internet has always had custodians. There has never been a time where the internet did not have custodians….The huge amount of increasing human activity on the internet has always required very meaningful regulation… What we need to ask is, what is the purpose of the service and how does that manifest in the technical infrastructure and architecture? And how do these decisions impact the user?”
Q: “In our Trust and Safety strategy doc we say, ‘The Foundation needs to be able to comprehensively demonstrate to public policy makers and the public that our community-leveraged business model works at least as good as industrial T&S practices by for-profit platforms.’ I love this. What are the unique challenges of a community-leveraged business model vs. for-profit platforms?”
Jan: “Our solution to being a ‘custodian’ is hybrid. It is not a binary. There are advantages and disadvantages to a community-leveraged versus centralized model. The advantages are that we, the Movement, can be more context sensitive, more language sensitive, our officials are elected, it costs less to run, and there is a co-dependent relationship between us and the Community. The disadvantages are that we, T&S, don’t control what happens on Wikipedia, we’re not as accountable, and progress is slower and becomes a multi-year journey. Wikipedia is, by design, created through friction. Knowledge is produced through friction, so we do have a very effective self-governing system in that way and people do become more reasonable, with time, on Wikipedia. So again, the question must be asked, what are your platform architecture choices? What are your user experience choices?”
Q: “What tips do you have for people like me who are new to Trust and Safety? How do I get a grip on the field?”
Jan: “The Trust and Safety Professional Association has a curriculum that is written for people who are new to the field. Otherwise you can dive into a wide diversity of scholars and players emerging from within the tech industry.”
Closing remarks
Trust and Safety and product design are star-crossed lovers. Any designers interested in digital rights as human rights should consider Trust and Safety. Come! Let us learn and solve, together.