Building Trust in Tech: The Imperative of Responsible Technology
A designer's perspective on the 3rd Annual Responsible Tech Summit
With great power comes great responsibility
Diversity, equity, and inclusion (DEI) have been core values of mine since I joined the Movement for Black Lives in 2015. Over the years I have learned that, at its heart, DEI is about the redistribution of power in society. My interest in race, feminism, and decolonialism has inevitably led me to think a lot about power and how it moves.
Technology is incredibly powerful, which places an immense, but often shirked, moral responsibility on the Tech industry. It is this sense of moral duty that compelled me to join the Responsible Tech movement and attend the 3rd Annual Responsible Tech Summit in New York, hosted by All Tech is Human. The summit drew people from civil society, government, industry, and academia, and I wanted to hear from a broad swath of folks who, like me, want a more equitable, just, and peaceful world. Below I have synthesized the talks and pondered what’s next.
MIT social scientist Prof. Sherry Turkle and journalist Julie Scelfo on the youth mental health crisis fueled by Big Tech
The conversation between Turkle and Scelfo was one more people need to hear. It was two thoughtful, knowledgeable, and concerned women encouraging those of us in Tech to be much more discerning with what we build and mindful in our use and promotion.
Professor Turkle spoke from the research perspective of what we know about technology’s impact on youth cognitive and emotional development. Scelfo, a journalist and founder of Get Media Savvy, spoke as a concerned parent on the need for a massive cultural movement demanding better for our children.
Late-stage hyper capitalism, though never explicitly named by the two speakers, is the water in which we swim and the condition that gave rise to the problems we are trying to solve. The crisis in youth mental health is directly tied to corporations that benefit massively from increasing user screen time and harvesting user data. Furthermore, the average American household increasingly has two working parents, and American childcare costs are high. Giving children screens is a quick fix, but the two speakers dove into the details of how screens impact development (in a word, negatively). Children need face-to-face social interaction, eye contact, touch, physical play, and conversation.
A few of Prof. Turkle’s statements left me deep in thought. The first was that technology changes not just what we do, but who we are. This is not a new concept, and it is one I have heard many younger creatives of color express, such as journalist and storyteller J. Wortham. Technology changes us at the feeling level. “Technology is the architect of our intimacies,” quipped Turkle. It triggers emotions we would not otherwise have. “It is crucial that one is able to spend time in solitude, where one exists in the presence of and recognizes the self. Only then can we connect with the other,” Turkle told the audience.
If solitude is essential for empathy and technology is shrinking our tolerance for it, this so-called tool is undoubtedly the “architect of our intimacies”. Of course, empathy is only one aspect of how we connect. The other side of empathy is vulnerability. With social media we learn to curate ourselves and receive curations of others. But knowing how to be the unguarded, improvisational self is a much-needed quality for human connection. With my generation, for example, you see our preference for text messages over phone calls. With generations younger than millennials you may see an inability, or resistance, to interact in person versus online.
Scelfo noted that technological changes are ecological. They change the ecosystem. So let us cast aside the notion of “technology as a tool”. It is, instead, a new phenomenon introduced to our environment, and it will have both predictable and unpredictable ripple effects.
While it is lovely and important for consumers to be more mindful with their technology use, any “mindfulness tools” or practices are band-aid solutions to systemic problems. We still need to hold Big Tech responsible. Technology should not be designed to be addictive. Engagement metrics should not be the only measures of success for a company. Data should not be harvested for no stated reason, and definitely not collected or sold without user awareness or consent.
Shortly after these women spoke, former White House policy advisor Tim Wu reminded us that this year Congress did not successfully pass child protection bills such as the Kids Online Safety Act (KOSA) or COPPA 2.0. So it was interesting to see the other half of the conversation.
And this was the beauty and power of the Summit and more broadly the RT community. Each “fireside chat” was in conversation with the other chats. The speakers highlighted important themes, issues, and solutions, and they were all working in tandem, as a community should. My hope is for all of us to coordinate more and get more done, quickly.
The Summit helped me understand the different Responsible Tech levers we can pull for a better world. We need strong policy and government regulation, but to both craft and pass legislation we need public awareness, pressure, and organizing. We need consumers pressuring companies to be more ethical, we need alternatives to Big Tech, and we need people inside Big Tech with aligned values, skills, and dedication to being more responsible. We need it all.
The heroism of Frances Haugen, moderated by Nabiha Syed
I remember when Frances Haugen blew the whistle on Facebook. I followed the news story and watched her testify in Congress. When I met her Chief of Staff, Bryson, I told him that Frances felt like the modern American Dream, or rather she usurped Facebook as the American Dream. No longer the upstart or underdog, Facebook became “the man” and Frances stuck it to him. Very rock and roll of her, I must say.
Haugen kept it simple. “The fundamental problem of social media,” she said in a voice filled with conviction and warmth, “is the decision making process behind it didn’t have enough people seated at the table.” The room was rapt with attention. “So then, what does ideal transparency look like?” asked lawyer and CEO Nabiha Syed. Haugen’s answer: “The path to transparency is a journey, and each company’s or organization’s journey will look different.”
Haugen is an advocate for corporate accountability and transparency: transparency into a company’s machine learning models and algorithms, coupled with measures to protect user privacy. I was blown away by her ability to connect the dots. Haugen, a data scientist, explained how misinformation on Facebook is connected to engagement metrics, but it was a narrative I had not heard before. Power users are hyper engaged in the early hours of the morning, 2am–4am. In their sleep-deprived state they are more likely to doom scroll, reshare incendiary content, and leave toxic comments. These same users spread misinformation, often at a cost to their own mental health. Facebook has the power to cut a user off at a certain hour, which would give someone a couple of additional hours of sleep. “You want to see better mental health in the world? Let people sleep more.”
Haugen also mentioned coalition building within Responsible Tech, levers that companies can pull to understand what’s happening on their platforms better (she called them “minimal viable queries”), and asked the big question, “Are companies using the opportunities available to them?”
She reminded the audience that Trust and Safety is viewed as a cost function, not a revenue function. Of course, it can be a cost-saving function if regulation is at our doorstep, but otherwise T&S is perceived as a function that does not grow business revenue. This is an opportunity for us as Trust and Safety technologists to incorporate storytelling and persuasion into what we do. Let us more often make the business case for T&S, and raise awareness on the risks of not having a T&S function at a company.
According to Haugen, Facebook’s stock price has overwhelmingly gone down when the company has committed to safety-related work or the news cycle has implied it needed to spend more on safety. What if instead public sentiment regarding safety were positive? We, the RT community, can make the business case for how and why T&S ensures a stronger, healthier, and more sustainable business. Otherwise, we allow hate speech and other forms of divisive, negative content to be the bedrock of social media revenue. Making a business case for T&S is making a business case for ethical, inclusive, and equitable social media platforms and technology products.
Haugen is making a powerful case for us to shift paradigms in what incentivizes companies. As with the very first conversation at the summit, I am reminded of late-stage hyper capitalism, labor rights (e.g. the ongoing writer’s strike), and how all ethical and responsible tech conversations lead to people over profits.
In her talk, Haugen cited “The Power of the Powerless” by Czech political dissident Václav Havel. How appropriate, given that earlier in the day Tim Wu called for technology companies, from startups to Big Tech to VCs, to stop creating and enabling a spy network that collects and measures data constantly.
I can only imagine how Haugen felt going up against Facebook which, while not a totalitarian regime, was still a formidable institution to dissent against. As Havel argues in this essay, “a free society can only be achieved through a paradigm based on the individual… and a fundamental reconstitution of one’s respect for self, for others and for the universe… to refuse to allow the lie to oppress oneself, and to refuse to be part of the lie that oppresses others”; a lie which I believe pervades Silicon Valley through either the complacency, apathy, or willful ignorance of tech leaders and workers. To step out of rank-and-file and blow the whistle on an entire apparatus takes an immense amount of courage, but in doing so Haugen “revealed to others that they [too] have power.”
The fact that Haugen cited this essay only shows how aware and historically educated she is and was at the time of blowing the whistle. Intersectional thinkers like Haugen (who told the summit audience she minored in “cold war history”, but don’t quote me) are what tech companies need more of. Haugen concluded her talk reminding us that tech companies won’t make changes to safety without regulation or challenges to their authority. I now feel inspired to reconnect with my activist roots and think more on what organizing, agitation, and advocacy looks like for the Responsible Tech movement.
How will AI transform the 2024 elections?
This was a phenomenal panel on what we should anticipate and prepare for in the 2024 elections around the world. AI has already impacted elections, but with generative AI, bot networks, and deepfakes accessible to governments and politicians, the sophistication and scale of misinformation will only heighten. Targeted ads can be smarter, more believable, and personalized at scale. This reduced cost of producing high-quality propaganda will interfere in elections. Platform algorithms have pushed, and continue to push, incendiary content — this is what Frances Haugen warns us against — and tech platforms are even more susceptible than before. A layer above all this are the tools, data, and products cyber criminals use to execute these attacks, and the black market economy surrounding them.
According to the panelists, the biggest challenge right now is creating a standardized approach to regulating AI. How do we get everybody on some kind of standardized system? How does the system evolve over time? The EU is a global leader in tech regulation and is currently working on its AI Act. The White House did issue an AI Bill of Rights, and the Biden Administration is expected to release AI legislation, but strengthening democracy against AI-enabled degradation will require civic participation, such as education, local journalism, and media literacy programming.
The last bit that moved and most interested me was how misinformation campaigns target and harm minority communities. The most effective misinformation campaigns are ones that weaponize racism, sexism, homophobia, and religious differences, so these communities need to be part of the conversation on combating misinformation online. In my own family WhatsApp chats I see how quickly misinformation about vaccines and politics spreads.
How do we get the government involved? White House policy advisor Tim Wu and Justin Hendrix discuss how
This was an amazing conversation between Justin Hendrix, the founder of Tech Policy Press, and Tim Wu, the lawyer who coined “net neutrality” and former White House policy advisor on antitrust law. Wu showed us the other side of the conversation: how do we get stuff passed by Congress? How do we get government officials to care?
This is what I loved about this Summit. If taking an idea from concept to implementation is a relay, the summit had folks at every leg of the race speak on their experience. Wu opined more broadly on how scale concentrates wealth and creates monopolies, such as Google’s monopoly on search (which I’ve written about). He made an astute comment in saying that 21st century power is about the attention economy (see Ezra Klein on the attention industry and Cal Newport on digital minimalism), and Google is what modern economic power looks like. Wu is interested in understanding that power and its concentration (AI ethicist Timnit Gebru feels similarly on this podcast about ethical AI). He also touched on labor rights and AI, both from the perspective of automation and the rights of content moderators in the Global Majority. To paraphrase Wu, while it is hard to predict the future of work, it is quite possible that AI will create more middle management jobs overseeing AI decision-making. Which raises the question: what about the quality of the work that we do? How creative and fulfilling do we need work to be?
Ultimately this talk, and all of the others, beautifully looped back to the human condition. What is it to be satisfied and joyful? How can our world have more of this in the present and future? This is also where I want to concentrate my focus and energy, by embodying values of community, meaningful work, creativity, and spirituality. All of which irresponsible technology threatens.
Digital Spaces and Public Goods
For the last Summit panel I decided to mostly listen and not take notes, which I do regret because the panel was fantastic, albeit more varied than prior talks. I learned from Theodora Skeadas that Twitter had worked with advisory groups on product ideas like Twitter Spaces and Community Notes. The panelists encouraged companies to optimize for bridging people rather than divisiveness.
One of the panelists, Sabhanaz Radhi Diya, shifted the Summit’s center of gravity when she reminded the audience that both GDPR and DSA benefit only the EU, and they don’t shape the experience of the majority of global users. She also warned against attempts to plug-and-play regulation from one context into another. What works for the EU won’t necessarily work for Vietnam, Venezuela, Bangladesh, Nigeria, etc…
What did these panelists advise to the rest of us? Write a bill, ask more and demand more of companies and governments, challenge your own assumptions about technology, insert yourself in the conversations that matter to you, and do not settle for irresponsible companies or products.
We need each other
Many crises in modern American society can be directly traced to social media, whether it’s polarization, culture wars, the rise of fringe movements, mental health issues, isolation, election interference, or public mistrust in democratic institutions. Specifically, the disintegration of our social fabric can be traced to Facebook and Twitter. But it’s not all so grim. These platforms are still new, and they can be fixed if we adopt new processes and frameworks. For these changes to take place at scale we need a cross-sector, coordinated effort. It will require external pressure on companies (i.e. government regulation), internal pressure from employees, capacity building and investment in Trust and Safety, and demands from informed and concerned consumers. We need optimism, grit, and fortitude. There is a lot of cynical, fearful language around the harms of technology, especially the potential catastrophic effects of AGI. Whether it’s prevention, response, or mitigation, I believe at the heart of what we do is how we do it: together. Let us embrace one another with humility, courage, and, above all, hope.