Is our humanity greater than our technology?
Those who build technology, use it, regulate it, or invest in it are connected in their humanity, are they not? We lose ourselves when we lose sight of this truth.
Yesterday I spoke about open and community-centered design at the 2024 State of the Net conference in Washington, D.C. I also touched on inclusive and responsible product development and design. It’s common to say only that efforts to build ethically are hard, and they are, but the sense of purpose I gain is worth the sweat. As a designer, I enjoy the process of creation more than the end product, and if you’re wired this way, you will enjoy responsible product design.
This work that I do is a lifelong commitment to transforming technology as we know it. It is not a pursuit of personal accolades or achievement. For me, it’s a spiritual journey. When I die I hope to see a different landscape. Not only in the diversity of people who conceive, design, and build technology, but in how it’s done and what social problems are prioritized. I want to see decolonial technology, feminist technology, and anti-surveillance technology. I want to see technology that brings us closer to ourselves, to one another, and to the earth. These are the paradigm shifts I hope to live to see, at scale.
The purpose of the State of the Net conference was to explore “important, emerging trends” which, according to the event, are:
AI governance
How do we (i.e. the government) detect risks in AI systems?
How do we measure them?
How do we create safety benchmarks that all LLMs must meet?
How do we build capacity for government-sanctioned AI auditors? (e.g., what the IRS is for auditing businesses)
Internet connectivity in America
Yes, there are parts of rural America and parts of cities without connectivity in 2024.
What to do with TikTok?
Should we regulate it? Ban it entirely?
Child safety online
How do we ensure that platforms are safe for the mental and physical health of minors in America?
In discussing “trending” topics at the conference, I felt that something was missing. Emotion; emotion was missing. For me, I need stories that tell me how a trend impacts a person, a family perhaps, or a community of folks. Often a well-stated problem contains the solution. Through the stories we tell about real people and how technological trends are changing their lives, we will uncover the solutions.
The conference happened just one week after the Senate hearings on child safety, where Senator John Kennedy (R-LA) asked, “Is our technology greater than our humanity? Or will our humanity be greater than our technology?” Our generation should hold on tightly to the lessons of Kennedy’s generation: those who grew up free of modern technology.
The “data” might not yet show causation between social media use and teen mental health, and the well-being of children might not be a business priority, but are we not all of us, ultimately, human? Those who build technology, use it, regulate it, or invest in it are connected in their humanity, are they not? We lose ourselves when we lose sight of this truth.
What is success?
I asked a policy friend at the conference, who works at the Stanford Internet Observatory, what success looks like in his line of work. My understanding, based on his response and what I’ve heard from others as well, is that in the policy world, conversation is the process. Through increasingly nuanced conversations, lawmakers write increasingly sophisticated policy. These types of conferences give policymakers a sense of the landscape and what gaps might exist. Just like Product, policy has short-, medium-, and long-term goals, and with each iteration the policy, ideally, matures and keeps up with the times.
What role does Design play in Trust and Safety?
In the child safety hearing, the senators make clear that improving the discoverability of child safety features is a design problem. Senator Butler explicitly asks about user research and whether trust and safety is considered at the very beginning of product design. Senator Klobuchar passionately explains that parental controls need to be simpler and easier; otherwise the parent is left “trying to stop a flood with a mop.”
Waiting on a framework
In the child safety hearing, the senators ask Mark Zuckerberg about problematic use cases of filters on Instagram. For example, a face filter that lightens dark skin or shows what your face would look like post-plastic surgery. While I do not think Zuckerberg personally approves of racist or sexist filters, he is ambivalent towards regulating them because he, like the majority of Silicon Valley, does not know where to draw the line and on what basis. He is asking for a framework. What do we allow and not allow? Do we have to evaluate each submission of a filter, case by case? He might also ask, won’t that be gatekeeping? Won’t that limit creativity? Who are we to impose our values on the public?
That may all be true. But it’s also true that this effort will be expensive and will require hiring Trust and Safety professionals.
“The government will regulate social media companies out of business if they don’t get trust and safety issues right.” That line from the hearing is the golden incentive these social media companies need. My question is: how will Congress work with Trust and Safety teams at powerful companies? And, equally important, how will Trust and Safety teams work with Product and Engineering teams at these companies?
A hopeful future
The State of the Net conference made me feel optimistic about the future of tech regulation in the United States. As many speakers said, the government won’t make the same mistakes it made with social media. Now, to my fellow technologists, I want to ask: what will we prioritize? In the age of misinformation and disinformation, deepfakes, and declining trust in media, what role will we play in rebuilding the social fabric of American society? Not through our words, but through our actions? Will we allow our humanity to be greater than our technology, or will our hubris yet again be our demise?