Israel-Palestine, Instagram, and the ethics of social media
A discussion of the pressing need to address upstream design at social media companies.
I am interested in the design of the spaces in which we are having some of our most important conversations as a society.
Shadow banning on social media has been raised as a socio-technical issue in the context of the Israel-Hamas war, but I believe we need to address upstream issues, such as how the platform is designed.
Social media has been heralded as a tool for social movements (e.g. Arab Spring, BLM, #MeToo), but there are aspects of social media’s design that hinder these movements as well. I argue that the addictive design and echo chambers of social media are counterproductive to the progress of a social movement.
The prompts
In the “design world” we like to use “how might we” statements to generate ideas. We frame a desired goal — e.g. “prevent media fatigue” — as a “how might we” statement.
How might we prevent burnout and emotional exhaustion in people who are politically engaged online? How might we allow them to stay engaged in an online conversation, in a sustainable way? What might a mainstream online space that fosters community organizing look like?
About me
In college I was active in student organizing for Black Lives Matter. As a result, many of my friends are activists or highly politically engaged. Naturally, many of them are people of color, queer, and/or women. This social circle has carried over to my online social life, where social justice content circulates regularly.
Some solutions
I currently feel helpless as a user. Sometimes I need a break from the intense content I am receiving, but I cannot control my experience without unfollowing friends. I can only imagine how much more helpless younger users may feel, given they are at an even more impressionable phase of life.
I believe that if users aren’t bombarded by traumatic (albeit important) content, they are more likely to engage with it. Here are some ways Meta can give users more control over their Instagram experience (a rough code sketch of what these controls might look like follows the list):
Content preference settings
Introduce a detailed content preference setting where users can specify their interests, prioritize certain topics, and deprioritize others.
Trending content filters
Allow users to apply temporary filters during trending events or social justice issues. This could include the ability to limit the amount of content seen on a particular topic.
Diversify your feed options
Implement a feature that allows users to diversify their feed by introducing content from a broader range of sources, perspectives, or genres.
Time-limited topic muting
Allow users to mute specific topics or hashtags for a defined period, providing a temporary respite from overwhelming or emotionally charged content.
Customizable algorithm controls
Provide users with more granular control over the content algorithm, allowing them to adjust the intensity of content recommendations based on topics or themes.
Diverse content suggestions
Introduce a feature that suggests a diverse range of content creators, ensuring users are exposed to different perspectives within their areas of interest.
Contextual education
Provide users with contextual information about the content they are seeing, helping them understand the broader context and nuances surrounding a topic.
Algorithm transparency
Increase transparency about how Instagram's algorithms work and the factors influencing content recommendations, allowing users to make more informed decisions.
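To make these proposals concrete, here is a minimal sketch of what such user-facing controls could look like as a settings object. Everything in it — the ContentPreferences class, mute_topic, recommendation_intensity — is hypothetical and purely illustrative; none of it is an existing Instagram or Meta API.

```python
# Hypothetical sketch only: these names are illustrative, not a real Instagram/Meta API.
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class TopicMute:
    topic: str
    expires_at: datetime  # time-limited: the mute lifts automatically


@dataclass
class ContentPreferences:
    prioritized_topics: list[str] = field(default_factory=list)
    deprioritized_topics: list[str] = field(default_factory=list)
    temporary_mutes: list[TopicMute] = field(default_factory=list)
    # 0.0 = only accounts I follow, 1.0 = fully algorithmic recommendations
    recommendation_intensity: float = 0.5
    diversify_sources: bool = False  # pull in a broader range of sources and perspectives

    def mute_topic(self, topic: str, days: int) -> None:
        """Mute a topic or hashtag for a defined period (a temporary respite)."""
        self.temporary_mutes.append(
            TopicMute(topic=topic, expires_at=datetime.now() + timedelta(days=days))
        )

    def active_mutes(self) -> list[str]:
        """Topics that should currently be filtered out of the feed."""
        now = datetime.now()
        return [m.topic for m in self.temporary_mutes if m.expires_at > now]


# Example: a two-week break from war content without unfollowing any friends.
prefs = ContentPreferences(prioritized_topics=["community organizing"], diversify_sources=True)
prefs.mute_topic("#gaza", days=14)
print(prefs.active_mutes())  # ["#gaza"]
```

The point of the sketch is that these are not exotic features: they are a handful of fields and a filter applied before ranking, well within the reach of a company of Meta’s size.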
Addictive design
A key part of a social movement is transforming the hearts and minds of people. The Israel-Hamas war has taken center stage in my social media ecosystem, and it has compelled me to attend protests in support of the Palestinian people and to educate myself on the modern history of the conflict.
But feelings are complex, and my newsletter is dedicated to making meaning of them. I would be lying if I said that I had no questions, doubts, or concerns about some of the content I am seeing on Instagram. But I am unsure how to express uncertainty online. I want to engage in meaningful, good-faith online discussions, but I don’t know how.
On October 15, 2023, some friends were liking posts that congratulated Hamas. I did not understand what “liking” the post meant. Did they agree with the sentiment behind the attack or with the violence itself? Others were reposting a popular, irreverent tweet about decolonization and violence. I am a decolonial scholar with roots in nonviolence, or ‘ahimsa’; was this now to be my opinion too? More recently, a friend shared the video of Aaron Bushnell’s act of self-immolation (this is not a link to the video). I watched the first 30 seconds and stopped.
Bushnell’s story is dark. I worry that the content circulating will encourage more people to harm themselves, and that self-harm (and yes, I have heard the arguments that self-immolation is not self-harm) is being glorified to a younger, more impressionable, and possibly mentally distressed audience.
Most Silicon Valley designers have read Hooked by Nir Eyal, which promises designers the secrets to designing products that people can't put down. The book describes the four components of the “Hook Model” (a toy sketch of how they chain together follows the list):
Trigger: Cues that prompt users to take action. They can be external (e.g. push notifications) or internal (emotions, thoughts).
Action: The behaviors or tasks users take in response to the trigger. This could be scrolling through a social media feed or any other user interaction.
Variable Reward: The unpredictable and varied positive reinforcements users receive after taking action. This has also been called the “dopamine-driven feedback loop” by ex-Facebook technologists.
Investment: Involves the user putting effort, time, or resources into the product or service. This investment increases the likelihood of the user returning and engaging with the product again.
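These four steps can read abstractly, so here is a toy loop showing how they chain together. The functions and the user dictionary are my own illustration under stated assumptions, not Eyal’s code and not how any real platform is implemented.

```python
# Toy illustration of the Hook Model loop; not real platform code.
import random


def trigger(user) -> bool:
    """External cue (a push notification) or internal cue (boredom, anxiety)."""
    return user["notifications"] > 0 or user["anxiety"] > 0.5


def action(user) -> None:
    """The low-effort behavior the trigger prompts: open the app and scroll."""
    user["notifications"] = 0
    user["sessions"] += 1


def variable_reward(user) -> float:
    """Unpredictable reinforcement: sometimes a like, sometimes outrage, sometimes nothing."""
    reward = random.choice([0.0, 0.2, 1.0])  # the unpredictability is the point
    # No reward leaves the user a little more anxious; a reward relieves it slightly.
    user["anxiety"] = min(1.0, user["anxiety"] + (0.1 if reward == 0.0 else -0.05))
    return reward


def investment(user) -> None:
    """The user posts, follows, or reshares: effort that seeds the next trigger."""
    user["notifications"] += random.randint(0, 3)


# Each pass through the loop makes the next pass more likely.
user = {"notifications": 1, "anxiety": 0.6, "sessions": 0}
for _ in range(10):
    if trigger(user):
        action(user)
        variable_reward(user)
        investment(user)
print(user["sessions"])
```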
In the context of a social media ecosystem flooded with charged language and violent imagery, the “Hook Model” proves absolutely toxic.
Echoes
The way echo chambers can lead to radicalization is a known trust and safety problem. We also know there is some correlation between extremism and mental distress.
Self-immolation is an extreme form of protest. Streaming it to send a message about the genocide of Gazans has real tradeoffs that I think need to be articulated in one place. Self-immolation can lead to change, but it is also possible that, in the addictive and unhealthy context of social media, celebrating Bushnell’s sacrifice may encourage others to follow his example.
It is well documented that people who are hyper-online tend to have poorer mental health, and that their vulnerable state is exploited to increase engagement. Furthermore, those who live at the intersection of being hyper-online, mentally unwell, and politically engaged are even more susceptible to the addictive design of social media. My concern is that Instagram is, yet again, making a profit off of a vulnerable population.
As of Monday, February 26, 2024, most of the stories I see are reshares of posts about Bushnell. Occasionally people will layer their own thoughts on top of a reshare. Sometimes I click through to the original post and swipe through multiple slides. I condemn the onslaught against Gaza, believe the U.S. should immediately call for a ceasefire, and believe Hamas should release the hostages. Yet I have also removed Instagram from my phone multiple times, unfollowed accounts, and muted friends. While I am doing all this to protect my own well-being, I am told by some accounts that taking a break or logging off is an injustice too. How can I turn away when Palestinians cannot?
But the reality is, I am not helping them by spiraling downward into traumatic content. After a tipping point, I am not really learning more about the issue or being galvanized into action. I am left drained and, if anything, more likely to politically disconnect.
Meta has a responsibility and an awesome opportunity to effect social change. The U.S. government can provide the incentive it needs. So let’s make it happen. It’s about damn time.