A Call for Milquetoastification

There's a sentence in System Error that contains so many disparate elements that I needed to unpack it:

We must confront the algorithmic sorting of users into filter bubbles that contribute to growing polarization, extremism, and decreasing social trust, all of which threaten the health of democracy.

This is stated baldly, and with no explanation, so I'd like to examine two questions: first, are these things related in a self-evident way? And second, what are the causal chains at play?

First, let's consider filter bubbles. Is this a concept that the general public understands well? Perhaps, but let me explain it just to be clear. In the early days of mass media, there was no way to individualize news and entertainment content to tailor it to the interests and worldview of each consumer. Outlets had their own ideological bent, but given the limitations of the platforms and market pressure to be as broadly appealing as possible, it was difficult to find a collection of information targeted to the foot fetishist paralyzed with fear about fluoride in the water. Can said fetishist seek this information out and compile it for themselves? Absolutely, but along the way they will be exposed to countless mainstream ideas and viewpoints that will influence them. Is there a term for the opposite of radicalization? Milquetoastification? Losing your edge sounds like such a bad thing.

Even in the early days of the internet (get off my lawn), the amount of content available was somewhat manageable, and you could reasonably absorb all of the viewpoints on a topic. You could subscribe to the RSS feed on someone's blog and read every post sequentially. This was satisfying; it was enough. As the tools of information creation were democratized (or co-opted for profit), the deluge of mostly useless and uninteresting posts could no longer be managed by sequential consumption. No, the user should be shown the most interesting and useful information, and just as yet-another-hierarchical-thingamabob begat Google, the humble RSS feed begat algorithm-mediated-and-filtered newsfeeds. What was deemed useful and interesting? Computers have no idea, so let's let the user decide.

PageRank, the algorithm that brought us Google, was based on a user interest signal, but to send that signal, someone had to go to the trouble of creating a website and hand-editing a hyperlink before publishing it. So you really had to think the content was dope. With social media, you can sit on a toilet, mindlessly scroll through an app, and tap the screen to indicate your approval. Or anger. Or the momentary freedom from the shackles of ennui.
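
For the curious, here's a minimal sketch of the PageRank idea, where each hand-placed hyperlink acts as a vote. The damping factor and update rule are the standard textbook formulation; the toy link graph is invented, and real search engines obviously layer far more signals on top.

```python
# A minimal, illustrative PageRank: every hyperlink is a costly,
# deliberate vote of interest from one page to another.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Invented three-page web: whoever collects the most inbound
# links (votes) earns the highest rank.
links = {
    "blog": ["news", "feet-and-fluoride"],
    "news": ["blog"],
    "feet-and-fluoride": ["news"],
}
print(pagerank(links))
```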

Content lives or dies in an attention-Darwinian battle, where the most engaging shit rises to the top of the toilet bowl. Which is great for profits and all, but we can do better. We can determine not only which content people will find most engaging in the aggregate, but also that user X is a foot fetishist enraged by the fluoride in his water, and serve that user otherwise marginal content that lights up the dopamine production in his brain. Our user now sees a social media flood of foot-and-fluoride content, and would perhaps be justified in imagining that the whole world has become filled with people just like him. And thus, he is now in a filter bubble.
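
To make that shift concrete, here's a toy sketch of the difference between "rank by aggregate engagement" and "rank by predicted engagement for this user." Every number, field name, and the multiply-by-affinity scoring rule is invented for illustration; real feed-ranking models are vastly more complicated.

```python
# Toy feed ranking: aggregate popularity vs. per-user personalization.
# All data and the scoring rule are invented for illustration.

posts = [
    {"id": 1, "topic": "news",     "likes": 900},
    {"id": 2, "topic": "feet",     "likes": 40},
    {"id": 3, "topic": "fluoride", "likes": 25},
]

# User X's inferred interests, e.g. from past taps on the toilet.
user_affinity = {"feet": 0.95, "fluoride": 0.9, "news": 0.02}

# Aggregate ranking: the broadly engaging stuff wins.
by_popularity = sorted(posts, key=lambda p: p["likes"], reverse=True)

# Personalized ranking: otherwise marginal content wins once
# weighted by this user's affinity for the topic.
by_affinity = sorted(
    posts,
    key=lambda p: p["likes"] * user_affinity.get(p["topic"], 0.1),
    reverse=True,
)

print([p["topic"] for p in by_popularity])  # ['news', 'feet', 'fluoride']
print([p["topic"] for p in by_affinity])    # ['feet', 'fluoride', 'news']
```

The second list is the filter bubble: the same inventory of content, re-sorted so that the fringe of one user's niche floats to the top of every session.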

How does this lead to polarization? The user is pushed toward increasingly marginal content in their niche. In an earlier post, I wrote about my YouTube radicalization journey with eyeshadow. This is cute and fun, and I get to follow my own little rainbow into a sparkly rabbit hole where the only harm done is to my credit card balance and bathroom counter space. (And...maybe exploited mica miners and increased carbon emissions and god damn why does everything have to be so morally fraught?) Waaaay less cute when the starting point is "I'm starting to see a lot of X people around town lately and I'm a bit uncomfortable with that," and the bottom of the rabbit hole is joining a white supremacist hate group.

Not cute at all.

So algorithms are pushing us into increasingly polarized camps with increasingly extreme views over time, but how does this erode democracy? I think the link is the third of the social ills listed in the original sentence: decreasing social trust, which erodes the tenets on which democracy is built.

What does a functioning democracy need? A belief that all people in a society deserve a voice and a say in how it should be run. A sense that there is something that unites all members, like foundational human rights, or that we should limit suffering. An acceptance that things won't always go your way, and a willingness to live with that. Trust that the institutions of government essentially work, in the aggregate at least.

Does algorithmic polarization/extremism erode these tenets? Yes, by limiting one's access to the potentially challenging viewpoints raised by trusted and respected members of the community. Marginal views become so normalized in an individual that holding conflicting views becomes unthinkable. So unthinkable that those who hold them must be unfit to have a say in the running of society, or must be irredeemably corrupt.
