
Algorithmic Emotions

I was born in a very different world. I know, everyone past a certain age will tell you they were born in a different world and, to a certain extent, that will always be true. Each generation has its own specifics. There are very clear patterns before and after each generational gap.

But when I say “a different world” I’m talking about something even more profound than a generational gap. Something so disruptive, so fundamental at a historic scale, that it may affect the entire evolution of the human species. I know, again, that this sounds very serious, and I’m ready to accept that I’m wrong. More than that, I hope I’m wrong. But I still think it’s worth writing about.

Algorithmic Fear

In the world I was born into, news was slow. Really slow. Information channels were mostly paper (newspapers and magazines), radio, and TV. The way we consumed news was more or less similar, in its patterns, to the way we consumed food. There was some news in the morning, a lunch break in which we either read the paper or had the radio on in the background, and then the evening TV news edition. And that was it. Those were the only contact surfaces between us and the news.

As the internet became prevalent, this slowly started to change. At the beginning of this century, news was already everywhere, all the time. In the early 2000s, you could open a browser, go to a website and, boom, the news was updated in real time. The contact surface was now continuous. The only barrier left was intent: news wouldn’t chase you; you still had to actively go and see what was happening.

The real shift happened when social media became the norm. I would say the “black swan” event that started this shift, the “moment zero” of this new age, was Covid-19. There were a few clues before that, like the Cambridge Analytica scandal, but history will probably remember only the coronavirus crisis, because the implications of this event were game-changing.

Before continuing, let’s make something clear: Covid-19 is a viral disease for which we do not yet have a cure or a vaccine, and, for certain risk groups, this disease is very, very dangerous. I’m not challenging that in any way. But on top of this disease, there was another contagion taking place: fear. More precisely: algorithmic fear.

Let me explain.

The fear induced by Covid-19 propagated overwhelmingly through social media. And social media works in a very interesting way: it tries to maximize the time spent in interaction. That’s how the network generates profit: by selling its users’ time and attention. More users, more profit. This entire process is scripted. There are algorithms optimizing it, based on users’ behavior. So, if a certain type of event generates more interaction, the algorithms will push it higher in the feed, giving it more visibility. It’s as if it feeds on its own popularity.
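The mechanism described above can be sketched in a few lines of code. This is a hypothetical illustration, not any real platform’s algorithm; the interaction weights and post names are invented purely to show how ranking by past engagement makes popular posts even more visible:

```python
# A toy sketch of engagement-driven ranking: each post is scored by the
# interaction it has already generated, so high-engagement posts climb
# higher in the feed -- and gain even more visibility as a result.

def rank_feed(posts):
    """Order posts by a simple engagement score (illustrative only)."""
    def engagement(post):
        # Interactions weighted by how much time they tend to keep a user
        # on the platform. The weights are made up for illustration.
        return (post["comments"] * 3.0
                + post["shares"] * 2.0
                + post["likes"] * 1.0)
    return sorted(posts, key=engagement, reverse=True)

feed = rank_feed([
    {"id": "calm-spring-photo", "likes": 120, "shares": 2, "comments": 5},
    {"id": "alarming-headline", "likes": 40, "shares": 90, "comments": 60},
])
# The heavily shared and commented post outranks the merely liked one.
```

Note that nothing in the score measures accuracy or balance, only interaction, which is exactly the dissociation from reality discussed below.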

When Covid-19 erupted, the first reaction of ordinary people was fear. It’s a healthy reaction. We humans survived because we were afraid of other animals and eventually devised ways to overcome them, not because we were fearless and faced them head-on. We wouldn’t have survived if the courage to face a lion barehanded were our main trait as a species. So fear, as the primary reaction, is understandable.

But here’s an interesting inflection point: after the initial wave of Covid-19 fear, the algorithms continued to push fear-related news higher in the feeds. The initial threat was mitigated somewhat (lockdowns, testing, expanded medical capacity, etc.), yet the algorithms were still very fond of fear, because, you guessed it, that type of news generated the most time spent on the network. At the moment of writing, algorithmic fear is still prevalent in social media, although its fuel is no longer predominantly Covid-19 (the riots in the US, for instance, were another algorithmic fear event).

The problem is that the algorithms are dissociated from reality. They aren’t built to mirror our world in a balanced way, but to increase the time spent on the network. Because of this, algorithms can create a different “reality” in social media, one in which some traits are more visible than others. And they are actually doing this as we speak.

The more engagement this “scripted” reality generates, the more people stick with it. The algorithms get validated and become even more prevalent. It’s a downward spiral, a vicious, never-ending circle.
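The spiral above is a compounding feedback loop: visibility drives engagement, and engagement drives visibility. A toy simulation makes the compounding visible; the starting visibility, the engagement rate, and the linear-growth model are all invented for illustration:

```python
# A toy model of the feedback loop: each round, the engagement a topic
# receives is fed back into its visibility. All numbers are invented
# purely to illustrate the compounding effect.

def simulate_feedback(visibility, engagement_rate, rounds):
    """Return the visibility of a topic after each feedback round."""
    history = [visibility]
    for _ in range(rounds):
        # Engagement proportional to visibility gets "validated"
        # by the algorithm and added back as extra visibility.
        visibility += visibility * engagement_rate
        history.append(visibility)
    return history

print(simulate_feedback(100.0, 0.5, 3))  # prints [100.0, 150.0, 225.0, 337.5]
```

Even a modest feedback rate more than triples the topic’s reach in three rounds, which is why a small initial bias toward fear can come to dominate a feed.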

Algorithmic Emotions

I’m not a conspiracy theorist, never was, and I hope I’ll never become one (if I do, please push me aside). I’m just observing some trends and trying to infer some logical conclusions. In this case, I think we’re witnessing the rise of algorithmic emotions in the human race.

Social media is not news. To consume news we still need intent; we still have to make an effort to connect with it. Social media, on the other hand, is something we actually need: it quenches our thirst for human interaction. Bonding is part of our nature, and even the Covid-19 lockdown showed us how difficult it is for us to stay in isolation. Our contact surface with social media is far bigger than our contact surface with news.

And yet, because of the way the algorithms work, the two are blending; their contact surfaces are mixed together, and the end result is that we tend to favor a certain emotion, triggered by certain news. Like fear of an epidemic, instead of joy at another beautiful spring.

The world is literally being changed by algorithms right now. We’re at the very beginning of this trend, and, to be honest, most of these algorithms are “good” (a dangerous word in this context, but I can’t find a better one). “Good” in the sense that they create a bit more health awareness (Covid-19), a bit more distancing from authoritarian tendencies (Trump) and a bit more care for the environment (climate change). All of these are fear-inducing algorithms, by the way, in the sense that the reality behind those events is less scary than the feeds make it look.

But, ultimately, there is no “good” or “bad” in an algorithm. There is only the outcome, always the same, always as scripted. As the world becomes more and more entrenched in these patterns, the speed and power with which an algorithm can swing us in any direction will keep growing. More than that, we may even be “forced” to believe that the presented reality is “right”, simply because any other option will barely make it into our focus, filtered out by the algorithms.

An optimistic person would respond by saying: “well, if social media is the medium, just give up social media entirely”. I think this would be a healthy choice: limit the time on social media, limit the exposure. In time, the power of these scripted algorithms would diminish.

I wish it were that easy.



