ANBOUND Geopolitical Review (AGR)
Thursday, September 05, 2024
How Recommendation Algorithms Fuel Collective Extremism
Kung Chan

Recommendation system algorithms are a technological method adopted by major online platforms. This means that virtually everything we see online today is selectively recommended to us by these platforms, yet many people remain unclear about the implications of this.

The advent of recommendation systems was originally meant to solve the problem of information overload. Today, the pace at which books, newspapers, and other information are published far exceeds what any individual can read, to say nothing of content on the internet. Even a moderately popular forum can be difficult to read in full each day, and larger forums require many moderators to manage.

In fact, humanity transitioned from information scarcity to information overload in just a few decades. Faced with such vast amounts of information, people need new technological solutions.

The first method introduced was categorization, an idea reflected in Yahoo's early web directory. Categorization itself is nothing new. Historically, the ancient Chinese sorted books into four major categories: Confucian classics, history, philosophical works, and literature. Similarly, biological organisms are classified into seven levels: kingdom, phylum, class, order, family, genus, and species. Website portals likewise present content under various categories: news, video, pictures, sports, entertainment, finance, technology, fashion, automobiles, education, horoscopes, games, and so on.

However, categorization alone fails to address the issue of information overload. As the volume of information expands, categorizing each item becomes increasingly burdensome, and constructing a usable index or directory presents significant challenges. This is where search engines become essential. Platforms such as Google provide a mechanism for locating information that one knows exists but cannot find. In this context, there are two categories of the unknown: known unknowns and unknown unknowns. Known unknowns refer to information that one recognizes as lacking, whereas unknown unknowns pertain to information one is not even aware of needing. Search engines handle the former well; the latter lies beyond their reach.

For this reason, recommendation systems were introduced. A fundamental principle of recommendation systems is that "like attracts like", both in terms of items and people, and this principle is embodied in algorithms. For example, if someone enjoys the hard sci-fi novel "The Three-Body Problem" but is unaware of "The Wandering Earth" and "Ball Lightning", the platform's algorithms will automatically recommend those books, and many people would welcome such recommendations. If the recommendation system identifies someone as a science fiction fan, it will suggest science fiction books, which is certainly more reliable than recommending something like "Chicken Soup for the Soul".
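To make the "like attracts like" idea concrete, here is a minimal sketch of item-based recommendation using cosine similarity over a toy ratings matrix. The titles, ratings, and the choice of similarity measure are illustrative assumptions, not the actual data or algorithm of any platform.

```python
# A minimal sketch of "like attracts like": books whose rating patterns
# resemble a book the user already likes get recommended first.
# The ratings matrix below is an invented toy example.
import numpy as np

books = ["The Three-Body Problem", "The Wandering Earth",
         "Ball Lightning", "Chicken Soup for the Soul"]

# Rows are users, columns are books; values are ratings (0 = not rated).
ratings = np.array([
    [5, 4, 5, 0],
    [4, 5, 4, 1],
    [5, 0, 4, 0],
    [0, 1, 0, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def recommend(liked_index, top_n=2):
    """Return the books most similar to the one the user already likes."""
    liked_col = ratings[:, liked_index]
    scores = [(cosine_sim(liked_col, ratings[:, j]), books[j])
              for j in range(len(books)) if j != liked_index]
    return [title for _, title in sorted(scores, reverse=True)[:top_n]]

# A fan of "The Three-Body Problem" gets other hard sci-fi,
# not "Chicken Soup for the Soul".
print(recommend(books.index("The Three-Body Problem")))
```

In this toy data, the sci-fi titles have similar rating patterns, so they rank highest; the point is simply that similarity of past behavior, not editorial judgment, drives the suggestion.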

It is important to note, however, that recommendation systems have a significant downside: they can exacerbate societal biases. Rather than mitigating bias, they inadvertently reinforce it.

In reality, unless individuals are the type to sample everything they come across, internet users tend to focus their attention on things they already like and support. This behavior draws distinct boundaries centered on a closed self. Recommendation algorithms reinforce these preferences, hardening those limiting and biased boundaries into something akin to a mental prison. Although such users inhabit only a small circle, many perceive it as the entire world.

Similarly, individuals who support viewpoint A will be pleased to see that 95% of the content in their news feed aligns with their views. Meanwhile, those who support viewpoint B will observe that 99.85% of the content in their feed supports viewpoint B, which makes them even happier. To each group, it seems that only 5% or 0.15% of people oppose their viewpoint, a perception that, while supported by the data they see, is not necessarily accurate. Both sides believe they are on the right side.
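The lopsided percentages above can arise from a simple feedback loop: the feed shows more of whatever the user engages with, and engagement drifts toward the user's prior view. The following toy simulation illustrates this under assumed parameters; the `simulate_feed` helper, the click probabilities, the feed size, and the number of rounds are all hypothetical, not a description of any real platform.

```python
# Toy simulation of a feed that mirrors engagement. Starting from a 50/50
# mix, a modest click preference for one's own viewpoint quickly pushes
# the feed toward near-total agreement. All parameters are illustrative.
import random

random.seed(0)

def simulate_feed(initial_share=0.5, user_bias=0.9, rounds=50, feed_size=100):
    """Return the final share of 'viewpoint A' items shown to an A-supporter.

    initial_share: fraction of A-content in the first feed.
    user_bias:     probability the user clicks A-content (vs. 1 - user_bias for B).
    """
    share = initial_share
    for _ in range(rounds):
        clicks_a = clicks_b = 0
        for _ in range(feed_size):
            is_a = random.random() < share
            p_click = user_bias if is_a else 1 - user_bias
            if random.random() < p_click:
                clicks_a += is_a
                clicks_b += not is_a
        total = clicks_a + clicks_b
        if total:
            # The next feed mirrors observed engagement: the loop closes here.
            share = clicks_a / total
    return share

print(f"Feed share of own-view content after 50 rounds: {simulate_feed():.0%}")
```

Under these assumed settings the feed converges to well over 90% same-view content within a few rounds, which is the mechanism behind the skewed impressions described above.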

This is how observational errors and biases are generated. Recommendation algorithms create a false sense of reality by tailoring what we see to our preferences, effectively constructing a small, personalized circle of information. It is therefore crucial not to mistake this limited perspective for the entire world. The information bias created by such systems, together with other factors, can easily amount to a form of brainwashing that molds our lives and personalities.

Modern online platforms can use recommendation algorithms to inject data into the information flow and shape our views, for instance, leading people to perceive that COVID-19 is not a serious issue. Likewise, bullying others seems to have become easier, because people feel emboldened by their large numbers and by their own logical reasoning, even when it rests on incorrect facts. They believe that with so many people on their side, their collective voices and criticisms can overwhelm dissenting views. Consequently, society becomes increasingly polarized and extreme. All of this is related to the influence and drive of recommendation algorithms, which create misleading impressions.

In today's world, many individuals can hold only one viewpoint at a time. In this era when recommendation algorithms spread like a pandemic, suppressing opposing views is often perceived as a form of "cleansing". Comprehensive and objective understanding usually resides with a small group of specialists, who alone can offer diverse and thorough information. Yet due to information overload, observational bias, and information filtering, even these experts may struggle to gain broader recognition.

Final analysis conclusion:

In the era of information overload, the recommendation algorithms of modern online platforms pose a significant problem, fueling widespread societal extremism and polarization. Unfortunate as it may be, thorough and objective understanding is now the domain of only a very few specialists.
