Algorithms are turning you into a troll

By Maria Barquete,
Staff writer

We watched a documentary called “The Social Dilemma” in my English 2 Honors class. It was about the dangers of social media, including how apps use their algorithms to catch our attention and how they make money by selling our eyeballs to their advertisers. Those algorithms show us only what we want to see. They know what we will agree with.

Depending on our opinions and interests, our phones show each of us a completely different world. This leads people to misunderstand one another, since they never encounter other points of view. To end this disconnection, we need to make an effort to look beyond what our phones want us to see.

I have even seen this in our school, when people viciously attacked Rampage’s current Co-Editor-in-Chief, Noah Shifter, on social media for the article he published about abortion. Most of the Instagram stories I saw about the topic did not contain counterarguments to the facts he was presenting; instead, they attacked Noah himself, or made broad claims that he had written something unacceptable. It was almost as if the people attacking him had only glanced at the article’s headline or skimmed its first paragraph, without reading its actual content.

Some may argue that I am defending Shifter because I agree with his beliefs. However, I am not taking either side here. Rather, I am defending peaceful discussion of the topic.

This instant hatred that erupts whenever people see a differing opinion is amplified by the way social media shows us only what we want to see. It happens because we choose to get our information from places we know will make us feel comfortable. When we are faced with opposing ideas, it enrages us. We are unable to understand how someone could think differently than we do — and I say we because I have recognized that tendency in myself.

A Wall Street Journal investigation found that “when [YouTube] users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.” This shows how algorithms fuel an extremism that leaves people unable even to hear what the other side has to say. Stanford economist Matthew Gentzkow told the American research group Brookings Institution that “people are seeing political content on social media that does tend to make them more upset, more angry at the other side [and more likely] to have stronger views on specific issues.”

Each one of us can help solve this by popping the media bubbles that block us from viewing and understanding the other side of the argument. Whenever we hear or read something about a topic, it is important to verify that the information is accurate and to seek out counterarguments to get the full scope of the story. Only then will we be able to decide who or what we agree with.

If each of us lives in a different world created by algorithms, it becomes harder to understand each other and reach conclusions together.