Holdan Hitchcock
Associate Opinion Editor

In my senior year of high school, my classmates and I were assigned a task by our English teacher: persuade our audience of anything. It wasn't meant to be a political assignment, although I'm sure a few students presented political ideals. I went with a more topical discussion. I remember seeing news reports around that time claiming that social media was in fact causing more harm than good. I certainly didn't believe this notion, so I set out to persuade my peers that these reports were bogus. If you were in that first-period English class with me at Wyalusing Valley, I would have told you about the true blessings that social media and tech have provided. A few examples I poured out were being able to get news from reputable sources within a few button presses, the connections created with one another that increased dopamine levels, and the entertainment content found on the web. I argued that social media and other ventures in the tech social space brought about good, systemic changes in human interaction. Yet this piece isn't about how I got an A- on a high school presentation. I reference this story because after viewing the Netflix original documentary "The Social Dilemma," I do not believe I'd give the same optimistic presentation today.

"The Social Dilemma" interviews engineers and insiders about some hidden truths of the tech industry at large. The documentary opens with Tristan Harris, a former engineer at Google and co-founder of the Center for Humane Technology, who is presented as the catalyst for the whole documentary. Harris states at the beginning of his interview that he was working at Google at a time when the company was redesigning how email notifications work, and he was feeling burnt out. Harris says that even he was addicted to his email, and he began to question the integrity of what Google was doing: "Shouldn't we be making this less addicting?" he asked. The idea gained traction with hundreds of employees at Google and was even brought to the attention of Larry Page, one of Google's co-founders.

Ultimately, the idea would never come to fruition at Google. When most people think of social media and other tech ventures, they usually break them down to the simplest version of what they are: Google is a search engine, Facebook is where I interact with my friends, Twitter is where I see a feed tailored to my hobbies and interests, and YouTube is where I go for short-form entertainment. The reality is that this really isn't the case. All of these companies are in the advertising business, and they all compete for your time and attention. Harris cites a common phrase in the industry: "If you're not paying for the product, then you are the product." This is a sentiment that author and computer scientist Jaron Lanier, considered a founding father of virtual reality, doesn't fully get on board with. In his interview in the documentary, Lanier says the idea that you are the product is "too simple"; it's the gradual, slight change in your perception and behavior that is the product.

The practice of collecting your data and using it, by way of an algorithm, to predict what you would like to see is what is called surveillance capitalism. Surveillance capitalism remains a prominent source of revenue for advertisers. Former presidential candidate Andrew Yang has stated on Twitter that users should be able to profit from the data being collected from them.

So how do they do it? How is all that data tracked, and what is being tracked? It's not something I really thought about until watching the documentary; then it all became quite clear as I was writing this. While writing this article, I received several notifications from apps I have not used in quite some time (I'll preface this by noting that I have a Google Pixel phone, which matters here shortly). The apps that notified me were Moe's Southwest Grill, telling me I could get double reward points today (the nearest Moe's is two hours away from where I currently live), and the ESPN app, alerting me about tennis matches the following day. The ESPN notification is bizarre because I'm not a fan of tennis, but I have a brother who is very into tennis and mentioned it recently in a group chat with my two older brothers and my dad. That group chat runs through Google's Hangouts messaging app. More commonly, you'll see ads pop up in a Twitter feed or before a YouTube video, usually tailored to your data.

This time, though, the tailoring had reached back into a group chat to shape my feed. And Moe's Southwest Grill just misses me and noticed I haven't eaten there in a while. I miss Moe's, but that is beside the point. That is just an example of surveillance capitalism that I happened to notice because it falls outside my niches. Notifications are just one of the many design elements the tech and social media industry uses to grab our attention.

That design element may seem trivial, and on its own it doesn't raise much concern that social media is harmful. The serious concern lies with some of the tools social media has brought about, particularly those shaping self-perception on Facebook, Instagram, and Snapchat. Chamath Palihapitiya is a venture capitalist and former Facebook executive who was in charge of user growth. He isn't interviewed in the documentary, but it includes footage of a talk he gave at Stanford in 2017 about how Facebook's tools and design deflate personal perceptions.

“We compound the problem. … We curate our lives around this perceived sense of perfection because we get rewarded in these short-term signals: hearts, likes, thumbs up. And we conflate that with value, and we conflate it with the truth. And instead what it really is is fake brittle popularity – that’s short term, and that leaves you even more, and admit it, vacant and empty [than] before you did it, because it forces you into this vicious cycle where you’re like, ‘What’s the next thing I need to do now?’ ‘cause I need it back,” said Chamath Palihapitiya.

Something as simple as the "like button" has created detrimental consequences for one's mental health. Unfortunately, this type of perception-feeding loop is most common, and most predatory, on sites like Facebook and Instagram. These loops prey on Gen Z the most. As stated in the documentary by Dr. Jonathan Haidt, a social psychologist at NYU's Stern School of Business, Gen Z is the first generation to have had social media since middle school or earlier, and because of that, they are the generation that is most depressed, most anxious, most fragile, and least likely to take risks. Dr. Haidt also cites statistics gathered by the CDC on self-harm in preteen and teenage girls: among girls ages 15-19, self-harm has increased 62 percent and suicides are up 70 percent since 2009; among preteen girls ages 10-14, self-harm has increased 189 percent and suicides are up 151 percent. We have allowed these tools and mechanics to warp our perceptions of ourselves, to the point where we measure our happiness in likes and set our self-esteem by unrealistic standards of beauty. Even I have fallen for it.

In 2019, I fell into this trap of negative self-perception inadvertently, through circumstances in life and the things I would see on Twitter and Instagram, and later in YouTube video recommendations. Back then, I had gone through a break-up in what was my first serious relationship. And as everyone knows, break-ups are sucky.

So, as a dumb 20-year-old kid living by himself in Pittsburgh, I thought it would be in my best interest to buy a gym membership that made no financial sense. Why would I buy a gym membership? Because of things I saw on Twitter and Instagram. I distinctly remember a tweet that showed up on my feed because somebody I followed liked or retweeted it. I don't remember exactly what it said, but it was along the lines of "Men under 5'8" are useless."

That tweet had somewhere over 30,000 retweets and over 100,000 likes. I bring this up because, as a man well under 5'8", knowing that more than 100,000 people endorsed the idea that I am "useless" is incredibly jarring. The human brain isn't suited to comprehend how thousands of people may perceive us. This is something I've learned to get over as I've gotten older, but surely this type of thing is happening to teens all over the country on a daily basis.

The tools of social media sites are detrimental not only to our perceptions of ourselves but to society's perception of information. The example given in "The Social Dilemma" is that if you type "climate change is..." and let Google auto-complete it, you'll get different suggestions based purely on where you are physically in the world. The same is true of Facebook and Twitter feeds, where each user has their own echo chamber tailored to them. The clearest examples of how people can receive totally different streams of information come from the way Facebook's and Twitter's algorithms read their users.

For example, Facebook's algorithm is designed in a way that can identify users who are susceptible to believing conspiracy theories and suggest conspiracy groups to them, such as Pizzagate and QAnon. Most users would never see this type of misinformation. Sometimes these conspiracy theories are as frivolous as flat-Earth theory, and sometimes they are extremely dangerous, as with Pizzagate. The Pizzagate conspiracy was rooted in the idea that ordering a pizza meant a human being was being trafficked, and this misinformation drove a gunman to storm a pizza shop in search of a basement that did not exist, because he believed there was a pedophile ring. Facebook has essentially created a propaganda machine that has been used by governments in developing countries to control their citizens.

Toward the end of Netflix's tech documentary, the interviewees reflect that the existential threat of this disinformation age is not the phone itself, or advertisers getting you to watch one more video or spend five more minutes on a site. The biggest threat is that misinformation is causing more polarization and more division, creating offline harm that is often violent. Tim Kendall, a former president of Pinterest, was asked what he is most worried about happening in the near future. He responded, "In the shortest-horizon…Civil War." How horrifying is that? Some of the very people who created these tools of technology and social media believe that, at the current rate, we will be in a civil war in the near future.

How do we stop this prospect? Well, the opinion shared by many of the engineers in "The Social Dilemma," and one I agree with, is that the problem isn't the tech or the social media itself. The real problem is the business model. There is no regulation on how companies mine users' data. The idea is that if you regulate or tax the amount of data taken from users, there is no longer an incentive for companies to pursue all the data out there, because doing so wouldn't fiscally benefit the company or its shareholders. The reason none of this has changed is that the criticism hasn't reached the mainstream, and who knows if it ever will? The failings of these tools are not widely known, and that is why this documentary was made: to shine a light on the failings of the tech industry, not in the tools it has created, but in its negligence toward the ways those tools are being used to do harm.