The Unforeseeable Dystopia

In the not-so-distant future, you wake up to the soft hum of morning light filtering through your curtains only to find that your thoughts are not entirely your own. As you rub the sleep from your eyes, a subtle murmur permeates your consciousness: a voice that is not yours, yet seems to rise from the depths of your own mind. It is not your inner monologue; it is an advertisement, an intrusive whisper heralding the latest product with unnerving precision: "Buy the newest Neuralink Enhancement Module," the voice insists. You wonder, "How did this happen? Have my dreams been hijacked by the digital realm?" This dystopian awakening is not a figment of science fiction but a chilling potential future. The intrusion of ads into the sanctity of one's thoughts serves as a disconcerting prologue to an age where technology and consciousness intertwine, a narrative that urges us to question the price we pay for progress. In the ever-evolving landscape of technology, social media has become an integral part of our daily lives. However, the question arises: should we embrace this digital frontier or sever ties with it altogether? This essay contends that the time has come to take a definitive stance and delete our social media accounts. The ramifications of prolonged exposure to these platforms are vast and multifaceted, ranging from the modification of behavior to the insidious intertwining of technology and our cognitive processes.

In this broad belt of human unconsciousness, which is immune to conscious criticism and control, we stand defenseless, open to all kinds of influences and psychic infections. Social media, once a tool for communication, has transformed into a behavior modifier. It not only reflects but actively shapes our beliefs and perspectives. A study published in the Annals of the International Communication Association found that individuals who frequently engage with content supporting their political views become more entrenched in those beliefs, leading to increased polarization (Kubin). The research shows that people become more polarized and less moderate in their views when exposed to content that validates their pre-existing opinions. The result is a fractured society, with individuals inhabiting echo chambers that reinforce their biases (Lanier). Algorithms designed to maximize engagement and keep users scrolling serve as powerful amplifiers of this effect. As a result, people are increasingly isolated from diverse viewpoints and are less likely to engage in constructive dialogue with those who hold different opinions.

Beyond behavior modification, social media operates on a psychological framework akin to gambling. The allure lies in the perfect mix of good and bad experiences, much like a slot machine. Platforms such as TikTok, Instagram Reels, and YouTube Shorts exploit this dynamic, creating addictive loops in which users hope for a payoff of engaging content amid a sea of mundane or even harmful material. Social media, in many ways, resembles a form of digital gambling, drawing users into a cycle of engagement not entirely different from the allure of casinos. Much like gamblers who continue to bet despite losing streaks, social media users often find themselves scrolling endlessly, even when they encounter content they find offensive or disagreeable (Zentall). This paradoxical behavior can be attributed to the intermittent reward system employed by both social media platforms and gambling establishments. Just as the house allows the gambler to win occasionally to keep them hooked, social media offers a mix of desirable and offensive content, ensuring that users keep coming back, hoping to strike the digital jackpot of compelling posts or interactions. This peculiar willingness to tolerate occasional losses, or in the case of social media, unwanted content, speaks to the powerful psychological grip these platforms have on our behavior and emotions.

The recent global pandemic has catalyzed this shift in societal norms, pushing more interactions into the digital realm. This has given rise to an altered reality where physical and virtual spaces intertwine. As we navigate this new normal, the desire to return to the "good old days" becomes a nostalgic longing that may never be fulfilled. Since the pandemic began, people have been spending more and more time on social media, yet fewer and fewer are feeling those social connections in person; 56% of people feel more isolated since the onset of the pandemic, and 29% report feeling more depressed (Davis). In my own experience, my current in-person biology class consists of 55 students, and a mere 15 show up on a consistent basis, so much so that my professor had to make attendance an assignment just to get a few more heads in the room. On the opposite end of the world stage, China has begun implementing software and devices that monitor students' focus levels through headsets, a clear testament to the growing surveillance state facilitated by technology: "Chinese pupils must wear 'mind-reading' headbands which scan their brains and will alert teachers if they are not concentrating in class" (You). Apple's recent patent for a brain wave sensor further exemplifies the encroachment of invasive technologies into our personal lives.

The insidious control exerted by tech giants like Google, Facebook, and YouTube extends beyond mere surveillance. These platforms aim not only to influence behavior but also to control and manipulate users for financial gain. The Wall Street Journal's reporting on Facebook's "outrage algorithm" revealed how platforms promote sensational and divisive content to keep users engaged, thereby maximizing advertising revenue (Horwitz). In China, the birthplace of TikTok, there is a frank acknowledgement of the inherently addictive nature of this slot-machine-style social media content. Consequently, Chinese regulators have implemented measures to address this concern, including not only imposing screen time limits on the app but also curating the content to predominantly feature educational material. This proactive approach reflects a clear understanding of the potential harm associated with unrestricted social media usage and underscores a commitment to safeguarding users from the adverse effects of excessive engagement with such platforms. The Chinese example serves as a compelling illustration of the responsibility that social media creators and regulators across the globe, including the United States, ought to assume in mitigating the addictive qualities of these digital spaces.

Even platforms like YouTube, once considered a haven for informative and entertaining content, are succumbing to the addictive allure of short-form videos. The shift toward YouTube Shorts undermines the platform's original purpose, leaving users, including those who had distanced themselves from mainstream social media, struggling with addictive content. This change represents a notable departure from YouTube's original standard of content, as the platform increasingly caters to the demand for quick, snackable videos. Consequently, users (including myself) who initially sought refuge from mainstream social media now find themselves grappling with the captivating yet potentially addictive nature of these shorter videos.

While there are positive aspects to social media, such as connecting with friends and sharing ideas, the pervasive negative consequences overshadow these benefits. Viewing social media as a mere bulletin board for information rather than an authoritative source is a healthier perspective. Instances of misinformation and manipulation on social media have been widely reported, undermining the credibility of the information shared on these platforms.

The old saying holds true: if a service is free, you are the product. The monetization of user data, facilitated by targeted advertising, transforms users into commodities sold to the highest bidder. An investigative piece by Forbes highlighted how social media platforms profit from user data by selling targeted advertisements, emphasizing the commodification of user information (Leetaru). The practice of monetizing user data raises significant privacy concerns. As users interact with various online platforms, they unknowingly leave digital footprints that are harvested and analyzed for commercial purposes (Lanier). This not only undermines the privacy of individuals but also poses a threat to their autonomy, as their personal information is used to manipulate consumer behavior. Furthermore, the lack of transparency in data collection and usage practices often leaves users in the dark about how their information is being utilized, calling into question the ethical implications of such business models.

As technology advances, the integration of artificial intelligence and brain-machine interfaces becomes a looming reality. Elon Musk's Neuralink, while holding promise for enhancing human capabilities, raises concerns about the extent to which our cognition will be tethered to AI, potentially amplifying the risks associated with unchecked technological advancement. Musk's statement in Forbes regarding Neuralink's potential to mitigate the risks of AI underscores the intertwined future of human consciousness and artificial intelligence (Hart). This integration poses fundamental questions about privacy and the preservation of humanity as we know it. As our thoughts and neurological patterns become increasingly interfaced with AI systems, there arises the potential for unprecedented forms of surveillance and manipulation. The ethical and societal implications of such technologies are profound, ranging from the potential loss of individuality to the blurring of the line between human and machine intelligence. Moreover, reliance on AI-enhanced brain interfaces could create new forms of dependency, in which individuals are no longer able to function or make decisions without the aid of technology. These concerns highlight the need for robust ethical frameworks and regulatory oversight to ensure that such advancements benefit humanity without compromising fundamental human values and rights.

In conclusion, the convergence of technology, surveillance, and behavioral modification paints a dystopian picture. The current trajectory, marked by addiction, manipulation, and corporate control, leaves me little room for optimism. The merging of AI with Neuralink or Apple's brain sensors points to a future where those who resist the integration of technology risk being left behind economically and socially. These technologies resemble the biblical metaphor of Adam and Eve's apple from the Tree of Knowledge, bestowing a newfound consciousness and unveiling a realm previously unexplored, except this realm is one of total control by your favorite social media companies. Today's technology is merely the tip of the iceberg, and as we hurtle toward an uncertain future, the choice to delete our social media accounts emerges as a radical act of reclaiming our minds in an increasingly controlled world. It is a decision to resist the addictive allure, the surveillance state, and the corporate manipulation that define our digital existence. The hope for a better future lies not in blind optimism but in a collective awakening to the perils of unchecked technological progress. The time to act is now, before we find ourselves irreversibly ensnared in the web of our own creation.

Works Cited

Davis, Sarah. "Form Relationships since COVID-19, Survey Reveals." Forbes, 28 Sept. 2023, www.forbes.com/health/mind/social-anxiety-since-covid-survey/. Accessed 7 Nov. 2023.

Hart, Robert. "Elon Musk Says Neuralink Could Slash Risk from AI as Firm Prepares for First Human Trials." Forbes, 26 Sept. 2023, www.forbes.com/sites/roberthart/2023/09/21/elon-musk-says-neuralink-could-slash-risk-from-ai-as-firm-prepares-for-first-human-trials/?sh=46c30c782956. Accessed 9 Nov. 2023.

Horwitz, Jeff. "The Facebook Files, Part 4: The Outrage Algorithm - The Journal." WSJ Podcasts, 18 Sept. 2021, www.wsj.com/podcasts/the-journal/the-facebook-files-part-4-the-outrage-algorithm/e619fbb7-43b0-485b-877f-18a98ffa773f. Accessed 8 Nov. 2023.

Kubin, Emily. "The Role of (Social) Media in Political Polarization: A Systematic Review." Annals of the International Communication Association, 2021, www.tandfonline.com/doi/full/10.1080/23808985.2021.1976070, https://doi.org/10.1080/23808985.2021.1976070. Accessed 7 Nov. 2023.

Lanier, Jaron. Ten Arguments for Deleting Your Social Media Accounts Right Now. Random House UK, 29 May 2018.

Leetaru, Kalev. "What Does It Mean for Social Media Platforms to 'Sell' Our Data?" Forbes, 12 Oct. 2022, www.forbes.com/sites/kalevleetaru/2018/12/15/what-does-it-mean-for-social-media-platforms-to-sell-our-data/?sh=bee4b32d6c4d. Accessed 9 Nov. 2023.

You, Tracy. "Chinese School Makes Pupils Wear Brain-Scanning Headbands in Class to Ensure They Pay Attention." Mail Online, Daily Mail, 31 Oct. 2019, www.dailymail.co.uk/news/article-7634705/Chinese-school-makes-pupils-wear-brain-scanning-headbands-class-ensure-pay-attention.html. Accessed 8 Nov. 2023.

Zentall, Thomas R. "An Animal Model of Human Gambling Behavior." Current Research in Behavioral Sciences, vol. 4, 1 Jan. 2023, p. 100101, www.sciencedirect.com/science/article/pii/S2666518223000062, https://doi.org/10.1016/j.crbeha.2023.100101. Accessed 7 Nov. 2023.