Are You Being Manipulated?

By Ines Garcia (Y11)

propaganda
/ˌprɒpəˈɡandə/
noun
1.
information, especially of a biased or misleading nature, used to promote a particular cause, doctrine, or point of view.

The idea for this article came to me while studying for my history IGCSE, or more accurately, procrastinating by scrolling through TikToks that summarized the content instead of actually revising. It was through a video discussing the propaganda used in Nazi Germany that I began reflecting on propaganda itself. I wondered how many people truly believe they have not encountered propaganda in recent years, and beyond that, how many think they have not been influenced by it.

I do not blame them. In today’s digital world, propaganda presents itself very differently. It is no longer bold or striking. Instead, it is calculated and precisely targeted through algorithms. We no longer walk through cities surrounded by colorful posters urging us to join a cause. Now, we open our social media feeds and absorb thousands of subtle, carefully selected messages, often without even noticing.

When I think about this new wave of influence, I am reminded of the viral video of a young Barron Trump playing with his suitcase, or “sootcase,” as the internet took to calling it. I laughed at it myself before scrolling on without a second thought. At the time, it felt harmless, but looking back, it raises questions about how easily we consume and move past content without considering its purpose or impact.

Young Barron Trump in the devil’s office (Credit: CNN)

The spread of ideas by companies, brands, political parties, and even governments is now so subtle that it often goes unnoticed by people who turn to the internet for distraction. In this modern, increasingly digital world, the attention economy has grown exponentially: our hunger for distraction and constant stimulation has created its own competitive market, with companies fighting to keep you entertained and engaged on their platforms. Click, scroll, post, repeat. This cycle has become a defining feature of modern life. As a result, more understated messages blend into the background, and we become desensitized to underlying political or persuasive content.

This raises the concern that, in the future, our internet-focused society will be plagued by subliminal messages and not bat an eye. In such a world, propaganda would not need to be obvious to be effective. Its strength would lie in its invisibility.

In a somewhat cynical sense, there is little we can do to completely avoid this, short of abandoning the internet altogether and becoming Amish, which is clearly unrealistic for most people. Given the choice, many would likely accept the risk of misinformation rather than disconnect entirely.

This idea connects to a widely discussed conspiracy known as the Dead Internet Theory. This theory suggests that much of the internet is no longer driven by real human interaction, but by bots generating content, comments, and engagement. According to its supporters, these artificial interactions are designed to imitate human behaviour and influence public opinion.

While the theory itself may be exaggerated, there is some truth behind it. Bots are already used to spread misinformation and to amplify certain narratives. Influencers and companies can purchase engagement to promote products, making them appear more popular and ensuring they reach wider audiences.

More concerning, however, is the use of these techniques by governments. Russia, for example, has been associated with a strategy known as the Firehose of Falsehood. This approach involves rapidly spreading large volumes of information, often inconsistent or misleading, in order to confuse audiences or shape their perception of events. Online trolls are paid to post comments and messages that support a specific narrative, while opposing viewpoints are targeted and undermined.

This method is particularly effective because repetition creates credibility. When an opinion is reinforced multiple times by seemingly different individuals, it begins to appear more trustworthy, regardless of its accuracy. With many social media videos or posts, sources are rarely provided. This allows such tactics to thrive, as users are less likely to fact-check claims that are widely supported in comment sections. In these cases, quantity begins to resemble quality in the reader’s mind.

Returning to TikTok, there are several patterns I have noticed as a user myself. One is the way brands adopt a relatable and quirky tone, commenting on viral posts in an attempt to capture attention and appear relevant. While this may seem harmless, it is ultimately a marketing strategy. A company like Sour Patch Kids does not genuinely care about someone’s cute dog that just so happened to get one million likes on Instagram. Instead, it is using the moment to promote itself. This form of advertising is effective because it creates the illusion that large corporations are part of everyday online interactions, when in reality they are simply capitalizing on them.

Similarly, political parties have begun using platforms like TikTok to share memes and humorous content in order to appeal to younger audiences. While this may seem like a modern way to engage voters, it also blurs the line between entertainment and political messaging. Moments like the viral clips of Barron Trump, including the suitcase video, show how easily political figures can be turned into digestible, shareable content that shapes public perception without deeper analysis.

In conclusion, the evolution of propaganda in the digital age was almost inevitable, given how much time we spend online. However, it is important to remain aware of what we are consuming. It is always worth asking: Is this simply entertainment, or is it trying to influence how I think?
