The spiritual case for banning TikTok

Published in UnHerd.


Four years after Donald Trump’s failed attempt, Democrats and Republicans have just passed a bill which could see social media giant TikTok banned in the US. “We are united in our concern,” said spokesmen for the two parties, “about the national security threat posed by TikTok — a platform with enormous power to influence and divide Americans.” The move was made in response to fears that the app’s parent company ByteDance shared user data with the Chinese government, and is part of a wider effort to mitigate the dangers of major firms managed by “foreign adversaries”.


Could — or should — the UK follow in their footsteps? Even if these security concerns weren’t enough, the app’s social impact should always have been cause for alarm. There is something uniquely pernicious about TikTok’s algorithm, which feeds users endless loops of short-form content — content that is increasingly sped-up and, inevitably, dumbed-down. It is hardly a climate for cultivating informed or thoughtful opinions, and yet 28% of teenagers now use TikTok as their primary source of news and political activism.


More subtly, it is an algorithm that renders users almost completely passive. TikTok does not even require the autonomy of scrolling, like other social media giants. Instead, it automatically plays a series of videos that become hyper-personalised with even the slightest engagement.


Rewatching a video just once triggers the algorithm to show more and more related content, which has been found to lure teenagers into spirals of exposure to self-harm, eating disorders and harmful ideologies. Arguably more so than other platforms, TikTok fixates users on content that appeals to their darkest impulses and insecurities — content they would not otherwise actively choose to consume.


Shrug it off as “doomscrolling”, but as a society we should take anything that encourages such behaviour seriously. In this case, the medium really is the message: the lack of agency wired into the TikTok algorithm tells us everything we need to know about its tendency to manipulate and corrupt young minds.


A little under a decade ago, the French philosopher Bernard Stiegler warned that this kind of technology is dangerous precisely because it removes the need for rational agency in the consumption of content. This he described in terms of “participation”: whereas historically human beings would actively participate in the entertainment they consume — for example, playing an instrument in order to experience music — we are now subjected to media as passive recipients.


While harmless in small doses, the tyranny of constant stimulation inhibits our potential for creative and rational thought. This, Stiegler said, has a stupefying effect, making society susceptible to all kinds of manipulation. Gen Z’s shortening attention span, then, could be just the tip of the iceberg.


This is hardly a good situation for democracy. Healthy societies depend on the participation of rational subjects, capable of making decisions for the overall good of society. This has, of course, always been difficult to achieve, but any technology which artificially exacerbates the worst tendencies of human nature ought to be condemned — especially when that technology is steered by forces beyond our control.


Political philosophers as far back as Plato saw the great risk to democracy arising when people exercise freedom in accordance with individualistic desires rather than the common good. It is at this point that democracy turns into tyranny, beginning with the enslavement of people to their own irrational impulses and addictions.


Spyware or not, TikTok is a technology which, by design, manipulates its users in ways that undermine autonomy and, as a consequence, genuine democracy. This alone should be reason enough to question its place in Western societies — not least in times of geopolitical instability.
