Should Social Media Be Banned?


A recent story in British newspapers linked the suicide of a 14-year-old boy to social media use. In 2017, a 12-year-old girl in Miami streamed her suicide live on Facebook.

These and similarly tragic tales have boosted an already fervent debate on the links between social media engagement and mental health.

At 2030Plus, we’ve been reviewing for some years the links between cognitive function and internet involvement. Other, larger research organisations have done likewise. There is little doubt that a growing reliance on digital technology has changed the way our brains work.

Governments are under pressure to act. Last year, the Australian government opened an investigation into Facebook, the largest and, for many, the most troubling of the new media giants.

The company released, without authorisation, data from 300,000 Australian user accounts to the now defunct Cambridge Analytica. Worldwide, Facebook compromised the privacy of 87 million accounts in this way.

Meanwhile, Germany’s Federal Cartel Office has considered penalising Facebook for gathering data on non-Facebook users without their knowledge.

In Britain, a government minister has proposed banning social media platforms that refuse to remove harmful material from their sites. This comes on the back of consistent reports of unhealthy or potentially illegal material on YouTube and other platforms.

This sounds laudable, but is the threat of a ban the right approach? After all, huge numbers of people rely on these platforms, as gateways into news gathering and business opportunities, as well as day-to-day communication. Are there more workable starting points?

At the very least, social media companies ought to be treated like drug companies. There are clear links now between social media use and a recognised mental condition known as internet addiction.

Drugs are subject to government regulation. When we use a legal drug, whether it’s for our physical or mental health, we can be sure that our government has deemed it safe. The same should apply to social media platforms.

As an immediate starting point, we might take the approach used with film ratings.

In many countries, movies are given ratings by a government-appointed film board, based on their suitability for certain age groups. The distributors must then display the rating at every point of delivery. Why can’t that be applied to social media?

We might also mandate video commercials, displayed on users’ home pages, depicting the potential negative impacts of social media.

Governments certainly need to insist upon proof-of-age for social media accounts. We do this with credit card applications and travel booking sites; why not social media?

Requiring proof of identity might also be useful, especially in reducing the terribly high incidence of online trolling and bullying. Anonymity encourages social disinhibition - people feel free to insult others online in ways they wouldn’t dream of doing face-to-face.

Recently, Facebook was accused of blocking a British-built app that monitors political interference on Facebook’s platform and its impact on elections. Apparently, some social media groups are happy to practise censorship or editorialising when it suits their interests.

Yet when it comes to protecting young people, some refuse to (as they see it) “interfere” with the activities of private users. When the mental health of young people is at stake they say, “We’re just a platform, we have no responsibility for content.”

This is a blatant double standard which only governments have the power to challenge.

If social media groups want to act like the publishers and news curators of our time, they must accept the concomitant social responsibility.

Some would like us to believe that they’re still just maverick organisations, part of the wild west of the internet. But these groups are now multi-national corporations. They rake in huge profits from advertising and from selling our data, while often paying comparatively little tax.

Social media companies also pay expensive lobbyists to argue their case to governments. They should not be able to behave as if they are the put-upon “little guys”, standing for freedom of expression, against greedy or ignorant governments.

Some of them will clearly not, of their own accord, accept responsibility for the social impacts of their sites. Outside regulation is a must.

New laws are not the only answer, though. There are huge opportunities here for educators.

Individual schools sometimes include social media training in their curricula, but - in the British context at least - a more joined-up approach is needed, from the earliest years in school. Concerned teachers and administrators can help bring pressure for change - as can parents.

Many of today’s educators were trained under a system that advocates “value-free” education. This makes some education administrators uneasy about opposing activities that might be supported using free expression arguments.

There is, however, no such thing as value-free training. All education carries with it a series of values.  It’s time we recognised the urgency of teaching young people the ethics that underlie a healthy approach to privacy and civility in public discourse.

This would help them to be more discerning about the limitations of technology as well as its opportunities.

There is a vital role here for community leaders, too. Our culture may be averse to the preaching of morality, but it will often listen to a sensible approach to ethics.

This is especially true when the ethics one espouses are demonstrably helpful to the mental health of young people.

Community groups - including charities, clubs and religious groups - can set up support groups for young people who experience mild depression. Sufferers often turn to social media for support, only to find that it exposes them to dark and distressing material.

Sending senior staff members on approved short courses in identifying and alleviating mild depression - in volunteers or team members, for example - would assist early detection.

Designated support groups would also help those who face the misery of online bullying.

Community leaders can also help by modelling a more healthy, well-rounded approach to social media use. Very often, leaders of organisations see social media merely as opportunities to promote an agenda, project or event.  

This approach ignores the fact that social media are at their best when used for fostering conversation and the respectful sharing of ideas that add value for others.

Social media represent not just an opportunity to indulge in some free (or at least cheap) marketing. They represent an opportunity to proactively serve the city, for the common good.

Community organisations can become more proactive about engaging with the human impact of emerging technologies in general. The health and privacy implications of social media present us with a brilliant opportunity to make a start down that road.

© Copyright with Mal Fletcher
