
Ofcom given new powers to regulate social media content

by Mark Tyson on 12 February 2020, 13:11

Tags: Ofcom, UK Government, Twitter, Facebook

Quick Link: HEXUS.net/qaeiop


UK government Digital Secretary Nicky Morgan and Home Secretary Priti Patel have recommended that Ofcom be appointed as the official online harms regulator. This decision is the government's initial official response to the Online Harms White Paper consultation, and more details concerning the scope of the regulations and Ofcom's powers, as well as freedom of speech protections, will be forthcoming.

The idea of the white paper is to change social media regulation in order to protect children and vulnerable people in particular, and to give consumers greater confidence to use technology. Social media firms are keen to stay self-regulating, but with some companies this means little or no regulation at all, or simply that money, fame, or power allows what rules there are to be bent.

UK government ministers reckon it is now time to step in by "enforcing a statutory duty of care to protect users from harmful and illegal terrorist and child abuse content". This move is said to follow through on the government's pledge to make the UK "the safest place in the world to be online".

As the stick with which to beat unruly social media firms, Ofcom will get new powers to carry out its extended responsibilities. Nicky Morgan said that Ofcom will provide a "proportionate and strong regulatory regime" to nurture a thriving digital economy which is trusted and protects all users. Morgan dismissed concerns that this outside regulation could stifle the vibrant and open internet. Priti Patel said that a strong regulator would "ensure social media firms fulfil their vital responsibility to vulnerable users," and would help stop criminals from using social media for their benefit. In the government's own news blog about the Ofcom appointment, children's charity Barnardo's was very supportive of the plans, citing the growing risks to minors online.

In a statement about the news emailed to HEXUS, Emma Woollcott, Head of Reputation Protection at Mishcon de Reya, was generally supportive. "Regulating the behaviour of global corporations will always be challenging – but essential if we are to ensure that platforms take greater responsibility in exercising the enormous power they wield. The possibility of meaningful sanctions when platforms fail to properly protect users should drive greater investment in transparent complaints processes and shorter response times," wrote Woollcott.

Another email we received, this one from the Internet Services Providers' Association (ISPA), was a little more cautious. "In order to effectively address online harms, it is important for interventions to be targeted at the specific part of the internet ecosystem, so we welcome the proposed approach of focusing measures on platforms that facilitate user generated content," wrote Andrew Glover, the Chair of ISPA. Glover went on to raise potential issues about ISP-level blocking and technical developments such as DNS-over-HTTPS.

Going forward, Ofcom may be tasked specifically with deciding which platforms fall under the scope of regulation, and then making sure social media companies remove illegal content quickly and take steps to stop it appearing in the first place, with "particularly robust action on terrorist content and online child sexual abuse". In contrast to previous, ill-fated government internet regulation initiatives, the regulator will not do anything to "stop adults from accessing or posting legal content that some may find offensive".

Expect a follow-up to the above recommendations by spring, so in a few weeks we should find out further details of the potential enforcement powers Ofcom may have.



HEXUS Forums :: 30 Comments

Welcome to the kingdom of united censorship. How may we help you and your fascist beliefs? woops
PC-LAD
Welcome to the kingdom of united censorship. How may we help you and your fascist beliefs? woops
I have no problem with people helping to protect my daughter from the utter filth she might encounter on social media and the like. I see it as no different to having Mods and Admins on a forum…… and that seems to work rather well.

Also, "the regulator will not do anything to stop adults from accessing or posting legal content that some may find offensive"… so assuming you're a grown-up, you're still free to be the end of a bell, long as it's not illegal content. Sounds alright to me…
PC-LAD
Welcome to the kingdom of united censorship. How may we help you and your fascist beliefs? woops

So you believe platforms, usually hosting user-generated content, should be free to ignore the nature of that content?

Because they've been given ample chance to clean out their own stables and, despite loads of fine-sounding words, there is still loads of horse effluent left.

But presumably, showing terrorists how to build bombs or the best place to stick a knife in people …. no probs, can't censor that.

Showing vulnerable kids how to self-harm or commit suicide? Utterly necessary to a free democracy.

Cyber-bullying? We'll reprimand or even expel kids doing that to other kids in person, but as long as it's on the ‘net, fine, no problem.

The devil is in the detail, in issues like :-

- what platforms does it apply to?
- how are things like “harmful content” defined?
- exactly what 'teeth' do regulators get?

Finally, it's worth remembering that every country on the planet has “censorship”, which is better put as limits on free speech to protect both “the people” and individual people. Even the US, which puts so much weight behind constitutionally protected free speech, still has limits on it.

For example, in this country, go into a large, crowded theatre and shout “Fire, fire”. In the resultant stampede to the door someone trips, gets trampled and dies. How protected is your free speech? Sufficiently so that you face the possibility of manslaughter charges and, if convicted, a maximum sentence of life.

That is not a hypothetical. It's a real case.

There is really harmful content out there. Social media companies especially have been given plenty of chances to sort it, and years to do it. I'll bet that facing massive fines, or even jail time for executives breaching their duty of care, focuses minds wonderfully.

And you compare that to fascism? :rolleyes:
Saracen999
:rolleyes:
What I'm saying is the law should be there to compel those who display the content to provide quality filters, so that people aren't subjected to what they don't want to see. Kids, yes, need protecting. Terrorist propaganda needs cutting off before being shown to those who absorb it. I was making a stupid statement as to what this news is: it's stupid, because nothing is really going to be enforced, as there is no incentive to do so.
PC-LAD
What I'm saying is the law should be there to compel those who display the content to provide quality filters, so that people aren't subjected to what they don't want to see. Kids, yes, need protecting. Terrorist propaganda needs cutting off before being shown to those who absorb it. I was making a stupid statement as to what this news is: it's stupid, because nothing is really going to be enforced, as there is no incentive to do so.
Then we agree.

Almost.

On changes and enforcement, if, and I stress if, a big fine (as in x% of global revenue) is enforced, and especially if potential prison sentences are added, platform bosses have a big incentive to pull their finger out.

And politicians have an incentive, to wit, stopping the media and the public from holding their feet to the fire. And they can certainly incentivise Ofcom.

Will it work? Dunno.

But the thing is, we agree there's a serious problem, yes?

If we don't try this, what do we do, short of wringing our hands and moaning? If it doesn't work, we either adapt it, strengthen it, or find something that does.

But the “somethings” seem to be a bit scarce.

All this really is, is trying to drag legislation into the digital age, to deal with things dealt with long, long ago for non-digital media (except, a bit, in the US, where free speech rights are more protective than here, even if that's not what the framers of the constitutional amendment intended).