The UK's media regulator, Ofcom, is going to get expanded powers to force social media firms to take responsibility for users' safety online. The move comes following a consultation by the UK government in 2019 looking into the dangers of extreme online content.
Under the new rules, Ofcom will be able to enforce the principle that tech firms are responsible for protecting users from harmful content - whether violence, extremism, bullying or abuse - for removing that content quickly, and for taking steps to make it less likely to appear in the first place.
The move comes in advance of a more legally binding "duty of care" obligation that the government is said to be planning. However, there are no details as yet of how exactly Ofcom will enforce the rules - whether breaches will mean fines, for example, and of what severity.
The rules will target firms that host user-generated content, meaning that the likes of Facebook, Twitter, YouTube and more should all come under its umbrella.
While the government will set the policy in place, Ofcom as a regulator will have the power to adapt the details of its standards and demands as situations develop. This means that, should the social media landscape shift suddenly, as it is wont to do, the rules shouldn't require lengthy new legislative procedures to catch up.
The end of self-regulation?
The regulation comes at an interesting time for social media - a growing number of countries are starting to impose regulations on the industry, with fines levied should responsibilities not be met. However, the likes of Facebook and Twitter still largely advocate self-regulation, with the belief that they can more quickly adapt their systems to suit users' needs.
The UK clearly disagrees, and it will be interesting to see if this sort of regulation spreads over time. Of course, time will also tell whether Ofcom is able to get a handle on the problem in the first place, and whether these new rules will be enforced effectively.
Another issue might be whether the new rules are wide enough. Caroline Normand, director of advocacy for Which?, commented that "there are a number of areas not covered by these proposals where consumers are increasingly suffering serious harm."
In Which?'s view, online scams, bogus products sold as genuine and fake review farms should also be tackled by regulation. Whether there are any plans in those areas remains to be seen.