Online Safety Legislation

While social media platforms, e-commerce sites, and interactive websites have brought enormous benefits to society, education, and economies, the rise in online harms has forced regulators in the US, EU, and UK to urgently review existing laws governing online behavior.

Social media platforms have faced allegations of enabling political interference, creating public health risks through disinformation, harming children through inappropriate and graphic online content, and spreading extremist material that endangers public safety.

Governments in the EU, the UK, and the US are formulating new regulations that will determine who bears responsibility for governing and rapidly removing user-generated content that is illegal, or deemed offensive or harmful to the wider public.

Examples of regulations currently under review include:

  • Section 230 of the US Communications Decency Act 1996 currently protects website operators from liability for third-party content posted to interactive websites. Owing to the power and influence of large social media platforms, their potential effect on elections, and the fatal insurrection in Washington, DC on 6th January 2021, which was orchestrated via social media, lawmakers are seeking to re-examine Section 230.
  • The UK Online Safety Bill proposes to give Ofcom the power to sanction online platform operators and interactive website owners that fail to remove illegal content, as well as content that is legal but harmful. All platform operators will have a duty of care to protect children who use their services. Organizations failing to protect people face fines of up to 10% of annual global turnover, or the blocking of their sites. The UK government will also reserve the power to hold senior managers personally liable.
  • The EU Digital Services Act (DSA) proposes that any platform reaching more than 10% of the European population (45 million users) is deemed ‘systemic’ and therefore has a duty to oversee the content posted by users. Under the DSA, Digital Services Coordinators will gain the power to directly sanction platform operators who fail to oversee content posted by rogue traders, traffickers, pornographers, and extremists, with penalties of up to 6% of global turnover in the preceding year. It is anticipated that the DSA will come into force in 2023.

“The risk of young people being harmed by toxic content they encounter online is too great for a single platform operator to tackle on its own, or to build from scratch. By collaborating with Image Analyzer, we can block offensive live-streamed video at unprecedented levels and make online communities safer.”
CEO of North American online gaming content moderation company

Image Analyzer provides artificial intelligence-based content moderation technology for image, video, and streaming media, including live-streamed footage uploaded by users. We help organizations minimize the legal risk exposure created when people abuse their digital platform access to share harmful visual material. Our technology is designed to identify visual risks in milliseconds, including illegal content and images and videos deemed harmful to users, especially children and vulnerable adults.
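
For illustration only, the sketch below shows how a platform operator might gate user uploads behind a visual-risk check of this kind before publication. The endpoint URL, response fields, and risk threshold are hypothetical placeholders, not Image Analyzer's actual API.

```python
# Hypothetical sketch: gating user uploads behind a visual-risk check.
# The endpoint, response field, and threshold are illustrative assumptions.
import requests

MODERATION_ENDPOINT = "https://api.example-moderation.com/v1/analyze"  # placeholder URL
RISK_THRESHOLD = 0.8  # assumed score above which content is blocked


def is_image_safe(image_bytes: bytes, api_key: str) -> bool:
    """Submit an uploaded image for analysis; allow it only if the risk score is low."""
    response = requests.post(
        MODERATION_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": image_bytes},
        timeout=5,  # a moderation check must not stall the upload pipeline
    )
    response.raise_for_status()
    # Fail closed: if no score is returned, treat the image as maximally risky.
    risk_score = response.json().get("risk_score", 1.0)
    return risk_score < RISK_THRESHOLD
```

In a design like this, the check runs synchronously at upload time so that harmful material is blocked before it ever reaches other users, and the fail-closed default ensures that an outage in the moderation service does not silently let unreviewed content through.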