Introduction
Operators of online services play a crucial role in maintaining a safe digital environment for all users. Since the Online Safety Act received Royal Assent on 26 October 2023, such services have had key legal obligations to protect their users, and non-compliance with these obligations can have severe consequences.
OFCOM has been given a new regulatory role under the Online Safety Act (OSA). It has been publishing a number of consultations and draft codes of practice, which will shape its work in establishing the new online safety regime in the coming months.
This article will specifically address OFCOM’s consultation and draft codes of practice regarding a key aspect of the OSA: the duty to protect users from illegal harms online. The consultation and accompanying draft codes of practice on illegal harms were released on 9 November 2023.
Additionally, this article will set out how providers of user-to-user services, such as forum operators and app developers, can achieve compliance with the regulations and take proactive steps to protect their users.
Understanding Illegal Harms
Illegal harms encompass a wide range of activities that violate laws and regulations. Within its guidance, OFCOM has grouped these into fifteen categories of priority offences, as follows:
- Terrorism Offences
- Child Sexual Exploitation and Abuse (CSEA) Offences
- Encouraging or Assisting Suicide (or Attempted Suicide) or Serious Self-Harm Offences
- Harassment, Stalking, Threats, and Abuse Offences
- Hate Offences
- Controlling or Coercive Behaviour (CCB) Offence
- Drugs and Psychoactive Substances Offences
- Firearms and Other Weapons Offences
- Unlawful Immigration and Human Trafficking Offences
- Sexual Exploitation of Adults Offence
- Extreme Pornography Offence
- Intimate Image Abuse Offences
- Proceeds of Crime Offences
- Fraud and Financial Services Offences
- Foreign Interference Offence (FIO)
These illegal harms therefore involve a range of offences, which can be broadly categorised as content-related offences (involving harmful content disseminated online), behavioural offences (relating to harmful actions undertaken on online platforms), and financial and security offences (which can impact financial well-being or national security).
Service providers must take down content where they have ‘reasonable grounds to infer’ that the content is illegal.
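For platform developers, one practical implication of this duty is the need for an auditable takedown workflow. The sketch below (in Python, with hypothetical names throughout; neither the OSA nor OFCOM prescribes any particular implementation) shows one way a moderation system might record the basis for a takedown decision against the ‘reasonable grounds to infer’ threshold, which remains a human legal judgment rather than something the code itself determines.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical audit record for a moderation decision. The "reasonable
# grounds to infer" test is a legal judgment made by a reviewer; this
# sketch only records that conclusion and the evidence relied upon.
@dataclass
class TakedownDecision:
    content_id: str
    suspected_offence: str    # e.g. one of the fifteen priority offence categories
    evidence_notes: str       # what the reviewer relied on to reach the inference
    reasonable_grounds: bool  # the reviewer's conclusion on the legal test
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def apply_decision(decision: TakedownDecision) -> str:
    """Take content down where reasonable grounds to infer illegality were found."""
    if decision.reasonable_grounds:
        # remove_content(decision.content_id)  # platform-specific takedown call
        return "removed"
    return "retained"
```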
Illegal Content Risk Assessments
All service providers in scope of the OSA must regularly undertake an illegal content risk assessment.
This will address the risk of any illegal harms appearing on their platforms, alongside measures that can be taken to combat these harms and maintain user safety.
OFCOM has provided a four-step methodology to ensure that service providers are following best practice when carrying out these risk assessments, as follows:
- Understand the Harms:
- Identify the 15 priority illegal harms outlined by OFCOM.
- Consider risk factors specific to your service, such as features like image sharing or livestreaming; OFCOM has explained how each risk factor could increase the risk of harms covered by the Act in a list of risk profiles (currently in draft form).
- Assess the Risk of Harm:
- Evaluate the likelihood of users encountering illegal content or your service being used for criminal activities.
- Understand how harm could impact users, alongside how your service’s user base and features may increase risk of harm.
- Decide Measures, Implement, and Record:
- Based on the assessment, implement safety measures to protect users, especially children.
- Keep your risk assessment accurate and up-to-date.
- Report, Review, and Update Risk Assessments:
- Regularly review and update your risk assessments. OFCOM recommends that risk assessments are reviewed annually and requires that they are reviewed whenever OFCOM makes a significant change to its risk profiles (as referenced in step 1 above). OFCOM also requires service providers to carry out a new risk assessment before making any significant change to their service.
- Maintain records of your assessments and actions taken.
Under the provisions of the OSA, the illegal content risk assessment should take into account the user base and demographics of the service, and the risk of users encountering illegal content via the service, considering its algorithms and how content is disseminated. The risk assessment should also evaluate the risk of the service being misused to carry out priority offences, alongside the potential harm to individuals arising from that misuse. Additionally, it should consider any functionalities of the service that may facilitate the dissemination of illegal content, and examine how different patterns of use affect the risk of harm to individuals. Finally, the risk assessment should evaluate the nature and severity of the harm that individuals could suffer in light of these matters, and explore ways to mitigate risks through service design, governance, and user education.
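By way of illustration only, the structure below (a minimal Python sketch; the field names are hypothetical and not an OFCOM-prescribed schema) shows how a service provider might record the matters the risk assessment must cover, with one entry per priority offence:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record mirroring the matters an illegal content risk
# assessment should consider under the OSA; one entry per priority offence.
@dataclass
class OffenceRiskEntry:
    offence_category: str                # e.g. "Fraud and Financial Services Offences"
    relevant_functionalities: list[str]  # e.g. ["image sharing", "livestreaming"]
    likelihood: str                      # e.g. "low" / "medium" / "high"
    severity_of_harm: str                # nature and severity of harm to individuals
    mitigations: list[str]               # design, governance and user-education measures

@dataclass
class IllegalContentRiskAssessment:
    service_name: str
    user_base_and_demographics: str      # who uses the service, including children
    dissemination_factors: str           # algorithms and how content spreads
    entries: list[OffenceRiskEntry] = field(default_factory=list)
    last_reviewed: date = field(default_factory=date.today)

    def needs_review(self, significant_change_planned: bool) -> bool:
        """Flag a review: OFCOM recommends annual reviews and requires a new
        assessment before any significant change to the service."""
        overdue = (date.today() - self.last_reviewed).days >= 365
        return overdue or significant_change_planned
```

Keeping the assessment in a structured, versioned form of this kind also makes it easier to satisfy the record-keeping expectations in step 4 of OFCOM’s methodology.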
User-to-User Services: Scope and Obligations
User-to-user services include a broad range of platforms and services. Examples of such services likely to fall within the scope of the OSA are as follows:
- Social Media Services: Platforms like Facebook, Twitter, and Instagram, where users create and share posts, photos, and videos.
- Video-Sharing Services: Platforms where users upload and view videos, such as YouTube.
- Messaging Services: Apps like WhatsApp and Snapchat, enabling direct communication between users.
- Marketplaces and Listing Services: Platforms where users buy, sell, or exchange goods and services, such as Facebook Marketplace or Gumtree.
- Dating Services: Websites or apps facilitating connections between individuals seeking to date, such as Tinder or Hinge.
- Gaming Services: Online gaming platforms where players interact, compete, and collaborate.
- Content-Sharing Services: Audio services such as podcast hosting platforms or music-sharing apps like Spotify, and file-sharing services such as Google Drive or Dropbox.
- Discussion Forums, Information-Sharing Services, and Chat Rooms: Online spaces for discussion and conversation, or for contributing knowledge and answers, such as Wikipedia or Reddit.
- Fundraising Services: Crowdfunding platforms and donation websites.
Alongside safeguarding users from these illegal harms through the risk assessments described above, OFCOM has proposed illegal content codes of practice, providing further guidance on how user-to-user services can implement protective measures to prevent illegal content appearing on their platforms. These include the following:
- Governance and accountability:
- Ensure there are appropriate internal governance measures in place to mitigate risks and protect users from illegal harm.
- Content moderation:
- Actively track content appearing on the platform and have a content moderation function in place to allow for the rapid removal of illegal content.
- Reporting and complaints:
- Deploy an effective and transparent complaints reporting system.
- Default settings and support for users who are children:
- Provide safety defaults and general additional support for users who are children.
- Recommender systems:
- Establish safety metrics for on-platform testing of recommender systems.
- Enhanced user controls:
- For certain services, ensure that actions such as muting other users, blocking, and disabling comments are possible (a minimal sketch follows this list).
- User access:
- Ensure accounts of proscribed organisations (e.g. terrorist organisations) are disabled.
- Terms of service:
- Clearly outline in the terms of service how these protections will be implemented.
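To illustrate the enhanced user controls measure referenced in the list above, the following is a minimal sketch (in Python, with hypothetical names; the draft codes do not mandate any particular implementation) of per-user mute, block, and comment-disabling settings:

```python
from dataclasses import dataclass, field

# Hypothetical per-user safety controls of the kind described in the
# draft codes of practice: muting, blocking, and disabling comments.
@dataclass
class UserSafetyControls:
    muted_users: set[str] = field(default_factory=set)
    blocked_users: set[str] = field(default_factory=set)
    comments_disabled: bool = False

    def mute(self, user_id: str) -> None:
        """Hide another user's content without notifying them."""
        self.muted_users.add(user_id)

    def block(self, user_id: str) -> None:
        """Prevent all contact from another user (a block implies a mute)."""
        self.blocked_users.add(user_id)
        self.muted_users.add(user_id)

    def can_see(self, author_id: str) -> bool:
        """Used when building feeds to filter out muted or blocked authors."""
        return author_id not in self.muted_users and author_id not in self.blocked_users
```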
The applicability of the above guidance varies depending on the size of the user-to-user service, and the risk that their platform poses to users.
Consequences of non-compliance
OFCOM, as the designated regulator under the OSA, is able to take enforcement action against service providers. It can identify risks of serious harm and require compliance with the illegal content safety duties, even where a service provider has failed to identify those risks in its own risk assessment.
In particular, under the OSA, OFCOM is able to fine non-compliant services up to 10% of global annual turnover or £18 million, whichever is greater. Additionally, OFCOM can hold companies and senior managers (where they are found to be at fault) criminally liable.
Conclusions
The consultation on illegal harms closed on 23 February 2024. Accordingly, OFCOM is in the process of considering and revising its proposals in line with the views submitted by service providers.
It is important to note that OFCOM has not yet finalised its guidance or codes of practice on Illegal Content Risk Assessments. Accordingly, service providers should carefully review this guidance when it is released, both to ensure that their illegal content risk assessments are compliant and to help them assess whether user-generated content appearing on their platforms is illegal. A final version of the guidance and codes of practice is expected to be published by the end of 2024; once these come into force, service providers must complete their illegal content risk assessments within three months.
If you have any questions or would otherwise like to discuss any issues raised in this article, please contact Richard Hugo.
This article was written by Victoria McCarron.