The Online Safety Act (OSA) is the UK's comprehensive legislation intended to ensure the safety of online users. It places specific obligations upon various online service providers to this end. The OSA received Royal Assent on 26 October 2023. Certain sections of the OSA came into force on 10 November 2023, with further provisions following on 10 January 2024. Additional provisions are expected to come into force throughout 2024, with a number of other provisions still not given commencement dates.
OFCOM is tasked with enforcing the OSA. The OSA seeks to make the UK the safest place to be online, and one of the ways it pursues this aim is by obligating services to conduct substantial and ongoing risk assessments.
Who?
The OSA imposes duties in this area on providers of Category 1 and Category 2A services. The impact on an online service will differ according to how that particular service is categorised. The following categories are still due to be defined in secondary legislation:
- Category 1.
- These are user-to-user services that will meet certain threshold conditions (due to be defined in secondary legislation) relating to user numbers, service functionality and other factors.
- Category 2A.
- These are search services that will meet certain threshold conditions due to be defined in secondary legislation.
- Category 2B.
- These are user-to-user services that will meet certain threshold conditions due to be defined in secondary legislation.
More onerous obligations will apply to Category 1 services, owing to their greater potential as sources of harm.
What areas do they apply to?
The risk assessments will apply to three main categories of content:
- Illegal content:
- Service providers are required to conduct thorough risk assessments to identify and combat the dissemination of illegal content. This includes offences related to terrorism, child exploitation, and other criminal activities as defined in the legislation.
- Harmful content available to children:
- Platforms accessible to children must implement stringent measures, such as age verification, to prevent their exposure to harmful content. These measures aim to mitigate the risk of children encountering content that may cause physical or psychological harm.
- Harmful content available to adults:
- Category 1 services, as described above, will require an adult user empowerment risk assessment.
- The Act empowers adults by providing them with mechanisms to control the content they consume. Through user-controlled features, adults can filter out harmful yet legal content, ensuring a safer online experience tailored to their preferences and needs.
Four-step risk assessment process
As per draft OFCOM guidelines, services regulated under the OSA must conduct an assessment of online safety risks. The assessment should delve into how harm may manifest, considering user demographics, platform features, and other relevant factors, while also devising appropriate safety measures, particularly for safeguarding minors.
OFCOM presents a structured approach consisting of four steps applicable across various services:
- Context Establishment: Identify and understand the risks of harm, referencing OFCOM's risk profiles and addressing any knowledge gaps.
- Risk Assessment: Evaluate the probability and consequences of potential harm, factoring in user demographics, platform features, and other pertinent variables.
- Mitigation Identification: Identify and assess potential measures to mitigate identified risks effectively.
- Review and Update: Regularly review and update the risk assessment, especially in response to significant changes in the service.
It's important to note that this guidance is based on OFCOM's draft proposal for conducting illegal content risk assessments, currently subject to consultation. The final version of this guidance is anticipated to be released in late 2024.
Next steps
To prepare for their obligations under the OSA regarding risk assessments, services can work to:
- Determine what category they are likely to be placed into under the OSA.
- Stay updated on forthcoming secondary legislation.
- Understand and implement OFCOM's risk assessment process.
- Evaluate and address risks related to illegal and harmful content.
- Engage with OFCOM's regulatory process.
- Regularly review and update risk assessment practices.
Wider Picture
As it moves towards enforcing the OSA, OFCOM is set to follow its roadmap in carrying out a series of consultations and regulatory milestones. On 25 March 2024, OFCOM issued a call for evidence to solicit insights on its approach to its regulatory duties. This is due to be followed by a consultation on draft transparency guidance scheduled for Summer 2024. As part of its obligations, OFCOM is tasked with producing a register of categorised services. On 25 March 2024, it published advice to the Government on the thresholds which would determine whether a service falls into Category 1, 2A or 2B. The Secretary of State will use this advice to set the threshold conditions in secondary legislation, which is anticipated to be enacted by the summer of 2024.
If this intended timetable holds, OFCOM expects to be able to publish the register of categorised services by the end of 2024, present draft proposals on additional duties for these services in early 2025, and issue transparency notices by mid-2025. These actions highlight OFCOM's commitment to fostering transparency, accountability, and user safety online through collaborative engagement and regulatory measures.
For more information on the Act and the steps the Act will pass through over the upcoming months, please see our OSA Overview article.
If you would like any further information or have queries on the content of this article, please contact Richard Hugo, David Varney or another member of our Technology team.
This article was written by Nathan Gevao and Victoria McCarron.