Online Safety Act illegal content risk assessment

Introduction

Solarcene.community is a very small experimental Mastodon instance based on solarpunk concepts. It is powered by a small solar array with a small battery, and therefore has carefully tuned power-saving mechanisms, including shutting down the whole server when the battery is low to conserve power. It also has experimental features that convert images to greyscale and dither them, both for aesthetic reasons and to reduce bandwidth requirements. The philosophy of the server is to develop a different approach to social media, one that reduces the dependence on being online all the time and encourages living within our means.
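The greyscale-and-dither processing described above is the kind of operation commonly implemented with Floyd–Steinberg error diffusion. The sketch below illustrates that technique on a greyscale image represented as rows of 0–255 values; it is an illustrative assumption about the approach, not the instance's actual code.

```python
# Illustrative sketch of greyscale dithering (Floyd-Steinberg error
# diffusion), the kind of technique described above for reducing image
# bandwidth. Not the instance's actual implementation.

def floyd_steinberg_dither(pixels):
    """Dither a greyscale image (list of rows of 0-255 ints) to pure
    black/white (0 or 255), diffusing the quantisation error of each
    pixel onto its unvisited neighbours."""
    h = len(pixels)
    w = len(pixels[0])
    # Work on floats so diffused error is not truncated.
    img = [[float(v) for v in row] for row in pixels]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 255.0 if old >= 128 else 0.0
            img[y][x] = new
            err = old - new
            # Standard Floyd-Steinberg error-diffusion weights.
            if x + 1 < w:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1][x + 1] += err * 1 / 16
    return [[int(v) for v in row] for row in img]
```

A flat mid-grey input comes out as a checkerboard of black and white pixels, which is what makes dithered one-bit images compress so well.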

Key information

Information Answer
The service to which the risk assessment relates A small multi-user (14 users) Mastodon instance (solarcene.community) hosted in the UK
The named person responsible for the risk assessment James Coxon
The findings of this risk assessment have been reported and recorded through appropriate governance channels Yes: logged in this document
Next scheduled review date 2026-03-16

Document history

Date Version Nature of change (created / reviewed / updated) Who completed the risk assessment Who approved the risk assessment
2025-03-16 0.1 Created JC JC

Ofcom’s suggested steps

Ofcom's risk factors

This fediverse instance is a social media platform.

Only the users of this mastodon instance can post to it.

It is necessary to register, and each registration is manually screened before the user is accepted onto this instance. This instance is for people aged 18 and over, so children are not allowed to register; there are currently no children registered as users on this instance.

The posts and boosts of the users will be available to others who view this instance (e.g. via the web interface).

If someone replies to one of these posts, or if a user replies to someone else's post (on another instance), those replies will be available to others who view this instance.

There is no access control for viewing, meaning that content on this instance is accessible to anyone in the world, without registration.

According to Ofcom's risk factors guidance, it is possible that this instance meets some or all of the following.

Type of content Risk factors (from Ofcom’s risk profiles) Conclusion Rationale Measures
Terrorism 1a, 4a, 5b, 5e, 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is incredibly low. Where such content may be encountered from third-party web sites that can be viewed from within the service, we will routinely block access to such third-party sources when notified by user reporting. In addition, the service follows alerts issued by IFTAS from the SW-ISAC account (https://mastodon.iftas.org/@sw_isac), which recommends specific sources of Advanced Persistent Threat and Terrorist and Violent Extremist Content for blocking. None
Child Sexual Exploitation and Abuse (CSEA): Grooming 1a, 2, 3a, 4a, 5b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. The service prohibits user accounts created by persons under the age of 18, and we are unaware of any accounts that are operated by or for a person under the age of 18. In addition, the service follows alerts issued by IFTAS from the SW-ISAC account (https://mastodon.iftas.org/@sw_isac), which recommends specific sources of CSEA and CSAM for blocking. None
Child Sexual Exploitation and Abuse (CSEA): Images 1a, 2, 5b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is incredibly low. The content is prohibited by our terms of service. This service has an extensive block list of fediverse servers with content that is not appropriate for this service. In addition, the service follows alerts issued by IFTAS from the SW-ISAC account (https://mastodon.iftas.org/@sw_isac), which recommends specific sources of CSEA and CSAM for blocking. None
Child Sexual Exploitation and Abuse (CSEA): URLs 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is low. The content is prohibited by our terms of service. As we have no access to a database or directory of CSEA URLs, we respond to user-generated reports and/or moderator review to remove this content reactively. If such a database were available, we would implement a filter to disallow these URLs from appearing on the service. None
Encouraging or assisting suicide 1a, 5e, 5g, 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. The content is prohibited by our terms of service. Where such content may be encountered from third-party web sites that can be viewed from within the service, we would routinely block access to such third-party sources when notified by user reporting. None
Hate 1a, 3a, 5e Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. The content is prohibited by our terms of service. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. Further, the service makes use of external databases and lists of services that are known to allow this content, and we will routinely defederate (disconnect and block access to and from) such services. At the time of this risk assessment, the service blocks 402 domains. None
Harassment, stalking, threats and abuse 1a, 3a, 4a, 5b, 5e, 5g Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. The content is prohibited by our terms of service. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. Further, the service makes use of external databases and lists of services that are known to allow this content, and we will routinely defederate (disconnect and block access to and from) such services. At the time of this risk assessment, the service blocks 402 domains. None
Controlling or coercive behaviour 1a, 4a, 5b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. The content is prohibited by our terms of service. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. Further, the service makes use of external databases and lists of services that are known to allow this content, and we will routinely defederate (disconnect and block access to and from) such services. At the time of this risk assessment, the service blocks 402 domains. None
Drugs and psychoactive substances 1a, 3a, 4a, 5b, 5e, 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Firearms, knives or other weapons 1a Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Human trafficking 1a, 3a, 5e Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Unlawful immigration 1a, 3a, 5e Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Sexual exploitation of adults 1a, 3a Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Extreme pornography 1a, 5e Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is low. The content is prohibited by our terms of service, and our proactive and reactive content moderation processes severely limit the accidental availability of such content. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Intimate image abuse 1a, 5b, 5g Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is low. The content is prohibited by our terms of service, and our proactive and reactive content moderation processes severely limit the accidental availability of such content. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Proceeds of crime 1a, 3a Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Fraud and financial offences 1a, 3a, 4a, 5b, 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Foreign interference 1a, 3a, 4a, 5e, 5g, 7b Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
Animal cruelty 1a, 5e Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None
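Several of the mitigations above rely on a blocklist of defederated domains. In Mastodon, a domain block also covers subdomains of the blocked domain; the sketch below illustrates that matching idea only. The domain names and function are hypothetical, and Mastodon's actual domain blocks are implemented in its server code.

```python
# Illustrative sketch of a defederation blocklist check. The entries
# and the function name are hypothetical; Mastodon's real domain
# blocks are server-side. A block on "example.org" also covers its
# subdomains, which is the behaviour shown here.

BLOCKED_DOMAINS = {"spam.example", "abuse.example.org"}  # hypothetical entries

def is_blocked(domain, blocklist=BLOCKED_DOMAINS):
    """Return True if `domain` or any parent domain is on the blocklist."""
    domain = domain.lower().rstrip(".")
    parts = domain.split(".")
    # Check the domain itself, then each parent suffix (at least 2 labels).
    for i in range(len(parts) - 1):
        if ".".join(parts[i:]) in blocklist:
            return True
    return False
```

Note that blocking "abuse.example.org" does not block its parent "example.org", matching the intent of targeted defederation.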

Other illegal content

Illegal content/offence Risk of encountering this content/offence Rationale Measures that you will implement using Ofcom’s Codes of Practice or alternative measures
All other content / offences Negligible The likelihood of a visitor encountering this content, or the service being used to facilitate or commit this offence, is negligible. Where such content may be encountered from third-party web sites that can be viewed from within the service, we routinely block access to such third-party sources when notified by user reporting. None

Safety duties about illegal content

The following sets out this service's approach to compliance with the various safety duties in s10 OSA.

See also Ofcom's recommended measures (table on page 8).

Duties and approach

Duty Our approach
To prevent individuals from encountering priority illegal content by means of the service The moderators will continue to defederate with servers likely to contain problematic content. We will not allow users to toot or boost priority illegal content, nor reply to other people's toots containing priority illegal content. If priority illegal content is posted on this instance by a user or a third party (e.g. by way of a reply), we will block the user and, where appropriate, report the toot to the user's server administrator.
To effectively mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence New user accounts are reviewed, including but not limited to the IP address (geolocation and IP reputation score where appropriate), email domain, and “reason for joining”.
To effectively mitigate and manage the risks of harm to individuals No additional actions planned
To minimise the length of time for which any priority illegal content is present If the moderators become aware of priority illegal content, we will promptly block the user and, where appropriate, report the toot to the user's server administrator. Any third party can report toots available via this instance to the moderators. Notification will be by email, enabling the moderators to view the report.
Where the provider is alerted by a person to the presence of any illegal content, or becomes aware of it in any other way, swiftly take down such content As above.
To have in place regulatory compliance and risk management arrangements This document
design of functionalities, algorithms and other features We will continue to review the features available in Mastodon to reduce the risk of individuals encountering priority illegal content, and the risk of the service being used for the commission or facilitation of a priority offence.
policies on terms of use About and Server Rules, Code of Conduct
policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content About and Server Rules, Code of Conduct
content moderation, including taking down content As above
functionalities allowing users to control the content they encounter The service provides content in several contexts: a personalised chronological (“home”) timeline that only shows content the user has elected to “follow”, either by account or by topic; a public timeline of locally-hosted user-generated content; a public timeline of third-party user-generated content that is not otherwise prohibited or removed by our content reviewers; notifications; and conversations. Account users can control the content they encounter by: filtering posts (drop or hide) by keyword or phrase, with filters applicable to each of the above contexts individually; hiding “boosts” (allowing content from a followed account, but not that account’s reposts or boosts); muting accounts; blocking accounts; blocking entire domains (servers); and reporting harmful content to our staff. Full technical specifications for these options are available at https://docs.joinmastodon.org/user/moderating/
user support measures Users can message the admin @solaradmin
staff policies and practices Currently JC is the single moderator
to include provisions in the terms of service specifying how individuals are to be protected from illegal content About and Server Rules, Code of Conduct
A duty to apply the provisions of the terms of service consistently As above
A duty to include provisions in the terms of service giving information about any proactive technology used by a service for the purpose of compliance with a duty set out in s10(2) or 10(3) (including the kind of technology, when it is used, and how it works). As above
A duty to ensure that the provisions of the terms of service are clear and accessible As above
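The user-controlled content measures listed in the table above (keyword filters, mutes, blocks, domain blocks, hiding boosts) amount to a per-user visibility decision. The sketch below illustrates that decision logic under assumed data shapes; all names (`should_show`, the `post` and `prefs` dicts) are hypothetical, and the real behaviour is implemented by Mastodon itself as per the linked documentation.

```python
# Illustrative sketch of the user-side content controls described
# above: keyword filters, account mutes/blocks, domain blocks and
# hiding boosts. All names and data shapes here are hypothetical.

def should_show(post, prefs):
    """Decide whether `post` (a dict) appears in a user's timeline,
    given that user's `prefs` (a dict of control settings)."""
    author = post["author"]                      # e.g. "alice@example.social"
    domain = author.split("@")[-1]
    if author in prefs.get("blocked_accounts", set()):
        return False
    if author in prefs.get("muted_accounts", set()):
        return False
    if domain in prefs.get("blocked_domains", set()):
        return False
    # A user can follow an account but hide that account's boosts.
    if post.get("is_boost") and author in prefs.get("hide_boosts_from", set()):
        return False
    text = post.get("text", "").lower()
    # Keyword filters drop matching posts entirely in this sketch
    # (Mastodon can alternatively hide them behind a warning).
    if any(kw.lower() in text for kw in prefs.get("filtered_keywords", [])):
        return False
    return True
```

Each check corresponds to one of the listed controls, and the checks are independent, so a user can combine them freely.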

In accordance with s66, if the moderators were to become aware of unreported CSEA, it would be reported to the NCA.

Ofcom reference Measure Comment
ICU A2 Accountable individual James Coxon
ICU C1 Content moderation function Moderators will review and assess suspected illegal content
ICU C2 Take down If service users post illegal content, it will be taken down. We cannot take down other people's content posted on their own instances, but we can report it to those instances' administrators, where appropriate.
ICU D1 Enabling complaints Complaints can be received in-app by messaging the instance admin/moderator
ICU D2 Easy to find complaints information About and Server Rules
ICU D7 Appropriate action for complaints Moderators will handle any complaints
ICU D9 Appeals Moderators will handle appeals promptly
ICU D10 Appeals Admin/Moderators will handle appeals
ICU D11 Proactive technology We are not using proactive technology
ICU D12 Appropriate action for complaints As per ICU D7, Admin/Moderators will handle any complaints
ICU D13 Manifestly unfounded complaints Admin/Moderators will handle complaints
ICU G1 Terms of service About and Server Rules, Code of Conduct
ICU G3 Clarity of terms See ICU G1
ICU H1 Proscribed organisations The service will block any account and/or domain if we become aware that they are a member of a proscribed organisation.

Actions

Action Date Status
None n/a n/a

Based on template from onlinesafetyact.co.uk