The UK government has set out plans detailing how it will use the new law it has created to control online platforms and social media – with one telling exception.
The Draft Statement of Strategic Priorities for online safety places an emphasis on platform providers preventing online harms in the first place and collaborating with regulator Ofcom on how the new law, the Online Safety Act, will be implemented. But it provides little detail about how the government will use the more controversial aspects of the legislation.
The set of priorities lists the illegal activities of greatest concern on online platforms. It expects platform providers "to take proactive steps to reduce the risks their services are used to carry out the most harmful illegal activity."
The list includes terrorism; child sexual abuse and exploitation; illegal suicide and self-harm content; illegal activity that disproportionately affects women and girls; illegal disinformation; hate that incites violence towards specific individuals or groups; UK-linked content designed to encourage or facilitate organized immigration crime by criminal groups; illegal sales of weapons and drugs; illegal foreign interference, such as state-sponsored disinformation; fraud; and "other priority offences."