People across the United Kingdom have faced a censored and partially inaccessible online landscape since the country's latest digital safety rules took effect on Friday.
The Online Safety Act mandates that web service operators use “highly effective” age verification measures to stop kids from accessing a wide range of material, under penalty of heavy fines and criminal action against senior managers. It’s primarily focused on pornography and content that promotes suicide, self-harm, or eating disorders, but the scope of “priority content” also extends to bullying, abusive or hateful content, and dangerous stunts or challenges.
Effectively, web platforms must either set up an age verification system that poses potential privacy risks, default to blocking huge swaths of potentially questionable content, or pull out of the UK entirely. Residents are finding themselves locked out of everything from period-related subreddits to hobbyist forums — it’s little wonder that they’re turning to VPNs.
Over the past few days, several large social media platforms have started requiring age verification in the UK to access certain features and types of content, in partnership with third-party software providers. Users typically have a choice between uploading bank card information, submitting an image of government-issued ID, or completing a facial scan that estimates their age.