Social media platforms in the UK now face steep penalties if they fail to curb illegal content under newly enforced digital safety regulations. The Online Safety Act, which takes effect this Monday, requires all major platforms—Facebook, Google, X, Reddit, OnlyFans and more than 100,000 other services—to block or remove illegal material such as fraud, terrorism and child sexual abuse content.
Crackdown begins:
Tech firms are legally obligated to tackle offences including encouraging suicide, selling drugs, extreme pornography, and sharing intimate images without consent. Breaching the rules can attract fines of up to £18 million or 10% of global turnover, whichever is greater—a potentially billion-pound blow for giants like Meta or Google. In the most serious cases, authorities can even have non-compliant services shut down.
Technology Secretary Peter Kyle called this shift “just the beginning,” stressing that tech safety can no longer be an afterthought.
Ofcom’s role:
Regulator Ofcom has issued codes of practice setting out required measures, such as hiding children’s profiles from strangers, making it easier for women to block and mute stalkers, flagging fraud, and deploying hash-matching technology to detect terrorist material and non-consensual intimate images.
Ofcom has previously warned that many platforms, particularly the largest, were not fully compliant, citing missing safety measures. Companies that fail to cooperate could face investigations.
Not negotiable:
While some US leaders, including Vice President JD Vance, have criticized the law as a threat to free speech, the UK government firmly maintains that the act targets only criminal activity, not lawful public discourse.
#DigitalSafety #OnlineRegulations #TechAccountability #ProtectUsers