Social media giants given 'final deadline' to stop kids accessing harmful content
Tech firms have been given a final deadline to introduce “robust” age checks to stop children accessing harmful content on their platforms. Media regulator Ofcom has ordered online services to take urgent action to stop kids seeing content relating to pornography, suicide, self-harm and eating disorders. All social media platforms must introduce age assurances by July 2025 or risk punishment under the Online Safety Act. This applies to sites like YouTube, Facebook, Instagram, TikTok or Twitter/X.
Under the measure, online sites will not have to introduce age checks for the whole of their platform but will have to ensure they are in place in relation to harmful content. If they fail to do so, Ofcom has the power to fine them up to £18 million or up to 10% of their global revenue, or to impose other business disruption measures, such as requiring payment providers or advertising services to withdraw from an online site.