Facebook, WhatsApp, and Snapchat Fall in Line with the UK’s New Children’s Safety Codes of Practice for Social Media Platforms
July 1, 2024

In October last year, the UK finally passed its Online Safety Act, which had been five years in the making. The new law will govern how social media companies are allowed to operate within the United Kingdom, and what the government deems illegal online, for the foreseeable future. It was created primarily to hold social media companies responsible for the safety of the children who browse their platforms, and what that entails is now being made clear through the policies the government has elected to adopt. In May, Ofcom, the country’s TV, radio, and internet regulator, published draft guidelines for social media companies such as Snapchat and Meta (which owns Facebook, Instagram, Threads, and WhatsApp) to follow going forward. A ‘less harmful’ algorithm and stricter age restrictions for minors are the main elements of the new requirements, which are expected to be enforced from 2025 onwards.

In the draft ‘Children’s Safety Codes of Practice’, Ofcom details 40 practical steps that social media service providers are expected to take to protect the safety and wellbeing of children. One element these steps cover is the introduction and regular upkeep of strict age restrictions, which are expected to shield children from three categories of harmful social media content. The first, ‘Primary Priority Content’ (PPC), covers pornography and content that promotes suicide, self-harm, and eating disorders. ‘Priority Content’ (PC) should also come with age restrictions; this category covers content that is abusive or incites hatred towards people who share certain characteristics, provides instructions for acts of serious violence, depicts real or realistic violence against people or animals, involves dangerous social media ‘challenges’, or encourages the use of harmful substances. The ‘Non-Designated Content’ (NDC) category will include any other harmful social media content identified through a service’s own risk assessment.


The easiest step for social media platforms to take here is, of course, simply banning this type of content outright, which would count as fulfilling their responsibilities. Platforms that continue to allow such content to be posted and shared, on the other hand, can expect greater scrutiny to ensure that rigorous age checks prevent children from viewing it. This might mean restricting certain sections of a website or app, or barring children from the platform entirely. Under the stipulations of the Online Safety Act, violations of these rules can result in fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater), or even the imprisonment of the relevant company executives. Commenting on the new regulations, Snapchat and the platforms in the Meta portfolio have all stated that they already have measures in place to protect users under the age of 18. Other social media platforms such as X (formerly Twitter), TikTok, and Reddit have yet to comment.

The other major aspect of social media that the Children’s Safety Codes of Practice address is one of the key defining features of any social media platform: its recommendation algorithm. A platform’s algorithm is essentially its own set of rules and data points that decide what type of content is recommended, and to whom, allowing the platform to tailor content to each user’s specific tastes and interests. However, content that endlessly feeds into personal interests isn’t always good for the user, especially when that user is a young child who is not yet equipped to separate reality from the content they see. The algorithm might, for example, feed into a person’s insecurities, eroding their self-esteem and even harming their mental health. According to Ofcom, algorithms that are left unchecked have the potential to push large volumes of unsolicited and dangerous content to children through their ‘explore’ or ‘for you’ pages.
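To make the concern concrete, here is a deliberately simplified sketch, in Python, of how an interest-based recommendation feed and the kind of age-gated filtering Ofcom describes might interact. The data structures, category labels, and scoring rule are invented purely for illustration and do not reflect any platform’s actual system.

from dataclasses import dataclass, field

# Content labels borrowed from Ofcom's draft categories; how a real platform
# tags posts internally is an assumption made purely for illustration.
HARMFUL_CATEGORIES = {"PPC", "PC", "NDC"}

@dataclass
class Post:
    post_id: str
    topics: set
    categories: set = field(default_factory=set)  # e.g. {"PPC"} if flagged

@dataclass
class User:
    user_id: str
    interests: set
    is_minor: bool  # in practice this would come from an age-assurance check

def interest_score(user, post):
    # Toy relevance score: the number of topics the user and post share.
    return len(user.interests & post.topics)

def recommend(user, posts, limit=10):
    # Rank posts by interest overlap; for identified minors, drop any post
    # flagged with a harmful-content category before ranking.
    candidates = posts
    if user.is_minor:
        candidates = [p for p in posts if not (p.categories & HARMFUL_CATEGORIES)]
    ranked = sorted(candidates, key=lambda p: interest_score(user, p), reverse=True)
    return ranked[:limit]

posts = [
    Post("a", {"football", "music"}),
    Post("b", {"dieting", "fitness"}, {"PPC"}),  # flagged as primary priority content
    Post("c", {"music", "gaming"}),
]
teen = User("u1", {"music", "fitness"}, is_minor=True)
print([p.post_id for p in recommend(teen, posts)])  # post "b" never reaches the feed

Real systems score and filter in far more sophisticated ways, but the principle Ofcom is asking for is the same: identify the child first, then change what the ranking system is allowed to show them.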


Under the new policies, companies that use algorithms to tailor content will be required to implement highly effective age assurance systems to identify which of their users are children. Ofcom suggests that platforms then use a separate algorithm, one that filters out potentially harmful content, to serve this younger audience. In addition to reducing the visibility and prominence of harmful content, children themselves should be given a way to provide feedback directly on their recommended feeds, giving them greater control over what they are comfortable seeing. In essence, these policies aim to create a safety net around children in the digital space, so that they can grow up free from negative influences on their young, developing minds. As of now, the Children’s Safety Codes of Practice remain in draft form, with Ofcom accepting responses to its consultation on the guidelines until the 17th of July. In a statement on Ofcom’s official website, its Chief Executive, Dame Melanie Dawes, assures the public that the new measures go above and beyond industry standards to protect the wellbeing of the country’s budding future:

“We want children to enjoy life online. But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control… Our measures – which go way beyond current industry standards – will deliver a step-change in online safety for children in the UK.”

(Theruni Liyanage)

© All content copyright The Hype Economy. Do not reproduce in any form without permission, even if you have a paid subscription.