Social media, and the internet in general, have been an inseparable part of human connection since their earliest days, and at the height of the pandemic online interaction became a lifeline for keeping people connected. Unfortunately, this rise in virtual engagement has been accompanied by a rise in online crime and abuse, including perhaps the most heinous of all: the sexual abuse and exploitation of children. The Internet Watch Foundation, an England-based not-for-profit that works to curb the proliferation of sexually abusive and exploitative content, particularly involving children, flagged more than 63,000 web pages hosting such material in 2022, a sharp increase from the 5,000 websites flagged the year before. The growing availability of, and reliance on, social media and other online spaces to foster human connection means that more and more children are being placed in situations that exploit their vulnerability. Changes to social media policies can affect these figures for better or worse, and Meta’s proposed changes to its messaging platforms are causing a stir among government authorities for precisely that reason.
For its part, Meta expresses its commitment to the safety of its users by securing their privacy through end-to-end encryption on its messaging platforms. The feature is not new within the Meta umbrella of social media platforms – WhatsApp, one of the most prominent direct messaging apps in the world, has long made end-to-end encryption an integral part of its service. However, Meta’s proposed changes would extend this protection to both Messenger and Instagram Direct Messaging, which differ from WhatsApp in how social interaction takes place. For one, WhatsApp requires that you already have the phone number of the person you want to connect with, information that is not usually accessible to the general public. This imposes an initial barrier for anyone looking to exploit another party by connecting with them online. The situation is different on Instagram and Messenger (formerly Facebook Messenger), where the people you interact with are not always people you know personally or have at least met in passing offline. This can make these platforms dangerous to navigate, especially for children, who may be left at the mercy of bad actors.
According to the UK’s Home Secretary, Suella Braverman, any attempt to encrypt messaging on these apps therefore puts the platforms’ most vulnerable users – children – at risk, as it would remove the only meaningful way of monitoring what goes on in these spaces. These concerns were expressed to Meta in a joint letter co-signed by information technology experts, law enforcement, survivors, and leading child safety charities earlier in July. End-to-end encryption has long been a point of contention between authorities seeking to curb the internet’s potential to enable antisocial acts and companies looking to provide the experience their user base expects. Speaking to the media, a Meta spokesperson underlined the company’s priorities in serving the public.
“We don’t think people want us reading their private messages so (we) have spent the last five years developing robust safety measures to prevent, detect, and combat abuse while maintaining online security”. The company has also promised to continue working closely with the relevant authorities (even ‘far more than their peers’), through measures such as reporting, to ensure that people are kept safe.
Additionally, in its report titled ‘Meta’s Approach to Safer Private Messaging on Messenger and Instagram Direct Messaging’, the organisation explains that “when E2EE (end-to-end encryption) is the default, we will also use a variety of tools, including artificial intelligence, subject to applicable law, to protectively detect accounts engaged in malicious patterns of behaviour instead of scanning private messages”. The measures the company has put in place to protect vulnerable groups include, for example, restricting adults from messaging minors who do not follow them on the platform. The Home Secretary, on the other hand, does not believe these measures are sufficient: under current arrangements, she notes, the government arrests about 800 paedophiles a month.
The UK’s Online Safety Bill
As paradoxical as it is to refuse to compromise on user privacy while promising to deliver on consumer safety, this has not stopped the government from passing the UK’s Online Safety Bill, a reform to existing legislation that has been years in the making. The bill chiefly aims to make access to harmful material difficult (though not impossible) for minors within the territory. This includes pornographic material as well as other harmful content, such as content that promotes suicide or suicidal ideation, eating disorders, body image issues and the like. The bill is in no way all-encompassing and simply places a ‘duty of care’ on social media service providers to ensure that online spaces are safe for vulnerable groups. The legislation promises to make the UK the “safest place to be online”, but the simple passing of a bill does not mean that the necessary changes will take place automatically. Effecting these changes falls to the UK’s Office of Communications (Ofcom), which must now go through the far more difficult process of turning the bill into something that can be enforced in the online space.
This is of course where the above-mentioned paradox comes into play – the bill has even been criticised by human rights groups as a potential conflict with the fundamental right to privacy. Decrypting personal messages for any party other than the recipient completely defeats the point of encrypting them in the first place. On the other hand, the Internet Watch Foundation also points out that the material frequently in circulation does not necessarily originate from, or reside on, websites or IP addresses in the UK, which makes the content especially difficult to eliminate. Sadly, content featuring the abuse of minors is often self-generated, another factor that makes such content difficult to regulate. Any discussion of the suitability of allowing end-to-end encryption should also consider whether the encryption question alone addresses all these facets of an unavoidably multidimensional issue.
(Theruni Liyanage)