On International Women’s Day, it’s time to change the narrative on online abuse
If social media will not voluntarily protect women from online harms, the government’s response must adequately address the issues women face every day online.
Social media can be a force for good; today we’ll see so many posts about the women who inspire others and break down boundaries. But what about the online harms that women face every day? What are social media companies doing to protect the women they purportedly support?
The BBC’s Global Disinformation Team recently reported on the sharing of women’s intimate images without their consent around the globe. The emerging social media network of choice for these wrongs: Telegram.
Are social media companies doing enough to protect women?
Telegram, along with many other social media networks, has facilitated the sharing of intimate images online. The network has developed a reputation for being particularly pro-privacy and pro-free speech for its account holders, apparently to the detriment of women’s rights. Most other mainstream social networks have policies on intimate image abuse, promising that, if the images are reported, they will remove them.
How easy is it to take down images on Telegram?
Telegram does take some complaints seriously, removing content that infringes copyright, for example, or terrorist material. The copyright route is particularly helpful where a victim of intimate image abuse took the images themselves and therefore owns the copyright in them.
Strikingly, in these areas of the law, in various jurisdictions, a host that fails to remove the content may start to accrue liability for hosting it. Is the publication of women’s intimate images without their consent simply an unfortunate by-product of the host’s stance on freedom of speech?
It is open to the government to legislate to incentivise social media companies - including Telegram - to do more. The Online Safety Bill, a subject of hot debate at present, had its beginnings in a 2019 White Paper which argued for tougher regulation of social media platforms because, quite simply, the companies did not voluntarily do enough to protect users. The Bill proposes that OFCOM will be given responsibility for determining the form social media companies’ duties of care will take, and for taking initial enforcement action. Non-compliant platforms could face a fine of up to 10% of global annual turnover or £18 million, whichever is greater.
The Bill in its current form has been criticised, including for not adequately addressing violence against women and girls. It is expected that, following recommendations published by the DCMS Committee in January 2022, the updated Bill will address these gaps, but it remains to be seen how much further the Bill will evolve in the coming weeks and months.
That’s not to say that there isn’t also law coming specifically on intimate image abuse. The Law Commission conducted a consultation on intimate image abuse in 2021, considering to what extent the current criminal law adequately addresses it in the context of smartphones and online platforms that have made it ever easier to share intimate images – from those taken without consent to deepfakes and images maliciously altered in photo-editing software. A report from the consultation is expected this summer, and it is hoped this will lead to more effective policy in this area of criminal law. We contributed our expertise to this consultation process.
For now, there is limited redress for victims who have had their intimate images shared on Telegram, and social media companies stand shoulder to shoulder in continuing to take a hands-off approach.
And what about when it comes to identifying those who are sharing the images online?
Telegram in its FAQs proudly states that “To this day, we have disclosed 0 bytes of user data to third parties, including governments.” Its stated core value is protecting individuals from privacy risks, but in the same breath, shielding wrongdoers from accountability has become part of its brand. In this age of misinformation and online wrongs, that position makes it harder for women who have been victims of intimate image abuse, or indeed anyone who has been the victim of online fraud, to trace the original wrongdoer.
Telegram states that it is “Protecting your private conversations from snooping third parties, such as officials, employers, etc.” Unfortunately, these “snooping” third parties are often law enforcement, or victims of abuse trying to enforce their legal rights. At best, this shows disregard for whether user-posted content is lawful, and it emboldens wrongdoers.
There are mechanisms in the civil courts to obtain disclosure against social media companies of information they hold about those who have infringed a victim’s rights. However, post-Brexit, the legal position on civil disclosure against parties outside the jurisdiction means that any application faces difficulties; this is particularly relevant given that most social media networks are not based in England and Wales. Telegram holds its data across various jurisdictions, which makes the disclosure process that much more costly and lengthy. Such claims will often require local lawyers to be instructed, and in these cases the victim has to stump up costs that often run into the thousands. Even then, obtaining an order may only reveal an IP address or an email address, requiring yet another application to obtain more information. If the individual identified as the wrongdoer is based in another jurisdiction, further challenges follow. We are applying a single-jurisdiction approach to the law to a problem that demands a multi-jurisdictional answer. Redress is a long-haul journey with limited prospects of success for victims of online abuse, and of intimate image abuse in particular.
It is disappointing that the government hasn’t yet addressed this, or the identification of anonymous individuals who commit online harms. It is not a novel consideration: in Australia, the draft Social Media (Anti-Trolling) Bill 2022 proposes unmasking obligations under which social media companies would have to provide contact details for users who have made defamatory comments. The UK Government’s proposal has taken steps forward, but many believe it still lacks the teeth needed to protect those who use the internet, particularly women.
It’s time for the law to catch up with the internet, and that starts with the government legislating to protect all women. We wait to see what the UK government will do next, with consultations and Bills pending. Action rather than words would be the greatest practical support this International Women’s Day.
Words by Jess Alden and Georgie Lyon.