As the controversial Online Safety Bill (“OSB”) makes its way through the House of Lords, we consider what implications it might pose for the online art world.
Introduced in March 2022, the stated purpose of the OSB is to ‘make the UK the safest place in the world to be online while defending free expression’. There has, however, been complex disagreement over how to strike an appropriate balance between those twin priorities.
Representing one side of the argument, Sir Peter Wanless, Chief Executive of the NSPCC, has argued that enacting the OSB is vital to help protect children from online harm. That argument has gained momentum in the wake of the Molly Russell Inquest and the Coroner’s finding that ‘the negative effects of on-line content’ contributed to the 14-year-old’s death. Representing the other side of the argument, former Supreme Court justice, Lord Sumption, has highlighted what he calls the bill’s ‘stratospheric vagueness’ and labelled elements of it ‘a patronising abuse of legislative power’. Lord Sumption’s concerns (expressed in August 2022) may have been at least partly assuaged by recent amendments to the bill. Nevertheless, the OSB remains subject to a range of conflicting views.
What does the OSB actually do?
The OSB is wide-ranging. Its latest iteration runs to 247 pages, with 212 sections and 17 schedules. In essence though, it creates a broad range of duties with which some Internet service providers (such as social media companies) will be obliged to comply.
Those duties relate to, inter alia, how Internet companies deal with two categories of content on their platforms: the ‘illegal’ and the ‘harmful’. By way of example, all regulated services would be under a duty to carry out risk assessments in relation to ‘illegal content’ that might appear on their platforms (s.8) and then adopt measures aimed at mitigating the risks identified (s.9). Services that are ‘likely to be accessed by children’ would also be required to undertake risk assessments in relation to ‘content that is harmful to children’ (s.10) and then take steps to protect children from the potential harm (s.11). The meaning of ‘content that is harmful to children’ is to be set out in supplementary regulations made by the Secretary of State (s.54).
At the same time, when deciding on, and implementing, safety measures and policies, regulated services will also be under a duty to ‘have particular regard to the importance of protecting users’ right to freedom of expression within the law’ (see s.18(2)).
Enforcement of these complex duties will fall to the regulator, Ofcom, and the consequences of failing to comply with them are potentially significant. For example, non-compliant service providers could face fines of up to 10% of worldwide revenue (sch.13, para.4) and/or court-imposed ‘service restriction orders’ (s.131). In recent weeks there has also been a move by a number of Conservative backbench MPs to introduce an amendment that would create criminal liability for failure to comply with a relevant child protection duty. If the amendment becomes law, tech executives found to be complicit in the failure could face up to two years in prison.
What might the OSB mean for the artistic community?
In recent years, technological development has redefined the ways in which art is created, viewed, marketed and traded. However, the movement of art into the online realm has not always been a seamless process. For example, there has been a well-publicised issue around social media companies censoring art that contains nudity. The issue stems from the conceptual difficulty of distinguishing between art and pornography. While social media companies have determined to keep their online spaces free of explicit content, they also recognise that a blanket ban on nudity is neither desirable nor practical. They have therefore, in effect, created two categories of nudity: the ‘permitted’ and the ‘not permitted’. For example, Facebook has made clear that posting ‘photographs of paintings, sculptures and other art that depicts nude figures’ is permitted (Instagram makes similar allowances). However, distinguishing between these amorphous categories in a reasoned manner would be challenging for even the most conscientious human moderator; when left to an algorithm it becomes all but impossible.
That reality is borne out by the various examples of both Facebook and Instagram failing to abide by their own policies. For example, in February 2018, Facebook removed an image of The Venus of Willendorf, a 30,000-year-old nude statue that forms part of the collection of the Naturhistorisches Museum in Vienna. Similarly, in September 2019, Instagram deleted a post by the Palazzo Strozzi in Florence because it contained female nudity – the offending image was the cubist nude, A Model (Against a Blue Background, 1910), by the Russian painter Natalia Goncharova. Lastly, as if to demonstrate the scale of the problem, in March 2020, a group of artists, collectors and human rights organisations came together to form Don’t Delete Art, a platform to promote the work of artists who have experienced online censorship. At the time of writing, Don’t Delete Art’s online gallery publishes the work of 57 visual artists whose work has been suppressed by more mainstream platforms. While the issue described has garnered most attention in relation to nudity, it’s easy to see how it could also impact art that employs violent imagery or that addresses other challenging themes.
To date then, social media companies and their algorithms have been the arbiters of what artists can and cannot publish on their platforms. They have had the freedom to tread their own line between promoting free expression and protecting users from what some see as harmful content. The question we are now faced with is what happens when some of that autonomy is wrested away and regulation becomes more centralised.
Instinctively we might think that more stringent regulation will lead to less freedom for those sharing their art online. When faced with the possibility of enormous fines and/or criminal sanctions, it seems likely that Internet companies will err on the side of caution and, if anything, overregulate.
However, as noted above, there are specific freedom of expression duties built into the OSB that could see companies overregulate at their peril. It’s possible that those safeguards will force companies to develop more nuanced approaches to filtering content on their platforms. For example, will they be forced to rely more on human moderators, or perhaps to invest in developing mechanisms that are better at distinguishing between art and pornography?
Ultimately, we won’t know where the line is to be drawn between these competing duties until some time after the OSB comes into force. Only when we know what ‘content that is harmful to children’ means and when Ofcom has published its ‘codes of practice’ (in accordance with OSB ss.36-48) will we have an understanding of what the new regime means for freedom of artistic expression online. In that sense the OSB represents a leap into the unknown.
In respect of the ‘illegal content’ category, we might expect that this wouldn’t be of any concern to the artistic community. However, within the definition of ‘illegal content’ there is a slightly surprising reference to the Obscene Publications Act 1959 (“OPA”) that might, at least fleetingly, raise a few eyebrows.
At one time, artists (literary or visual) may have held legitimate concerns about their work catching the attention of the censors and falling foul of the various obscenity laws that existed under the law of England & Wales. However, such laws have now either been abolished or fallen into desuetude. It is therefore something of a surprise to find reference to the OPA in this decidedly modern piece of legislation.
The OPA is probably best known as the statute at the heart of the Lady Chatterley’s Lover trial (1960). Its origins can be traced to the 17th Century and it creates the offence of publishing an item that is ‘obscene’, i.e. one that ‘tend(s) to deprave and corrupt persons who are likely… to read, see or hear… it’ (OPA s.2). This offence has been little used in practice and has attracted all manner of criticism for its uncertain scope. Grappling with the ‘deprave and corrupt’ test in 1972, Lord Wilberforce remarked that it involved ‘deep questions of psychology and ethics’. He questioned whether the Act would ‘continue to be workable…’. In 1979, the parliamentary Williams Committee on Obscenity and Film Censorship concluded that obscenity law was ‘in short… a mess’.
So, do photographers, filmmakers, painters or writers need to worry about their work being deemed ‘obscene’ by an algorithm and potentially facing criminal prosecution? The short answer is – almost certainly not.
The combined effect of s.53 and paragraph 1 of Schedule 6 of the OSB is to require service providers to police against content that infringes s.2 of the OPA. As suggested above, determining whether a publication ‘depraves and corrupts’ is a deeply problematic task. Requiring social media companies and their imperfect algorithms to perform it is unrealistic and unreasonable. Implicitly recognising this, the OSB adds a gloss to s.2 of the OPA. What paragraph 1 of Schedule 6 actually says is this: ‘[illegal content means content that amounts to]… An offence under section 2 of the Obscene Publications Act 1959 relating to an obscene article tending to deprave and corrupt others by encouraging them to commit an offence specified in paragraph 2, 4, 5, 7 or 8’ [emphasis added] (“the OSB gloss”). Paragraphs 2, 4, 5, 7 and 8 set out various offences relating to child sexual abuse.
The OSB therefore effectively changes the meaning of the OPA. It defines an obscene item as one that ‘encourages the commission of a child sexual offence’ and it does so in order to circumvent the unmanageable ambiguity of the ‘deprave and corrupt’ test. However, this new test may give rise to a number of problems of its own. For example, how do you establish that a particular publication encourages others to commit a child sexual offence? And will there now be an arbitrary distinction between what ‘obscenity’ means for online publications and what it means for offline publications? It should also be noted that the OPA contains a defence for publications that are deemed to be ‘in the interests of science, literature, art or learning…’ (s.4). It is not clear from the wording of the OSB whether this defence has any part to play in the determination of whether something constitutes ‘illegal content’. For example, if something posted to Facebook is flagged because it might encourage the commission of a child sexual offence, does Facebook then need to consider, for example, whether it is in the interests of science, literature, art or learning?
From a freedom of expression perspective, the OSB gloss must be welcomed. It ensures that Internet companies won’t have to wade into the legal quagmire that is the ‘deprave and corrupt’ test. However, it is only necessary because of what looks like an ill-thought-out and unnecessary reference to the OPA.
It is perhaps a little surprising that, after all the debate and the drafting, we still don’t really know what impact this piece of legislation will have. The extent to which it is able to both improve online safety and protect freedom of expression remains to be seen, and everyone who wishes to share their content online, the art world included, will need to pay close attention when its supplementary materials are published and its provisions start to bite.
Image credits: The Venus of Willendorf, via Wikimedia Commons, CC 4.0, user Jakubhal.