FiLiA’s statement on the Bertin Review and the Government response to it

CONTENT WARNING: This statement describes some distressing content. Unfortunately the same sort of content warning won't be found on the platforms described here, which are being accessed by children.

Baroness Bertin’s Independent Pornography Review - Creating a safer world: the challenge of regulating online pornography (known as the Bertin Review) was a welcome and long-awaited step towards ending the harm done to women, directly and indirectly, by the creation of pornography. 

FiLiA is clear that there is no such thing as ‘safe’ pornography. The sexual objectification of women and girls is directly linked with violence against them and cannot be separated. Porn is not simply a fact of life; it is private companies profiting from the control, degradation and harm of women. The current lack of regulation includes actual crimes filmed then shared for profit, as well as graphic depictions of crimes like incest, strangulation, child sexual abuse, sexual violence and rape. 

However, the review recommends some potentially useful tools to reduce the level of harm caused in the production of some pornography and to restrict the availability of content. And the Government response makes clear that the issue is being taken seriously, especially with regard to illegal acts and the harm to children. 

We are reassured to see the Government note the link between pornography and violence against women and girls (VAWG) and that it has clearly identified the urgent need to tackle the harms of pornography if it is to achieve its mission to halve VAWG in a decade.

We would now like to see a further update on its plans to protect all women, with efforts to challenge not only violent and misogynistic content but all forms of pornography that objectify, exploit and degrade. This must explicitly include non-‘professional’ pornographic platforms, such as OnlyFans, alongside the major online platforms when applying the recommendations. 

We would also welcome clarity on the proposed sanctions framework for regulatory violations. This must be effective and the level of sanctions proportionate to the significant profits made by this industry. Funds from sanctions should be ring-fenced for services addressing and reducing violence against women and girls, and contribute to the Government’s target to halve VAWG. 

The Government should commit to measures specifically targeted at protecting and supporting the women within the industry and particularly to tackle the mental and physical harm they are suffering. This support should include options for those women who wish to exit the industry, which take into account the specific traumas and issues they face.

We call for a clear timeline for action, setting out exactly where responsibility will sit and how laws will be developed, enforced, regulated and policed effectively. We stress that all policies reviewed or developed in response to this report must each be subject to a thorough Equality Impact Assessment. This will help to ensure that particular consideration is given to the voices and experiences of Black and minoritised women, women with disabilities, and migrant and trafficked women within the porn industry.

We provide a more detailed response to the Review’s recommendations, below.

FiLiA responses to the detailed review recommendations (quoted as bullet points below)

  • Violent, harmful and misogynistic pornographic content, which is illegal to distribute in physical formats, should also be treated as illegal content on online platforms.

  • Non-fatal strangulation pornography (commonly known as ‘choking’ in pornography) should be illegal to possess, distribute and publish. 

  • The non-consensual ‘taking’ and ‘making’ of intimate images – whether real or deepfake – should be made an offence.

We do not believe there is a ‘safe’ form of pornography so restrictions to what can be shown, and any proposed code, should be branded ‘safer’, rather than ‘safe’. 

We agree that material that is illegal to distribute in the offline world should be illegal online. Not only that, but behaviour that is illegal in the real world should also be illegal online. We know that men seek to mimic what they see online in their own sex lives, so any illegal behaviour, including racism and disablism, should be banned. 

There is no safe way to strangle someone, and this behaviour should be made illegal in line with the ‘real world’ offence. We are pleased to see intimate image abuse (IIA) included in the Crime and Policing Bill this year. This offence is committed overwhelmingly against women and girls, and its motivation is harassment and intimidation. It is important that the law covers both real and AI-generated images, as the impact on women and girls is equally traumatic.

We challenge the reliance on ‘consensual’ content. Many exited porn performers have detailed the force, coercion and manipulation used in order to get them to sign consent agreements. They have also shared experiences of abusive and non-consensual behaviour during filming which they were powerless to stop.

  • Illegal pornography offences should be accurately tracked in the police database and a nationally agreed and consistent approach should be implemented across police forces in the UK to better record incidences of these crimes. 

  • Platforms should develop a campaign to raise public awareness about intimate image abuse – how to spot it and report it and where to seek support.

It is essential that these crimes are accurately recorded and tracked, including disaggregation by sex and race to enable effective monitoring of trends. Persistent offenders should be regarded as high risk and monitored accordingly. 

We agree a pilot phase would be helpful, to identify any practical issues before wider roll-out. Lack of key data is a common issue with VAWG-related crimes so testing the approach, and educating officers on the importance of accurate data, will be critical.

Any awareness campaign should run both offline and online. It must be easy to report incidents of IIA, and reports should receive a quick response, including information on where to get support and what to do if not satisfied with the response.

  • Resources and funding should be focused on school and community programmes, specifically for boys and young men, in order to encourage healthy discussions about positive masculinity and relationships, and to counter misogynistic culture. 

  • Clearer guidance should be given to schools on the role of teachers and staff in preventing harmful sexual behaviours.

  • An online space for parents and carers should be created, to access easy to understand information on how to talk to their children about pornography and its impacts.

Quality, evidence-based education can be effective in addressing misogyny and the harms of porn culture. Programmes should help young people understand the impact of pornography on developing sexualities.

Teachers should be equipped and supported for this work, including a recognition of the difficulty faced by female teachers discussing difficult topics in classrooms full of almost-adult men.

School staff should be trained to spot the warning signs of grooming and other online risks, and create safe spaces for children to talk about them.

We endorse the work of Culture Reframed and feel this should be part of wider work in schools to address misogyny and sex inequality. 

We would welcome easy-to-access support for parents and carers to normalise open conversation around respect at home as well as at school. It will be important that parents are helped to understand what their children could be accessing and how, so that they are able to put protective measures in place even as technology advances.

  • A separate body should conduct content audits, to ensure platforms hosting pornographic content are tackling illegal and prohibited content effectively.

We agree that this would be important work. Any such body must have enhanced training from groups supporting those harmed by the porn industry to understand the impact of that harm. This must be funded by those profiting from the porn industry. Technology companies must be mandated to cooperate with the authorities to detect and remove harmful content, including deepfake pornography.

  • An accreditation scheme should be set up, so that it is clear to the public, government, banks and payment providers, which companies are compliant with regulation tackling illegal and prohibited pornographic content online.

  • Industry should collaborate on a ‘watch-list’ of types of pornographic content which are restricted, or purposely made harder to find, so that it is only available to users if they intentionally seek it out.

We cannot support a scheme which promotes ‘good’ porn as we believe the porn industry and its blatant sexual objectification of women is harmful and affects the way men and society treat and value women. There may be companies that could claim to be ‘industry compliant’ but that is quite distinct from ‘good’.

Content such as incest and teen pornography is far too accessible and should not be on home pages. We challenge anyone to justify where it would ever be appropriate. Of course, content that simulates sex with children or between relatives should – as in the offline world – be illegal.

  • Increased, effective and quick business disruption measures across the ecosystem of pornography […] should be in place to ensure swift removal of illegal and legal but harmful pornographic content. 

We agree it is essential that illegal or harmful content can be identified and permanently removed swiftly from content holder and ancillary sites. If there is doubt, it should be removed until its status is clear. 

Platforms should be required to submit regular reports on actions taken, for review by a relevant authority, which could impose additional measures in the event of failure to comply. There must be a clear timeframe for action and substantial financial sanctions for non-compliance. Senior management of companies should also be held personally responsible for multiple breaches. Any funds raised through these sanctions should be ringfenced for specialist VAWG services. 

  • The aims, priorities, and capacity of current regulators should be reviewed by government, with a view to ensuring online safety is regulated by a single, focused regulator.

  • The Advertising Standards Authority (ASA) and Committees of Advertising Practice (CAP) should review its approach to advertising on online pornography sites.

We agree a single regulator could develop the specialist knowledge and understanding to deal with this area of work. The ASA should prioritise regulation of advertising on these sites.

  • An ombudsman or Commission should be set up to receive reports and give support following incidents of intimate image abuse (IIA), abuse, control, coercion and trafficking in the pornography sector. 

  • Specialised training should be given to ensure support services are equipped to effectively support victims of intimate image abuse.

We know that many victims are unsure where to find help, and we believe that the police are currently not equipped to deal with these issues, especially IIA, so a Commission would provide a clear point of contact. It should be a compassionate, female-led service, so that women feel more able to access it. 

Even more welcome would be additional training (and services) for anyone who has been exploited by the sex industry. Often women experience stigma and misunderstanding, and for them to recover from the harm they have been subjected to it is essential that specialist services which understand their specific needs are resourced to respond to them.

A separate body would also gather significant and useful information from which to learn and improve processes. It would improve the collection of sex-disaggregated data (which could be shared globally), assessing the prevalence and adverse effects of online sexist and sexual violence.

  • Urgent action should be taken to better understand the links and prevalence of human trafficking in pornography to guide future policy and law enforcement response on this issue.

  • Those working in the sector should not be vulnerable to financial exploitation or illegitimate ‘de-banking’.

We must liaise with partners across Europe and globally to ensure a joined-up approach to this crime. New technologies and digital platforms play a central role in the trafficking of human beings, particularly for the purpose of sexual exploitation. The digitalisation of pimping and victim recruitment via social networks and online ad sites has considerably strengthened traffickers' ability to operate transnationally while evading legal authorities.

We would not want any woman working in the industry to experience discrimination from her bank. However, we feel it is necessary to pay attention to the risk of sexual exploitation that these women face. Many women enter the porn industry due to poverty or to pay off a debt, and this applies equally on other pornographic platforms, such as OnlyFans. We would support extending the definition of pimping to the act of profiting from any form of paid sexual act, whether online or offline, to cover all circumstances of exploitation, including ‘camming’.

  • Further consultation should be undertaken to understand whether problematic pornography use (PPU) should be formally recognised as an addiction.

  • Mental and physical health impacts of pornography should be recognised and represented in existing health strategies.

The creation of pornographic content causes mental, physical and emotional damage. Health professionals across various disciplines should be supported to offer accurate, evidence-based and non-judgemental care, both to those working within the industry and to those consuming pornography. Knowledge about the wide-ranging negative health effects of porn use should also be shared with the wider public, so that individuals can make informed decisions.

  • Companies that host pornographic content should have consistent safety protocols, processes and safeguards in place to ensure that all performers/creators are consenting adults, are of age (18+), and have not been exploited or coerced into creating content.

  • There should be clear and standardised processes across the sector to enable performers and creators to withdraw consent and to have content they appear in removed from sites.

  • Platforms that host pornographic content should have robust protocols and processes to prevent and respond to stolen content. This should include easy reporting and removal of content stolen from performers.

While we understand the difficulty of applying such safety measures, we agree they are important. Age verification is an absolute minimum standard.

We completely agree that performers and creators should be able to withdraw consent at any time. Implementing this may well be difficult, but difficulty must not prevent the attempt. 

Regulators must keep up to date with new methods of sharing so that technological advances don’t render measures useless. Every measure to identify and delete such content should be taken swiftly. Platforms must provide tools that enable anonymous reporting, and respond swiftly to alerts.

  • The current criminal justice response is ineffective in tackling illegal pornography online. Government should conduct its own legislative review of this regime to ensure that legislation and Crown Prosecution Service (CPS) guidance is fit-for-purpose in tackling illegal pornography in the online world.

  • Pornographic content that depicts incest should be made illegal.

We welcome Lady Bertin’s recognition of how ineffective the criminal justice system has been in responding to these crimes. This is compounded by the lack of trust in police following issues raised in both the Casey and Angiolini reviews. 

Like many crimes against women, uploading stolen or AI-generated content and producing so-called ‘extreme’ pornography is an offence that is rarely punished. For these measures to be meaningful we need to see robust prosecutions and tough sentencing, so there is a clear deterrent. Survivors of harm should be consulted within the proposed review.

There is no place for any illegal behaviour online, including incest. 

  • Platforms should prevent individuals who have previously uploaded illegal pornographic content, IIA or CSAM, from uploading further content on platforms which accept user-generated and uploaded content.

  • Companies hosting pornographic content should be required to use proactive technology to identify and remove intimate image abuse (IIA) content.

  • Government should urgently explore what proactive technology could be most effective to identify and tackle deepfake/AI-generated IIA and CSAM.

  • ‘Nudification’ or ‘nudify’ apps should be banned.

  • Creators of AI models should build safety mechanisms into AI tools that allow sexually explicit content, to ensure illegal pornographic content and CSAM material is not created.

We agree that all measures should be taken to address illegal content and IIA, whether real or AI-generated. IIA has a hugely harmful effect on those who have been targeted by it, and its only purpose is to degrade, intimidate and harass. Similarly, the only purpose of ‘nudify’ apps is to humiliate and harass women.

AI developers should also clearly indicate artificially generated content, to prevent any confusion or manipulation.

  • Clear policy ownership and responsibility across Whitehall on pornography needs to be decided, with one department having oversight on pornography policy.

  • A global coalition should be formed between civil society, academia, governments and industry to work together on tackling global issues and harms from online pornography.

A cross-Government approach is needed, and the issue must be treated as the global problem that it is. That said, a single point of ownership, for example within the VAWG Strategy, will allow proper accountability. 

We would encourage Government to work closely with international partners who are also developing law and practice to tackle this facet of the epidemic of violence against women. 

FiLiA Male Violence Against Women and Girls and
FiLiA Legacy Project – Campaigns and Policy Teams