LONDON (Reuters) – Britain proposed new age-check guidance on Tuesday to protect children from accessing pornography online, including a suggestion to use AI-based technology to estimate whether a viewer appears to be of legal age.
The government's newly passed Online Safety Act requires sites and apps that display or publish pornographic content to ensure that children are not normally able to encounter pornography on their service. The legal age to view pornography in Britain is 18.
On average, children first see online pornography at age 13, while nearly a quarter come across it by age 11, and one in 10 as young as nine, according to a 2021-2022 study by the Office of the Children's Commissioner for England.
"Regardless of their approach, we expect all services to offer robust protection to children from stumbling across pornography, and also to take care that privacy rights and freedoms for adults to access legal content are safeguarded," media regulator Ofcom CEO Melanie Dawes said.
The regulator described its suggestion on facial age estimation as using AI to analyse a viewer's features. That would likely require taking a selfie on a device and uploading it.
The watchdog said its proposed guidance also included photo identification matching, requiring a user to upload a photo ID such as a passport or driving licence to prove their age, as well as credit card checks.
Another suggestion was open banking, whereby users can consent to their bank sharing information with online porn sites to confirm they are over 18.
The Institute of Economic Affairs, a free-market think tank, said mandatory age verification threatened user privacy and would expose users to breaches and abuse by increasing the amount of sensitive data held by third parties.
The regulator said weaker methods such as self-declaration of age, online payment methods that do not require a person to be 18, and disclaimers or warnings, would no longer meet the standards in its new guidance.
Ofcom said it expects to publish its final guidance in early 2025.