The UK government has said it could legislate to require age verification checks on users of dating apps, following an investigation into underage use of such services published by the Sunday Times yesterday.
The newspaper found that police have investigated more than 30 cases of child rape linked to the use of dating apps, including Grindr and Tinder, since 2015. It reports that one 13-year-old boy with a profile on Grindr was raped or abused by at least 21 men.
The Sunday Times also found 60 further instances of child sex offences related to the use of online dating services — including grooming, kidnapping and violent assault, according to the BBC, which covered the report.
The youngest victim is reported to have been just eight years old. The newspaper obtained the data via freedom of information requests to UK police forces.
Responding to the Sunday Times’ investigation, a Tinder spokesperson told the BBC the company uses automated and manual tools, and spends “millions of dollars annually”, to prevent and remove underage users and other inappropriate activity, adding that it does not want minors on the platform.
Grindr also reacted to the report, providing the Times with a statement saying: “Any account of sexual abuse or other illegal behaviour is troubling to us as well as a clear violation of our terms of service. Our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app.”
We’ve also reached out to the companies with additional questions.
The UK’s secretary of state for digital, culture, media and sport (DCMS), Jeremy Wright, dubbed the newspaper’s investigation “truly shocking”, describing it as further evidence that “online tech firms must do more to protect children”.
He also suggested the government could expand forthcoming age verification checks for accessing pornography to include dating apps — saying he would write to the dating app companies to ask “what measures they have in place to keep children safe from harm, including verifying their age”.
“If I’m not satisfied with their response, I reserve the right to take further action,” he added.
Age verification checks for viewing online porn are due to come into force in the UK in April, as part of the Digital Economy Act.
Those age checks, which are not without controversy given the significant privacy implications of creating a database of adult identities linked to porn-viewing habits, have also been driven by concern about children’s exposure to graphic content online.
Last year the UK government committed to legislating on social media safety too, although it has yet to set out the detail of its policy plans; a white paper is due imminently.
A parliamentary committee which reported last week urged the government to put a legal ‘duty of care’ on platforms to protect minors.
It also called for more robust age verification systems. So it remains at least a possibility that some types of social media content could be age-gated in the UK in future.
Last month the BBC reported on the death of a 14-year-old schoolgirl who killed herself in 2017 after being exposed to self-harm imagery on Instagram.
Following the report, Instagram’s boss met with Wright and the UK’s health secretary, Matt Hancock, to discuss concerns about the impact of suicide-related content circulating on the platform.
Last week, after the meeting, Instagram announced it would ban graphic images of self-harm. Earlier the same week the company had responded to the public outcry over the story by saying it would no longer allow suicide-related content to be promoted via its recommendation algorithms or surfaced via hashtags.
Also last week, the government’s chief medical advisors called for a code of conduct for social media platforms to protect vulnerable users.
The medical experts also called for greater transparency from platform giants to support public interest-based research into the potential mental health impacts of their platforms.