Children left to rely on ‘instinct’ to protect themselves from online harm, new study finds
Children often rely on their own instincts when faced with threats to their safety online, such as explicit content or online grooming, a new study reveals.
The findings uncover an uncomfortable truth about how children are left to their own devices when navigating the digital world, increasing their vulnerability to harm.
Conducted by non-governmental organisations ECPAT International, Eurochild and Terre des Hommes Netherlands, the study involved focus group discussions with 483 children from 15 countries, including ten EU member states.
Many of those children said they prefer to keep their online activities to themselves and struggle to talk with adults about the risks they face online. Others said they filter what they tell their parents and caregivers about the harms they encounter.
These harms include cyberbullying, violent content and negative mental health experiences. But across all 15 countries studied, online sexual abuse and exploitation – such as grooming, self-generated sexual material and live-streamed child sexual abuse – constitute the single biggest threat to minors.
“We see that children feel very alone in ensuring their safety from child sexual abuse and exploitation. And of course, this is an enormous responsibility,” said Eva Notté, technical advisor on child exploitation for Terre des Hommes Netherlands.
“But we see that through their own behaviour, they’re trying to self-censor what they do. They try to look out for risk, but they really lack the necessary tools and information to effectively navigate the online world,” she added.
The report comes amid gridlock in the EU institutions on a planned new law to crack down on child exploitation online by using emerging technologies to detect new and existing child sexual abuse material (CSAM) and child grooming activities.
The law has faced stiff opposition from digital privacy advocates, who claim that allowing platforms to snoop on content would be a grave infringement on the right to privacy online.
But the NGOs say the study highlights the urgent need for EU countries to find a compromise so that legal guardrails are in place to make the internet safer for kids.
“There is an acute need for regulatory frameworks that actually put that responsibility and that burden not on children, but on online service providers,” Fabiola Bas Palomares, policy and advocacy officer for Eurochild, explained. “We have to work together to ensure that children are protected from online child sexual abuse.”
The report also comes amid mounting concern about the use of AI to generate deepfake child sexual abuse material.
According to the EU’s Joint Research Centre, a large portion of such abusive content is generated by adolescents themselves, a sign that children also need to be educated about the dangers of creating and disseminating abusive content.
Platforms’ role key
As the study’s findings were unveiled in Brussels on Monday, the NGOs called on digital platforms to step up and do their part in fighting illegal content that puts children’s safety at risk.
Speaking to Euronews, Tomas Hartman, senior public policy manager at Snap Inc., said the company and its app Snapchat – which has some 102 million registered users in the EU – stand ready to play their part in cracking down on the online sexual exploitation of children.
“We are well aware that our app is used by a lot of young people, and that’s why the safety and privacy of our users is our key priority, and especially for minors,” Hartman said, listing several safeguards Snapchat has introduced to protect teen users, such as limiting contact settings to friends and phone contacts, and turning off location-sharing by default.
Hartman also said that the planned EU law to tackle CSAM is “absolutely crucial” for Snapchat.
“It’s important for us to be able to proactively scan for this known CSAM material. And we have reliable technologies to do that: we use PhotoDNA for the pictures,” he explained. “This is our utmost priority.”
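In practice, scanning for known material means comparing a compact “fingerprint”, or hash, of each uploaded image against a database of hashes of previously identified abuse imagery, so the platform never needs to hold copies of the images themselves. PhotoDNA is proprietary, so the sketch below is only a simplified, hypothetical illustration of that hash-matching idea: the hash value, file name and function names are invented, and a real system would use a perceptual hash robust to resizing and re-encoding rather than the exact-match SHA-256 used here to keep the example runnable.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for an industry database of known-image hashes
# (real deployments rely on lists maintained by clearinghouses such as
# NCMEC). PhotoDNA's perceptual hashes tolerate cropping and
# re-encoding; SHA-256 only matches byte-identical files.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def fingerprint(path: Path) -> str:
    """Return the hex SHA-256 digest of an image file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """True if the upload's fingerprint appears in the known-hash database."""
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    upload = Path("incoming_upload.jpg")  # hypothetical upload
    if upload.exists() and matches_known_material(upload):
        print("Match found: flag for human review and reporting.")
```

Because only fingerprints are compared, this approach detects previously identified material without inspecting the meaning of private content, which is why platforms describe it as more privacy-preserving than scanning for new, unknown imagery.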
The Snapchat app, used predominantly by young people to share images that disappear after they’ve been viewed, requires users to be at least 13 years old and offers additional privacy settings for those aged 13 to 17. It has come under scrutiny for failing to keep underage users off the platform.
Snapchat, along with Meta, received a request for information from the European Commission last November about the measures they are taking to “comply with their obligations related to the protection of minors.”
One of the concerns is the ‘My AI’ chatbot available to Snapchat users, which is powered by OpenAI’s ChatGPT. On its website, Snapchat acknowledges that the chatbot “may include biased, incorrect, harmful, or misleading content.”