It’s all about Children!


From left to right: Chloe Bennett, eSafety Commissioner, Australia; Edward Wee, IMDA Singapore; Anna Lucas, Ofcom, UK; Niamh Hodnett, DSC, Ireland; Tayyba Mahmood, Ofcom (Moderator)

We are the new kids on the block, taking part for the first time in TrustCon, an international conference on trust and safety in the digital world. And we are excited to see how prominently children's best interests in the digital environment feature among this highly technology-oriented crowd.

Around 1,400 people from all over the world have gathered in San Francisco for a three-day event packed with sessions addressing the challenging work of keeping platforms and communities safe. Conference participants create an enduring and supportive community focused on the practice of trust and safety, exploring success stories, lessons learned, and the future of the field.

With regard to child online safety, a large number of sessions were held on 21 July, the first day of the conference, giving platform providers and regulatory authorities the opportunity to exchange perspectives. For example, Leah Buck from the UK Home Office, Deborah Welsh from the Australian eSafety Commissioner and Erin Kennedy from ChildFund International discussed prevention and awareness efforts with Snapchat's Safety Lead Viraj Doshi, who emphasized the importance of nuanced awareness-raising campaigns. With AI-generated child sexual abuse material and sextortion based on deepfake nudes on the rise, the panel stressed the importance of diverse partnerships to improve the situation.

This serious phenomenon also took centre stage in the session 'Combating financial sextortion of minors'. The representatives from Meta, Snap and PayPal who took part in the discussion agreed that it is both sensible and expedient to work together. By sharing findings and information, they can identify perpetrators who operate across platforms and pass the relevant information on to law enforcement authorities.

The following session, titled “Protecting Children Online: What regulators expect from You in 2025”, referred to 2025 as the year when compliance requirements kick in in some key jurisdictions. Representatives from the UK, Australia, Ireland and Singapore shared their expectations of what companies need to do to keep children safe, ranging from risk assessment and mitigation to age assurance and age-appropriate design. This is where children's rights to protection, provision and participation come into play, although one could get the impression that not all governments are following an approach as balanced as the one the European Commission has chosen with the Digital Services Act Art. 28 Guidelines for a high level of privacy, safety and security of minors online.

Real teens' needs were the focus of the next session, which dealt with designing spaces for protection rather than exclusion. Kristelle Collins, Global Teen Policy Manager at Discord, cautioned against unintended consequences of safety measures and features. Good intent does not always lead to the best results, she said, explaining how Apple's Screen Time function was used by teens to compete over who could accumulate the most online time during school lessons, instead of helping them to self-regulate their online behaviour. Melissa Stroebel from the research and strategic impact department at Thorn referred to children's developmental stages and their evolving capacities. Age assurance, she said, would help to adapt services to children's needs and give them spaces to grow in the digital environment.

In the session ‘Boys, Gaming and Online Safety: Rethinking Risk and Protection Strategies’, the report ‘Nerfing gender roles and rigid masculinity’ by ECPAT International was presented. Together with Francesco Cecon from ECPAT, Maria Oliveira Tamellini from GamerSafer and Patricia Noel from Discord spoke about manifestations of masculinity in online gaming communities that can affect the development of boys and young men, particularly in their behaviour towards girls and young women and other people who are not perceived as male.

Finally, combating the rapidly increasing amount of child sexual abuse material – both real and AI-generated photos and videos – was discussed, drawing on the most recent reports from the Internet Watch Foundation and the National Center for Missing and Exploited Children.


Jutta Croll & Torsten Krause, Stiftung Digitale Chancen