Children's rights took center stage in the Plenary Hall of the Internet Governance Forum today. Under the title “Securing Child Safety in the Age of Algorithms”, high-ranking representatives from politics, civil society and business discussed the topic.

Opening the session, Leanda Barrington-Leach of the 5Rights Foundation argued that adults must stand up for children and their rights. With this in mind, she advocated regulating companies and service providers rather than restricting children and young people in exercising their rights. “The digital world is 100% man-made, it can easily be misused for the worse, but it can also be changed for the better,” Barrington-Leach said with conviction. She received support for this view from Karianne Tung, Digital Minister of Norway: “Protecting children online requires empowerment, not restriction.” In their view, age verification is a suitable means of giving minors access to online content designed for children.

In response, Christine Grahn and Emily Yu stated that TikTok and Roblox already take protection and safety into account when designing entertaining and enjoyable experiences. Other participants, however, called for more action to protect young people from addictive designs and harmful content.

Minister Salima Bah of Sierra Leone voiced her concern that artificial intelligence could reduce the visibility of cultural diversity, and emphasised the importance of digitally showcasing the experiences and perspectives of children from minority communities. Thomas Davon from UNICEF added that, alongside this potential loss of diversity, the perception of truth is changing.

Speaking on behalf of the European Commission, Thibaut Kleiner referred to current strategies for ensuring the safety of young people using services and platforms. The Digital Services Act requires providers to conduct risk analyses and implement targeted safety and precautionary measures.
The guidelines on the implementation of Art. 28 DSA, which are still in development, aim to provide guidance on this. They are expected to be available, together with an age verification tool, in late summer 2025.
The afternoon session focused on the importance of young people's perspectives in designing artificial intelligence (AI) services, and on how users of the digital environment can be empowered. According to a study by the Family Online Safety Institute (FOSI), young people and their parents use AI for different purposes but value its potential equally. Children and parents share the concern that AI could lead to job losses, spread misinformation more widely, and outperform human abilities to the point of devaluing them. Parents see it as their responsibility to ensure their children can use AI safely; children, however, often mistakenly assume their parents know more about it than they do. To better understand how AI applications work, children and parents want more transparency from manufacturers and providers. Against this backdrop, manufacturers and providers are called upon to support users' ability to engage with artificial intelligence and to invest in developing their skills.
During the third cluster session of the Dynamic Coalitions, it became clear that developing expertise comes with various challenges. The Dynamic Coalitions for Children's Rights in the Digital Environment, for Accessibility and Disability, for Internet Regulation, and for Internet Standards, Safety and Security consulted together on this topic. It was emphasised that being able to use the digital environment purposefully is a prerequisite for taking advantage of opportunities and realising one's human rights. In this context, Jutta Croll, speaking for DC CRIDE, referred to a call from the morning session that no attempt should be made to adapt people to services that are inadequately designed for their needs; rather, services should adapt to people, meeting their needs through design and functionality appropriate to their age and abilities. Participants agreed that both agreed quality standards and better funding of structures are needed to educate users and improve their understanding of new technologies. Joint efforts should therefore be made to hold governments accountable in this regard.