Safer Internet Day is celebrated annually on the second day of the second week of the second month of the year. Promoted with the support of the European Commission, the event aims to raise awareness of Internet safety and to promote the safe and positive use of digital technologies, especially among children and adolescents.
This year's edition will be held on 10 February under the slogan ‘Digital safety to educate in times of artificial intelligence’. It focuses on promoting the safe and positive use of digital technologies among minors and those around them, fostering their digital skills and helping them to act respectfully and to develop critical thinking and creativity, in line with European digital principles.
This year, to mark Safer Internet Day, the European Consumer Centre in Spain wishes to highlight that social media is often the main cause of cyberbullying and, more generally, of mental health problems among young people, which can lead to symptoms of depression or suicide risk; hence the importance of providing adequate protection for minors. According to UNICEF's report Childhood, Adolescence and Digital Wellbeing, published last November, 5.7% of children and adolescents make problematic use of social media that interferes with their daily lives and is associated with symptoms of anxiety, poorer quality of life and greater exposure to bullying. For this reason, countries such as Australia, France, Portugal and, more recently, Spain are already working to ban its use by children and adolescents.
Similarly, the European Centre points out that online safety issues do not originate with young people but with structural factors: the design and navigation of online platforms built on algorithms that maximise user attention; the lack of secure and effective age-verification systems to keep minors away from inappropriate content; and the ease with which harmful content can be disseminated. ECC-Spain therefore stresses that work must continue to transform the online environment into a safer, more inclusive and stimulating space.
Likewise, the necessary measures should be taken to develop ethical and responsible technology, so that young people can also benefit from the positive aspects of the new digital environment, such as building social relationships and identity, creativity, and access to information. The challenge therefore also involves creating digital environments appropriate for each age group and promoting new spaces for education and guidance.
Recommendations for creating a safe Internet:
• Literacy. Digital literacy should build on real-life practices that promote responsible use of technology and develop critical thinking and digital autonomy. Children should be introduced to the digital environment gradually, participating in the digital world in an accompanied and supervised manner, with online environments adapted to their age and with responsibilities taken on step by step. This literacy would empower young people to explore the digital world with insight and knowledge. To this end, interventions must be gender-sensitive, age-appropriate and universal, providing equal support to highly vulnerable children and those at risk.
• Regulation and supervision of platforms. Platforms must guarantee safe digital spaces for children and young people, and must also uphold fundamental rights, freedom of expression, and the right to receive transparent, honest and truthful information.
• Support for parents and educators. Both parents and educators should receive clear guidelines, practical tools and appropriate resources to help children learn to navigate the digital environment in a responsible, ethical and healthy manner, as well as to encourage intergenerational dialogue. This would ensure that adults are well prepared to guide and support younger people.
• Secure and accessible information channels. Ensure channels are in place so that users—especially minors and vulnerable individuals—can report online abuse or inappropriate content. It is also important to foster a culture of trust within schools, families, and parent networks so that minors are not afraid to report incidents.
Recommendations for safe Internet browsing
• Update your equipment and software.
• Check and update plugins and extensions.
• Manage browser security and privacy.
• Do not save passwords in your browser, even when it offers to remember them.
• Delete session cookies.
• Encrypt your communications: use protocols that protect data in transit, such as HTTPS, whenever possible.
• Verify that the sites you visit are trustworthy. Type the URL yourself (with https://) and open the certificate information behind the padlock icon in the address bar. Check that the certificate is valid and was issued for the site you intend to visit by a trusted certificate authority (a minimal sketch of this check follows the list).
• Enable two-factor authentication to strengthen the protection of online accounts.
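For readers curious about what the certificate check above involves under the hood, below is a minimal Python sketch using only the standard library. The hostname is a placeholder; the ssl module's default settings perform the same validation a browser does behind the padlock icon (trusted issuer, validity period, matching hostname) and raise an error if any check fails.

```python
import socket
import ssl

# Hypothetical example host: replace with the site you want to check.
HOSTNAME = "example.com"

# The default context validates the certificate chain against the
# system's trusted certificate authorities and checks the hostname.
context = ssl.create_default_context()

with socket.create_connection((HOSTNAME, 443), timeout=5) as sock:
    # wrap_socket() performs the TLS handshake; it raises
    # ssl.SSLCertVerificationError if the certificate is invalid,
    # expired, or issued for a different site.
    with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
        cert = tls.getpeercert()
        print("TLS version:", tls.version())
        print("Issued to:  ", dict(pair[0] for pair in cert["subject"]))
        print("Issued by:  ", dict(pair[0] for pair in cert["issuer"]))
        print("Valid until:", cert["notAfter"])
```

For everyday browsing, checking the padlock dialogue described above is sufficient; the script simply makes the same checks explicit.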
How does the Digital Services Act protect minors online?
Its aim is to create a safe and reliable online environment by ensuring that very large online platforms, such as Instagram, Snapchat, TikTok and YouTube, protect users' rights and prevent the spread of prohibited or inappropriate content. It also requires social media platforms, for example, to be more transparent about their recommendation algorithms, to quickly remove harmful content and to have uniform complaint procedures across Europe. In this way, social media platforms must ensure that minors are not exposed to inappropriate content, that their data and privacy are protected, that they do not receive personalised advertising, and that information is clear and not misleading. Another aim of this law is to ensure that online content is appropriate for the age and interests of children. Online platforms are therefore obliged to identify risks at an early stage and take appropriate measures.
Platforms must offer easily accessible reporting functions, for example via a clearly visible report button. Platforms are obliged to respond promptly and provide feedback to users. If a platform does not respond quickly enough or if the content is particularly dangerous, so-called trusted flaggers can offer support.
In Spain, the CNMC acts as the Digital Services Coordinator, supervising service providers to ensure they comply with their obligations and protect the rights of users, especially minors. Under no circumstances is the CNMC competent to moderate or remove content, to act as an appeal body for decisions made by online service providers, or to settle disputes between different parties or different users. For filing complaints, the Digital Services Act provides two avenues:
1. Internal Complaints Mechanism of the Provider
2. Out-of-court Settlement
See the list of dispute resolution bodies certified by Digital Services Coordinators.