EU Launches Formal Snapchat Probe Over Child Safety Concerns Under Digital Services Act
The European Commission has opened a formal investigation into Snapchat, raising serious concerns about whether the popular social media platform is doing enough to protect children from online risks. The move comes under the European Union’s Digital Services Act (DSA), which sets strict rules for online platforms to ensure user safety, particularly for minors.
Regulators suspect that Snapchat may be exposing children to dangers such as online grooming, criminal recruitment, and access to illegal or age-restricted content. The probe marks a significant step in the EU’s ongoing efforts to hold major tech companies accountable for how they safeguard younger users in the digital space.
According to the Commission, there are concerns that adults may be impersonating younger users on the platform in order to contact minors. These interactions could potentially lead to sexual exploitation or involve children in illegal activities. Authorities believe Snapchat’s current systems may not be strong enough to prevent such behavior, leaving vulnerable users exposed.
The investigation will closely examine multiple aspects of Snapchat’s operations, particularly its age verification systems. The Commission believes the platform’s existing “age assurance” tools may be inadequate, allowing underage users to bypass restrictions and access the app without proper checks. This could also enable adults to pose as minors, further increasing risks for young users who believe they are interacting with peers.
Another major focus of the probe is Snapchat’s content moderation practices. The Commission is concerned that the platform may not be effectively curbing the promotion and sale of illegal or restricted goods such as drugs, alcohol, and vaping products. Reports suggest that such content may still be accessible to minors, highlighting potential gaps in enforcement that could have serious health and safety implications.
In addition, officials are examining whether Snapchat’s default account settings provide adequate protection for children. There are concerns that these settings may not prioritize privacy and safety, potentially leaving minors more vulnerable to harmful interactions. The investigation will also assess whether users can easily report harmful or illegal content, with regulators suspecting that the reporting mechanisms may not be user-friendly or effective in addressing complaints promptly.
The European Commission has also raised questions about the use of so-called “dark patterns” in the app’s design. These are techniques that may subtly influence user behavior, potentially encouraging choices that users—especially younger ones—might not otherwise make. Such practices are under increasing scrutiny as regulators seek to ensure that platforms operate transparently and ethically without manipulating user decisions.
This investigation builds on earlier scrutiny of major tech companies under the DSA. The legislation, which came into force to tackle illegal content and improve online safety, gives the EU significant powers to investigate and penalize companies that fail to comply. If Snapchat is found to have violated these rules, it could face substantial fines—potentially up to 6% of global annual revenue—or be required to make major changes to its platform.
The Commission’s action also incorporates an earlier probe by Dutch authorities into the alleged sale of vaping products to minors via Snapchat. By combining investigations, regulators aim to create a more comprehensive picture of the platform’s potential shortcomings in protecting young users. This coordinated approach allows the Commission to draw on existing evidence while expanding the scope of the inquiry.
While the investigation is ongoing, Snapchat’s parent company, Snap Inc., is expected to cooperate with EU authorities. The process will involve a detailed review of the platform’s policies, risk assessments, and technical systems. The company may also be given an opportunity to address the concerns and propose improvements before any final decisions are made.
The case highlights growing global pressure on social media companies to take stronger responsibility for user safety, especially for children. As online platforms continue to play a central role in young people’s lives, regulators are increasingly focused on ensuring that these digital environments are safe, secure, and free from exploitation. Similar investigations into other platforms have already resulted in significant operational changes.
The outcome of the investigation could have far-reaching implications, not only for Snapchat but for the broader tech industry, as governments push for stricter enforcement of online safety standards. The probe sends a clear signal that the EU is willing to use its regulatory powers to compel platforms to prioritize child protection over growth metrics and engagement. For Snap Inc., the investigation represents one of its most significant regulatory challenges to date.