Brussels – The axe of the Digital Services Act (DSA) is coming down on Meta once again, this time over issues that recently ensnared TikTok as well. The European Commission today (May 16) opened a new formal proceeding under the DSA to assess whether the provider of Facebook and Instagram has violated the rules, in full force since February 17, on the protection of minors on the two online platforms. “We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate,” said Margrethe Vestager, the European Commission’s Executive Vice-President for a Europe Fit for the Digital Age.
Just two weeks after the opening of formal proceedings over deceptive advertising, disinformation, the handling of reports of illegal content, and the failure to provide an “effective tool” for monitoring civic discourse in the run-up to the European elections, Meta finds itself once again under EU investigation. The proceedings initiated today focus on three areas of online child protection on which requests for information had already been sent to the Facebook and Instagram provider, so far without satisfactory answers, the EU Commission said. First and foremost is compliance with DSA obligations on mitigation measures to prevent minors from accessing inappropriate content: “The age verification tools used by Meta may not be reasonable, proportionate, and effective.”
Also under the Commission’s lens is compliance with the obligations to assess and mitigate risks arising from the design of Facebook’s and Instagram’s online interfaces, “which can exploit the weaknesses and inexperience of minors,” causing addictive behaviour and stimulating so-called ‘rabbit hole’ effects, i.e., the continuous, compulsive viewing of videos and content. As in the TikTok investigation already underway, “such assessment is necessary to counter potential risks to the exercise of the basic right to physical and mental well-being of minors.” Finally, the third area concerns the application of “appropriate and proportionate” measures to ensure a high level of privacy, safety, and security for minors, in particular regarding the default privacy settings for minors “as part of the design and functioning of their recommender systems,” the EU executive points out.
The Commission will now continue to gather evidence on these three areas of potential violation of the Digital Services Act, through further requests for information, interviews, and inspections. In parallel, it may adopt interim measures and non-compliance decisions. “The initiation of formal proceedings does not prejudge their outcome,” the Commission points out, noting that the DSA sets no legal deadline for concluding formal proceedings. The length of the investigation will depend on factors such as the complexity of the case and the extent to which Meta cooperates and offers commitments to remedy the issues under investigation. Meta has been on the list of 19 very large digital platforms required to comply with the obligations of the Digital Services Act since August 25 of last year.
English version by the Translation Service of Withub