The European Union has opened an investigation into Facebook and Instagram over their potentially addictive effects on children, following two similar probes into TikTok launched earlier. The investigation will focus on the addictive design of the Meta-owned platforms, as well as the influence of content related to depression and unrealistic body images on young users. Investigators will also examine whether the platforms effectively prevent children under 13 from using their services.
Thierry Breton, the EU’s internal markets commissioner leading the investigations, said he was not convinced that Meta is complying with its obligations under the Digital Services Act to safeguard the physical and mental health of young Europeans on Facebook and Instagram. In response, Meta spokesperson Kirstin MacLeod said the company has built more than 50 tools and policies to protect young users and is committed to providing them with safe online experiences.
The investigations under the Digital Services Act are distinct for Meta and TikTok; a commission spokesperson noted that any similarities between the cases stem from how the platforms themselves operate. The debate over social media’s impact on children has intensified, with growing concern about potential harm to young users’ mental health. The Digital Services Act, which took effect in August last year, aims to defend Europeans’ online human rights and has already prompted investigations into several platforms, including AliExpress, Facebook, Instagram, TikTok, TikTok Lite, and X.
After the commission opened an investigation into a points-for-views reward system on TikTok Lite, TikTok suspended the incentive over concerns about its impact on children. Breton emphasized that children must not be used as test subjects for social media experiments.