EU investigating Meta over addiction and safety concerns for minors


Meta is back in hot water for its methods (or lack thereof) for protecting children. The European Commission has launched formal proceedings to determine whether the owner of Facebook and Instagram has violated the Digital Services Act (DSA) by contributing to children’s social media addiction and not ensuring they have high levels of safety and privacy.

The Commission’s investigation will specifically examine whether Meta is properly assessing and acting against risks posed by its platforms’ interfaces. It is concerned that these designs could “exploit the weaknesses and inexperience of minors and cause addictive behavior, and/or reinforce so-called ‘rabbit hole’ effect. Such an assessment is required to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights.”

The proceedings will also explore whether Meta takes the necessary steps to prevent minors from accessing inappropriate content, whether its age-verification tools are effective, and whether minors are given straightforward, strong privacy protections, such as appropriate default settings.

The DSA sets standards for very large online platforms and search engines (those with 45 million or more monthly users in the EU) like Meta. Obligations for designated companies include transparency about advertising and content moderation decisions, sharing their data with the Commission and looking into risks their systems pose related to areas such as gender-based violence, mental health and protection of minors.

Meta responded to the formal proceedings by pointing to features such as parental supervision settings, quiet mode and automatic content restrictions for teens. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” a Meta spokesperson told Tech Reader.

However, Meta has repeatedly failed to prioritize the safety of young people. Previous alarming incidents include Instagram’s algorithm recommending content that features child sexual exploitation, as well as claims that Meta designs its platforms to be addictive to young people while surfacing psychologically harmful content, such as the promotion of eating disorders and body dysmorphia.

Meta has also famously served as a hub of misinformation for people of all ages. The Commission already launched formal proceedings against the company on April 30 due to concerns around deceptive advertising, data access for researchers and the lack of an “effective third-party real-time civic discourse and election-monitoring tool” before June’s European Parliament elections, among other concerns. Earlier this year, Meta announced that CrowdTangle, which has publicly shown how fake news and conspiracy theories move around Facebook and Instagram, would be completely shut down in August.

