EU demands information on content algorithms from YouTube, Snapchat, and TikTok

They have barely six weeks to comply

TechSpot


In brief: The Digital Services Act has been in effect for roughly a year and a half, and from the beginning, the European Commission has not been afraid to wield its powers. Fines for violating the act are steep enough to give even tech giants like Meta and Google pause, and its mandate is broad. While the DSA covers a wide range of activities, the EC has recently focused on recommendation algorithms and their impact on society.

The European Commission has requested that YouTube, Snapchat, and TikTok provide detailed information about their content recommendation algorithms. This request, made under the Digital Services Act (DSA), aims to understand how these platforms recommend content to users and address potential risks associated with their systems.

In general, the EU has expressed concern about recommendation algorithms and the far-reaching influence these systems can have on various aspects of society, from individual mental health to broader democratic processes. The Commission is particularly interested in understanding how the algorithms recommend content to users. It seeks to explore their role in amplifying systemic risks and the measures taken by platforms to mitigate the spread of illegal content, such as hate speech and the promotion of illegal drugs.

For YouTube and Snapchat, the inquiry focuses on how their algorithms might affect electoral processes, civic discourse, users' mental well-being – including addictive behavior – and the protection of minors.

TikTok, in addition to these concerns, is required to provide details on its measures to prevent service manipulation by malicious actors and steps taken to reduce risks related to elections and media pluralism.


The platforms must respond by November 15, 2024. Failure to comply or to provide correct information could result in fines or formal proceedings under the DSA.

This request is part of the EU's ongoing efforts to regulate big tech companies and ensure user safety online, with the Commission having previously launched similar inquiries into other platforms, including Facebook and Instagram.

The DSA entered into force in November 2022 and became fully applicable in February 2024, drastically changing how digital platforms in the region regulate their services. Under the DSA, all online platforms are now obligated to remove content that is illegal in any EU member state, suspend accounts that disseminate such illegal content, and report criminal offenses. Very large online platforms must produce an annual risk assessment, conduct an independent audit, implement risk mitigation measures, and appoint a compliance officer responsible for illegal content obligations. The maximum fine for a breach is 6% of global annual turnover in the preceding financial year – higher than the GDPR's 4% ceiling.

Since the DSA took effect, the EU has been aggressively enforcing it, as seen in these recent requests and the tight timelines the companies were given to respond. According to Euronews, one senior EU official said the inquiry should act as a "wake-up call" for platforms to change their behavior, such as by letting users hide certain types of videos.