EU names 19 large tech platforms that must follow Europe’s new Internet rules

[Image: Google's booth at the Integrated Systems Europe conference on January 31, 2023, in Barcelona, Spain. Credit: Getty Images | Cesc Maymo]

The European Commission will require 19 large online platforms and search engines to comply with new online content regulations starting on August 25, European officials said. The EC specified which companies must comply with the rules for the first time, announcing today that it “adopted the first designation decisions under the Digital Services Act.”

Five of the 19 platforms are run by Google, specifically YouTube, Google Search, the Google Play app and digital media store, Google Maps, and Google Shopping. Meta-owned Facebook and Instagram are on the list, as are Amazon’s online store, Apple’s App Store, Microsoft’s Bing search engine, TikTok, Twitter, and Wikipedia.

These platforms were designated because they each reported having over 45 million active users in the EU as of February 17. The other listed platforms are Alibaba AliExpress, Booking.com, LinkedIn, Pinterest, Snapchat, and German online retailer Zalando.

Companies have four months to comply with the full set of new obligations and could face fines of up to 6 percent of a provider’s annual revenue. One new rule is a ban on advertisements that target users based on sensitive data such as ethnic origin, political opinions, or sexual orientation.

There are new content moderation requirements, transparency rules, and protections for minors. For example, “targeted advertising based on profiling towards children is no longer permitted,” the EC said.

Companies will have to provide their first annual risk assessment on August 25, and their risk mitigation plans will be subject to independent audits and oversight by the European Commission. “Platforms will have to identify, analyse and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom,” the EC said. “Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated.”

New rules

New requirements for the 19 platforms, according to today's EC announcement, include the following:

  • Users will get clear information on why they are recommended certain information and will have the right to opt out of recommendation systems based on profiling;
  • Users will be able to report illegal content easily, and platforms will have to process such reports diligently;
  • Platforms will need to label all ads and inform users about who is promoting them;
  • Platforms will need to provide an easily understandable, plain-language summary of their terms and conditions in the languages of the Member States where they operate.

Another content-moderation rule requires platforms to “analyse their specific risks, and put in place mitigation measures—for instance, to address the spread of disinformation and inauthentic use of their service,” the EC said.

In addition to banning targeted advertising based on profiling children, the EC said that “platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors.”

“Special risk assessments including for negative effects on [children’s] mental health will have to be provided to the Commission four months after designation and made public at the latest a year later,” the announcement said. “Platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.”

Several requirements are designed to let outside auditors and researchers verify a company’s compliance. Compliance with Digital Services Act obligations must be “externally and independently audited;” companies must provide researchers with access to data; companies must “publish repositories of all the ads served on their interface;” and companies must “publish transparency reports on content moderation decisions and risk management,” the EC said.

“Thanks to the Digital Services Act, European citizens and businesses will benefit from a safer Internet,” European Commissioner for Internal Market Thierry Breton said in a video posted on Twitter. “As of the 25th of August, online platforms and search engines with more than 45 million active users in the EU will have stronger obligations, because with great scale comes great responsibility.”

The Digital Services Act is complemented by the EU’s Digital Markets Act, which imposes requirements on online “gatekeepers” to prevent them from stifling competition.

https://arstechnica.com/?p=1934219