EU Targets Big Tech with Stricter Rules under the Digital Services Act
In this article, we explore the European Commission’s recent naming of 19 technology companies that will be subject to stringent new regulations under the Digital Services Act (DSA). We discuss the reasons behind the designation and the key requirements the new rules impose.
- 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) have been named.
- Companies must comply with new regulations within four months.
- New rules focus on user empowerment, protection of minors, diligent content moderation, and transparency and accountability.
- The DSA will be enforced by a pan-European supervisory architecture.
The List of VLOPs and VLOSEs
The European Commission’s list of VLOPs comprises some of the most prominent tech companies in the world. These include:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Google Play
- Google Maps
- Google Shopping
Additionally, Bing and Google Search have been classified as VLOSEs.
These companies were selected because of their extensive reach: each has more than 45 million monthly active users in the European Union, the DSA threshold corresponding to roughly 10% of the EU population.
New Regulations under the Digital Services Act
Under the DSA, the named companies must comply with a set of new regulations within four months. The rules focus on four key areas:
- User Empowerment
  - Users will receive clear information about why content is recommended to them and can opt out of recommendation systems based on profiling.
  - Users can easily report illegal content, and platforms must process these reports diligently.
  - Advertisements must not be targeted based on sensitive user data, such as ethnicity, political opinions, or sexual orientation.
  - All advertisements must be labeled, and users must be told who is promoting them.
  - Platforms must provide easy-to-understand, plain-language summaries of their terms and conditions in the languages of the member states where they operate.
- Protection of Minors
  - Platforms must redesign their systems to ensure a high level of privacy, security, and safety for minors.
  - Targeted advertising based on profiling of minors is no longer permitted.
  - Companies must submit special risk assessments, including assessments of potential negative effects on mental health, to the European Commission within four months of designation and make them public no more than one year after that.
  - Platforms must redesign their services, interfaces, recommender systems, and terms and conditions to mitigate these risks.
- Diligent Content Moderation and Less Disinformation
  - Platforms and search engines must address the risks posed by the dissemination of illegal content online and by negative effects on freedom of expression and information.
  - Platforms must have clear terms and conditions and enforce them diligently and non-arbitrarily.
  - Platforms must provide a mechanism for users to flag illegal content and must act on notifications expeditiously.
  - Platforms must analyze their specific risks and put mitigation measures in place, for example against the spread of disinformation and inauthentic use of their services.
- Transparency and Accountability
  - Platforms must have their risk assessments and their compliance with DSA obligations audited externally and independently.
  - Platforms must give researchers access to publicly available data and, at a later stage, establish a special mechanism for vetted researchers.
  - Platforms must publish repositories of all advertisements served on their interfaces.
  - Platforms must publish transparency reports on content moderation decisions and risk management.
Enforcement of the DSA
The enforcement of the DSA will be carried out through a “pan-European supervisory architecture.”
Under this structure, the European Commission will supervise the designated platforms and search engines together with national Digital Services Coordinators (DSCs).
The DSCs will also be responsible for supervising smaller platforms and search engines.
The strict rules imposed by the Digital Services Act will require significant effort and adaptation from the targeted platforms and search engines.
These companies will need to make substantial changes in their systems, policies, and operations to comply with the new regulations and ensure they meet the necessary standards.
The announcement did not spell out penalties.
However, the DSA itself provides for fines of up to 6% of a company’s global annual turnover for non-compliance.
This potential financial impact further underscores the importance of compliance for the affected tech giants.
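To get a sense of the scale involved, here is a back-of-the-envelope calculation of the 6% fine cap. The turnover figure used below is purely hypothetical, chosen only to illustrate the arithmetic:

```python
# Illustration of the DSA's maximum fine: up to 6% of global annual turnover.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(global_turnover_eur: float) -> float:
    """Return the maximum possible DSA fine for a given global annual turnover."""
    return global_turnover_eur * DSA_MAX_FINE_RATE

# A hypothetical platform with EUR 100 billion in global annual turnover
# could face a fine of up to EUR 6 billion.
turnover = 100_000_000_000
print(f"Maximum fine: EUR {max_dsa_fine(turnover):,.0f}")
```

For the largest designated companies, whose annual revenues run into the hundreds of billions of euros, the cap translates into potential fines in the billions.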
The enforcement of the DSA will not only protect users but also promote a safer and more transparent online environment in the European Union.
With a strong emphasis on user empowerment, protection of minors, diligent content moderation, and increased transparency and accountability, the DSA is poised to reshape the way these major technology companies operate within the EU.
As the four-month deadline approaches, it remains crucial for the designated platforms and search engines to take the necessary steps to adhere to the new rules.
The European Commission, along with the national Digital Services Coordinators, will closely monitor these companies’ progress and ensure that they fulfill their obligations under the Digital Services Act.
The implementation of the Digital Services Act signals a significant shift in the European Union’s approach to regulating major technology companies, holding these tech giants more accountable for their actions and for their users’ online experiences.
As the compliance deadline approaches, it will be interesting to see how these firms adapt to the new rules and navigate the challenges that lie ahead.