Explanation: the basic rules of the European Union for the Web


The European Parliament and member states of the European Union (EU) announced on Saturday that they had reached a political agreement on the Digital Services Act (DSA), landmark legislation aimed at obliging major Internet companies to act against disinformation and illegal and harmful content, and to “ensure better protection of Internet users and their fundamental rights”.

The law, which has not yet entered into force, was proposed by the European Commission in December 2020. According to the Commission’s definition, the DSA is “a set of common rules on the obligations and liability of market intermediaries”, and it provides better protection for all users in the EU, regardless of their country.

The proposed law will work in conjunction with the EU’s Digital Markets Act (DMA), which was approved last month.

The DSA should be adopted by the European Parliament in the coming months. Once adopted, it will apply either fifteen months after its entry into force or from January 1, 2024, whichever is later.

What is the DSA and who will it apply to?

The DSA will tightly regulate how intermediaries, especially large platforms such as Google, Facebook and YouTube, operate when it comes to moderating user content. Instead of letting platforms decide how to deal with abusive or illegal content, the DSA will establish specific rules and obligations that these companies must follow.

According to the EU, the DSA will apply to a “broad category of online services, ranging from simple websites to internet infrastructure services and online platforms“. The obligations of each of them will differ according to their size and role.

The legislation brings into its scope platforms that provide internet access, domain name registrars, and hosting services such as cloud computing and web hosting. Most significantly, very large online platforms (VLOPs) and very large online search engines (VLOSEs) will face “stricter requirements”.

Any service with more than 45 million monthly active users in the EU will fall into this category. Those with fewer than 45 million monthly active users in the EU will be exempt from certain of the new obligations.

Once the DSA becomes law, each EU member state will have the lead role in enforcing it, along with a new ‘European Digital Services Council’. The European Commission will carry out “enhanced supervision and enforcement” for VLOPs and VLOSEs. Penalties for violating these rules could be huge – up to 6% of the company’s worldwide annual revenue.

What do the new rules say?

A wide range of proposals aims to ensure that the negative social impact of many practices followed by the internet giants is minimised or eliminated.

# Online platforms and intermediaries such as Facebook, Google, YouTube, etc. will have to add “new procedures for faster removal” of content deemed illegal or harmful. This may vary according to the laws of each EU member state.

In addition, these platforms will have to clearly explain their content removal policies, and users will be able to challenge such removals. Platforms will need a clear mechanism to help users report illegal content, and they will have to cooperate with “trusted flaggers”.

# Marketplaces like Amazon will have to “impose a duty of care” on sellers who use their platform to sell products online. They will have to “collect and display information on the products and services sold in order to ensure good consumer information”.

# The DSA adds “an obligation for very large digital platforms and services to analyze the systemic risks they create and to carry out a risk reduction analysis”. This audit for platforms like Google and Facebook will have to take place every year.

Companies will have to address the risks of “dissemination of illegal content”, “negative effects on fundamental rights”, “manipulation of services having an impact on democratic processes and public security”, “negative effects on gender-based violence, and on minors and serious consequences for the physical or mental health of users”.

# The law proposes to allow approved independent researchers to have access to public data from these platforms to conduct studies to better understand these risks.

# The DSA proposes to ban “Dark Patterns” or “deceptive interfaces” designed to trick users into doing something they wouldn’t otherwise agree to.

This includes forced pop-up pages, giving greater visual prominence to a particular choice, and similar tactics. The proposed law also requires that users be offered the option of a system that does not “recommend content based on their profiling”.

# The DSA incorporates a new crisis mechanism clause – a reference to the Russia-Ukraine conflict – which will be “activated by the Commission on the recommendation of the council of national coordinators of digital services”. These special measures will, however, only be in place for three months.

This clause will make it possible “to analyze the impact of the activities of these platforms” on the crisis, and the Commission will decide on the appropriate measures to be taken so that the fundamental rights of users are not violated.

# The law offers enhanced protection for minors and aims to prohibit targeted advertising to them based on their personal data.

# It also proposes “transparency measures for online platforms on various issues, including the algorithms used to recommend content or products to users”.

# Finally, it says that canceling a subscription should be as easy as subscribing.

Does this mean that social media platforms will now be liable for any illegal content?

It has been clarified that platforms and other intermediaries will not be held responsible for the illegal behaviour of users, so they retain a “safe harbour” in some sense.

However, if the platforms are “aware of illegal acts and do not remove them”, they will be responsible for this user behavior. Smaller platforms, which remove any illegal content they detect, will not be liable.

India’s IT rules, announced last year, make a social media intermediary and its executives liable if the company fails to exercise due diligence. Rule 4(a) states that significant social media intermediaries – such as Facebook or Google – must appoint a chief compliance officer (CCO), who could be booked if a tweet or post that violates local laws is not deleted within the stipulated time limit.

India’s rules also introduce the requirement to publish a monthly compliance report. They include a clause on the need to trace the originator of a message – a provision that was challenged by WhatsApp in the Delhi High Court.