On 15 December 2020, the European Commission presented the long-awaited Digital Services Act (DSA), a proposal for a European regulation that will change the rules governing the liability of online platforms. Once finalised, the regulation will replace the eCommerce Directive 2000/31/EC, which over the last twenty years has allowed the platforms we know today to grow and prosper without having to worry too much about the content uploaded by their users. Article 14 of that directive limits their liability for user-uploaded content: only once they become aware of illegal content are they obliged to remove it. Article 15, in turn, exempts them from any general obligation to monitor what their users upload at every moment.
This system has worked well for a long time, but it was designed in years when most of the online platforms we use every day did not exist or were in their infancy. As Commissioner Thierry Breton has repeatedly reminded us, with great power comes great responsibility. In recent years, the Commission has asked big tech to adhere to a code of conduct and to publish regular reports on how they moderate illegal content and hate speech. Although much has been done, some countries, such as Germany, have begun to adopt national laws requiring platforms to remove content within hours or face heavy fines. The Commission therefore realised that while self-regulation alone would not be enough to solve the problems that have emerged in recent years, letting each State write its own law would further fragment the already precarious unity of the European digital market. Hence the choice of a European regulation instead of a directive: a regulation is directly applicable, with rules that are immediately the same for all 27 Member States. The ultimate goal is to establish a new reference standard, shared globally, that prescribes what platforms must do to avoid sanctions and guarantee a safe space for users.
What is new
Before moving on to what is new, it is worth delimiting what the proposal does not regulate. There is no mention of taxation of big tech, no new definition of what is illegal, and no changes to sectoral legislation such as the recent Copyright Directive, the consumer protection rules or the announced regulation on terrorist content online. The DSA will complement these rules by outlining a new framework for determining when platforms are exempt from liability. To qualify for this exemption, they will have to demonstrate due diligence and cooperate with the new authority in charge, the Digital Services Coordinator.
The new rules will apply to online intermediary services such as social networks, booking platforms and cloud providers, provided they target users in one or more EU Member States or have a significant number of users there. Even platforms not established in Europe must appoint a legal representative who can interact quickly with Member State authorities, the Digital Services Coordinator and the Commission.
While many of the obligations do not apply to small and micro enterprises, additional ones apply to very large platforms, i.e. those with an average of more than 45 million monthly active users in Europe.
More transparency for users
Users will have more insight into what happens on the platforms. In the interests of transparency, platforms will have to use simple and clear language in their terms and conditions, and at least once a year publish content-moderation reports showing the number of requests received from authorities and the time taken to respond, the number of times they acted on the basis of their own terms and conditions, and the number of times users challenged their decisions.
When a piece of content is removed, platforms will have to inform users clearly and specifically. Today, when content is taken down, they rarely go beyond a generic notice that it violated the site’s rules, without any real explanation. Under the new rules, if the content is considered illegal they must cite the legislation it infringes, and in every case they must explain how the removal can be contested. Complaints are to be handled without delay and free of charge through an internal complaint-handling system. If users are not satisfied with the outcome, they may turn to an out-of-court dispute settlement body certified as experienced and impartial by the Digital Services Coordinator.
Platforms will also have to clearly indicate which content is sponsored, who is paying for it, and why a user is shown that specific content rather than another.
Identification of online sellers
A platform hosting an online shop will have to identify its sellers, collecting their name, address, email address and telephone number, proof of identity, a bank account for customer payments, and a registration number such as a company register or VAT number. If the data is incorrect and the seller fails to correct it, the seller can be suspended from the platform.
This requirement will allow users buying online to identify the seller, reducing their exposure to fraud.
Risk assessment for large platforms
At least once a year, major platforms will have to carry out a systemic risk assessment, examining how their services are used to disseminate illegal content or manipulate public opinion, and the impact such abuse has on fundamental rights such as privacy and freedom of expression. Based on the outcome of the assessment, they will have to take the necessary corrective measures. To ensure the assessment is effective, they will also have to undergo, at least once a year, an external audit by an entity certified by the Authority.
New authorities and sanctions along the lines of the GDPR
Each Member State will have to designate a Digital Services Coordinator, an authority responsible for monitoring compliance with the regulation; each national authority will send a representative to Brussels to form a European board. The Coordinator will have inspection powers and will be able to impose sanctions that are effective, proportionate and dissuasive, of up to 6% of a platform’s total annual turnover. The amount will depend on the number of users and countries involved and on the promptness and completeness of the platform’s responses to the Authority’s questions. The Commission can also intervene, either when a local authority fails to act or at the request of one or more authorities. Periodic penalty payments may also be imposed to ensure that infringements cease as soon as possible.