08.02.2024
Online liability after the Digital Services Act
On February 17, 2024, the Regulation on a Single Market for Digital Services (Digital Services Act) becomes fully applicable, having already applied to very large online platforms since August 25, 2023. It contains a single set of rules with the stated aim of creating a safer digital space, in which the fundamental rights of all users of digital services are protected, and of establishing a level playing field to foster innovation, growth and competitiveness.
I. What is the Digital Services Act?
Most of us access the internet through an internet service provider, use social networks to create or follow content, shop online from online marketplaces, book vacations and hotels online, and search for almost anything using search engines.
The way we interact with these services is often regulated by each individual service provider.
The Digital Services Act (DSA) contains a set of EU-wide rules for the digital services mentioned above, which act as intermediaries between consumers and goods, services and content.
II. Who does the DSA apply to?
The Regulation applies, with progressively greater responsibility, to the following types of digital services:
1. "Mere conduit" services, which consist of the transmission over a communications network of information provided by a recipient of the service, or the provision of access to a communications network (such as wireless access points, VPNs, DNS services and solutions, top-level domain name registries, certification authorities issuing digital certificates, voice over IP (VoIP) services, etc.);
2. "Caching" services, which consist of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of the transmitted information, the sole purpose of the caching being to make the further transmission of the information to other recipients, at their request, more efficient (such as content delivery networks (CDNs), content adaptation proxies, etc.);
3. Hosting" services, which consist of the storage of information provided by a recipient of the service at the request of the recipient (such as online hosting services, cloud hosting, etc.);
4. Online platforms, which are a type of hosting service that stores and disseminates information to the public at the user's request (such as social networks, online marketplaces, online travel and accommodation platforms, etc.);
5. Online search engines, which are tools that allow users to search for information on the internet (services that index web content and return relevant results based on users' queries).
The common thread linking these digital services is that they are intermediary services, facilitating consumer access to goods, services and content.
III. What are the main obligations of intermediary service providers?
As we have already mentioned, the Regulation sets out an ascending scale of obligations in the form of an inverted pyramid, with few to no obligations for "mere conduit" services, "caching" and "hosting" services, as well as for SMEs, more substantial obligations for online platforms and very substantial obligations for very large online platforms (VLOPs) and very large search engines (VLOSEs).
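To make this ascending scale more tangible, the following minimal Python sketch models how a provider's obligation tier could be determined from the service types and size thresholds detailed in the tiers below. All names, and the simplified SME test, are illustrative assumptions for this article, not terms or tests from the Regulation itself.

```python
from dataclasses import dataclass

# Hypothetical model of the DSA's ascending scale of obligations, based
# solely on the tiers summarized in this article; not an official test.
INTERMEDIARY_TYPES = {"mere_conduit", "caching", "hosting",
                      "online_platform", "marketplace", "search_engine"}

@dataclass
class Provider:
    service_type: str            # one of INTERMEDIARY_TYPES
    employees: int
    turnover_eur: float
    monthly_active_users_eu: int

def obligation_tier(p: Provider) -> int:
    """Return the highest obligation tier applicable to the provider."""
    if p.service_type not in INTERMEDIARY_TYPES:
        raise ValueError(f"not an intermediary service: {p.service_type}")
    # Tier 5: VLOPs/VLOSEs, i.e. platforms or search engines with more
    # than 45 million average monthly active users in the EU.
    if (p.service_type in {"online_platform", "marketplace", "search_engine"}
            and p.monthly_active_users_eu > 45_000_000):
        return 5
    # Simplified SME test (tiers 3 and 4 exempt SMEs): fewer than
    # 50 employees and turnover of EUR 10 million or less.
    is_sme = p.employees < 50 and p.turnover_eur <= 10_000_000
    if p.service_type == "marketplace" and not is_sme:
        return 4                 # marketplaces: trader traceability etc.
    if p.service_type in {"online_platform", "marketplace"} and not is_sme:
        return 3                 # platforms: complaints, advertising rules
    if p.service_type in {"hosting", "online_platform", "marketplace"}:
        return 2                 # hosting: notice-and-action, reasons
    return 1                     # baseline duties for all intermediaries
```

For example, a marketplace with 30 employees and EUR 2 million in turnover would fall back to tier 2 under this sketch, since SMEs are exempted from the platform-specific tiers.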
1. The first tier of obligations concerns "mere conduit", "caching" and "hosting" services, which are in principle exempted from liability for the information transmitted, for the automatic, intermediate and temporary storage of the information transmitted, and for the information stored at the request of a recipient of the service.
This exemption from liability is not absolute and is conditional on the intermediary service provider's non-intervention. In addition, in the case of hosting services, the provider must have no knowledge of the illegal activity or illegal content and, upon becoming aware that the content is illegal, act promptly to remove the illegal content or to block access to it.
In any case, intermediary service providers have no general obligation to monitor the information they transmit or store, nor an obligation to actively seek facts or circumstances indicating illegal activity.
However, all intermediary service providers are obliged (i) to designate a single point of contact enabling them to communicate directly, by electronic means, with the authorities of the Member States and the Commission; (ii) to designate a single point of contact enabling the recipients of the service to communicate with them directly and rapidly, by electronic means and in a user-friendly manner; (iii) if they are not established in the European Union, to designate in writing a legal or natural person to act as their legal representative in one of the Member States where they offer their services; (iv) to include, in their general terms and conditions of use, information on any restrictions they impose on the use of their service in relation to the information provided by its recipients; and (v) to report, at least once a year, on the content moderation they have carried out during the relevant period.
Finally, a court or administrative authority may require the service provider to terminate or prevent an infringement.
2. The second tier of obligations concerns hosting services, including online platforms, and adds further obligations to those of the first tier.
These hosting service providers are obliged in particular (i) to put in place notice-and-action mechanisms allowing any natural person or entity to notify them of the presence, within their service, of specific information which that person or entity considers to be illegal content, and to decide on such notifications in a timely, diligent and objective manner, communicating the decision to the person who made the notification; (ii) to provide any affected recipient of the service with a clear and specific statement of reasons whenever they decide to remove or block access to information provided by that recipient, irrespective of the means used to detect, identify, remove or block access to that information, informing the recipient no later than the time of the removal or blocking; and (iii) to notify law enforcement or judicial authorities of suspected criminal offenses involving a threat to the life or safety of one or more persons.
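As a rough illustration of this notice-and-action flow, the minimal Python sketch below strings the three duties together. The `assess` callable stands in for the provider's own review process, and every name here is hypothetical, not an API defined by the DSA.

```python
from dataclasses import dataclass

@dataclass
class Notice:
    notifier_contact: str    # person or entity submitting the notice
    content_id: str          # the information alleged to be illegal
    explanation: str         # why the notifier considers it illegal

@dataclass
class Decision:
    remove: bool
    reasons: str                         # statement of reasons
    life_or_safety_threat: bool = False

def handle_notice(notice: Notice, assess) -> Decision:
    """Illustrative notice-and-action flow for a hosting provider."""
    # Notifications must be handled in a timely, diligent and
    # objective manner; `assess` is the provider-supplied review step.
    decision = assess(notice)
    if decision.remove:
        # A clear and specific statement of reasons goes to the affected
        # recipient no later than the time of removal or blocking.
        print(f"blocking {notice.content_id}: {decision.reasons}")
    if decision.life_or_safety_threat:
        # Suspected offenses threatening life or safety are reported
        # to law enforcement or judicial authorities.
        print(f"alerting authorities about {notice.content_id}")
    # The person who made the notification is informed of the outcome.
    print(f"informing {notice.notifier_contact}: removed={decision.remove}")
    return decision
```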
3. The third tier of obligations relates to online platforms with more than 50 employees and/or a turnover of more than €10 million (SMEs are exempted from these obligations) and adds further obligations to the first two tiers.
These online platforms are mainly obliged (i) to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients who frequently provide manifestly illegal content; (ii) to provide access to an effective internal complaint-handling system in relation to decisions taken by the platform upon receipt of a notification, or on the grounds that information provided by recipients constitutes illegal content or is incompatible with their general terms of use; and (iii) to clearly mark advertising as such and to make public on whose behalf the advertisement is presented and who paid for it. Platforms are also prohibited from targeting individuals with online advertisements based on religion, sexual orientation, health data or political beliefs, and from displaying advertisements to children and teenagers based on personal data collected online.
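By way of illustration only, these advertising restrictions could be encoded as a simple gate in an ad server, as in the Python sketch below; the category labels are my own simplified assumptions, and the Regulation prescribes no such data model.

```python
# Hypothetical ad-targeting gate reflecting the prohibitions above:
# no targeting based on sensitive data, no profiled ads to minors.
SENSITIVE_CATEGORIES = {"religion", "sexual_orientation",
                        "health", "political_beliefs"}

def targeting_allowed(targeting_keys: set[str],
                      user_is_minor: bool) -> bool:
    """Return True only if the proposed targeting is permissible."""
    if targeting_keys & SENSITIVE_CATEGORIES:
        return False   # sensitive-data targeting is prohibited outright
    if user_is_minor and targeting_keys:
        return False   # no personal-data-based ads for minors
    return True

# Example: political targeting is rejected, interest targeting of an
# adult passes, and any data-based targeting of a minor is rejected.
assert not targeting_allowed({"political_beliefs"}, user_is_minor=False)
assert targeting_allowed({"hiking"}, user_is_minor=False)
assert not targeting_allowed({"hiking"}, user_is_minor=True)
```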
4. The fourth tier of obligations concerns online platforms that allow consumers to conclude distance contracts with traders (online marketplaces), again with more than 50 employees and/or a turnover of more than €10 million (SMEs are exempted from these obligations), and adds further obligations to the first three tiers.
These online platforms are obliged (i) to rapidly remove illegal products and services from their marketplaces and to ensure better visibility of the provenance of the products and services sold on the platform, and (ii) to ensure the traceability of traders and to carry out additional know-your-customer (KYC) and anti-fraud checks.
5. The fifth tier of obligations relates to very large online platforms (VLOPs) and very large online search engines (VLOSEs), which have more than 45 million average monthly active users in the EU, and adds substantial further obligations.
The designated VLOPs are Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando, while the designated VLOSEs are Bing and Google Search.
VLOPs and VLOSEs are obliged (i) to assess risks, i.e. to diligently identify, analyze and assess any systemic risks within the Union caused by the design or operation of their services and related systems, including algorithmic systems, or by the use of their services, including their content moderation systems; (ii) to mitigate those risks, i.e. to put in place reasonable, proportionate and effective mitigation measures tailored to the specific systemic risks identified, paying particular attention to the impact of those measures on fundamental rights; (iii) to manage risks in public health and public security crisis situations; (iv) to undergo, at their own expense and at least annually, independent compliance audits; (v) to make available to the public an advertising repository containing the content of the advertisement, the person on whose behalf it is presented, the person who paid for it, the period during which it was presented, etc.; (vi) to provide the Digital Services Coordinator of the country of establishment or the Commission, upon reasoned request and within a reasonable timeframe specified in the request, with access to the data necessary to monitor and assess compliance with the Regulation; (vii) to appoint a chief compliance officer, independent of senior management, who reports directly to the management body of the provider; (viii) to publish transparency reports at least once every six months; and (ix) to pay an annual supervisory fee.
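Several of these duties recur on fixed cycles, so a VLOP's compliance calendar could be sketched as simple data, as below. The task names and frequencies merely restate this article's summary; they are not official DSA terminology.

```python
# Illustrative recurring-compliance calendar for a VLOP/VLOSE, using
# the frequencies summarized above; names are not official DSA terms.
VLOP_COMPLIANCE_CYCLE = {
    "systemic_risk_assessment": "at least annually",
    "independent_audit": "at least annually, at the provider's expense",
    "transparency_report": "at least every six months",
    "supervisory_fee": "annually",
    "advertising_repository": "continuously available to the public",
}

def tasks_matching(frequency: str) -> list[str]:
    """Return the obligations whose cycle mentions the given frequency."""
    return [task for task, freq in VLOP_COMPLIANCE_CYCLE.items()
            if frequency in freq]

# Example: the duties that recur (at least) annually.
print(tasks_matching("annually"))
# ['systemic_risk_assessment', 'independent_audit', 'supervisory_fee']
```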
IV. What is illegal content?
Under the Regulation, 'illegal content' means any information which, in itself or in relation to an activity, including the sale of products or the provision of services, does not comply with Union law or the law of any Member State which complies with Union law, irrespective of the subject matter or the precise nature of that law.
What constitutes illegal content is therefore defined in other laws, either at EU or national level. If content is illegal only in a particular Member State, as a general rule, it should only be removed in the territory where it is illegal.
It should be noted that the national law regulating content as illegal must also comply with Union law.
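The territorial rule mentioned above lends itself to a small worked example. The Python sketch below, with a truncated member list and illustrative jurisdiction codes of my own choosing, shows the general principle that removal is limited to the jurisdictions where the content is actually illegal.

```python
# Minimal sketch of the territorial-scope rule: content illegal only
# under one Member State's law is, as a rule, blocked only there.
def blocked_territories(illegal_under: set[str],
                        eu_members: set[str]) -> set[str]:
    """Return the territories where access should be blocked.

    `illegal_under` holds the jurisdictions ("EU" for Union law, or
    country codes) whose law the content violates -- illustrative only.
    """
    if "EU" in illegal_under:
        return set(eu_members)         # illegal under Union law: EU-wide
    return illegal_under & eu_members  # otherwise only where illegal

EU = {"RO", "FR", "DE"}                 # truncated example set
print(blocked_territories({"RO"}, EU))  # {'RO'} -- Romania only
print(blocked_territories({"EU"}, EU))  # the whole Union
```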
V. What happens to illegal content?
If content is considered illegal under EU or national law, the intermediary service provider is obliged to take moderation measures, which means removing that content; failure to do so could entail liability for its dissemination through its services, regardless of which users posted the information.
Furthermore, when national authorities make a request to remove illegal content, intermediary service providers must act promptly and comply with the request.
VI. What is information incompatible with the general terms of use?
In addition to illegal content, the Regulation also regulates the moderation of information incompatible with the terms of use.
"General terms and conditions of use" means all the terms, whatever their name or form, governing the contractual relationship between the intermediary service provider and the recipients of the service.
Therefore, information incompatible with the general terms of use is content which, although not formally illegal, cannot be disseminated according to the rules of the intermediary service provider (such as content containing nudity).
VII. What happens to information incompatible with the general terms of use?
Information incompatible with the general terms of use is moderated in accordance with those terms of use.
According to the Regulation, intermediary service providers are in principle free to determine what content is incompatible with the general terms of use and can therefore be moderated by them.
This freedom creates the risk that they arbitrarily take decisions, thus infringing the freedom of expression of internet users.
To counter this risk, the Regulation provides for two types of instruments to protect users' interests:
First, in the general terms of use, users must be informed about the procedures, means and tools for content moderation. This information should not only be clear and unambiguous, but should also be publicly available and easily accessible.
Secondly, online intermediaries must exercise due diligence, objectivity and proportionality when moderating content, taking into account the fundamental rights and interests of the parties concerned.
VIII. What is harmful content?
A third category of content covered by the Regulation is harmful content, which the Regulation deliberately leaves undefined, as explained in the explanatory memorandum to the draft Digital Services Act.
Harmful content is content that is not formally illegal but may be considered unethical or socially undesirable for other reasons (e.g. misinformation, or content that promotes violence, racism or xenophobia).
In most cases, harmful content will also be content incompatible with the general terms of use. However, because it is the intermediary service providers who determine these terms and conditions, it is legally possible that socially harmful content may not be prohibited by the intermediary service provider under the terms and conditions.
IX. What happens to harmful content?
In principle, the Regulation does not lay down a general obligation for intermediary service providers to moderate harmful content.
By way of exception, obligations to moderate harmful content are imposed on VLOPs and VLOSEs, which must manage systemic risk through annual assessments and, where necessary, appropriate measures to mitigate that risk.
Accordingly, very large online platforms and very large online search engines must implement a risk management system that addresses the risk of dissemination of harmful content through their services, such as disinformation, election meddling or cyber-harassment.
When conducting such a risk assessment, very large online platforms should consider how their content moderation systems, recommendation systems, and ad selection and display systems affect the dissemination of illegal and harmful content.
X. Is the right to free speech respected?
The authors of the Regulation paid particular attention to the right to freedom of expression and emphasized that the removal of or blocking of access to content should be carried out with due respect for the fundamental rights of the recipients of the service, including the right to freedom of expression and information.
With regard to the right to freedom of expression, the European Court of Human Rights has ruled that this right, guaranteed by Article 10 of the European Convention on Human Rights (ECHR), is fully applicable to the internet, but that there are expressions which do not qualify for protection under Article 10 of the ECHR, such as hate speech.
According to the Committee of Ministers of the Council of Europe, "hate speech" is to be understood as covering all forms of expression which spread, incite, promote or justify racial hatred, xenophobia, anti-Semitism or other forms of hatred based on intolerance, including intolerance expressed by aggressive nationalism and ethnocentrism, discrimination and hostility against minorities, migrants and persons of immigrant origin.
Further, the European Court of Human Rights has ruled that the exercise of the right to freedom of expression by internet users must be balanced with the right to protection of reputation. Internet users should therefore take due account of the reputation of others, including their right to privacy.
In another vein, Article 10 of the ECHR is not an absolute right, and it is therefore necessary to balance the interest in sharing information against the interest in protecting the rights of copyright holders. The Court has emphasized that intellectual property is protected by Article 1 of Protocol No. 1 to the ECHR, so that what is at stake is a balancing of two competing interests, both protected by the Convention.
XI. Who enforces the DSA?
Oversight of compliance is shared between the Commission, mainly responsible for VLOPs and VLOSEs, and the Member States, responsible for the other intermediary service providers depending on where they are established.
By February 17, 2024, Member States must designate competent authorities, called Digital Services Coordinators, to oversee the compliance of services established in their jurisdiction and to participate in the EU cooperation mechanism.
According to the Draft Law on establishing measures for the application of Regulation (EU) 2022/2065, ANCOM (the Romanian National Authority for Management and Regulation in Communications) is designated as Romania's Digital Services Coordinator.
XII. What are the sanctions?
As of February 17, 2024, the Commission, within its area of competence, may (i) impose fines of up to 6% of annual worldwide turnover where an intermediary service provider breaches DSA obligations, fails to comply with interim measures adopted by the Commission or breaches commitments, and/or (ii) impose periodic penalty payments of up to 5% of average daily worldwide turnover for each day of delay in complying with remedies, interim measures or commitments.
As a last resort, where the infringement persists and causes serious harm to users and involves offenses endangering the life or safety of persons, the Commission may require the temporary suspension of service.
Within the scope of the Member States, sanctions will be determined by national law. According to the Draft Law on establishing measures for the application of Regulation (EU) 2022/2065, contraventions of the DSA are punishable by a fine ranging from RON 5,000 up to a maximum of 6% of the annual worldwide turnover recorded in the previous financial year by the intermediary service provider, or by a fine ranging from RON 5,000 up to a maximum of 1% of the annual revenue or annual worldwide turnover recorded in the previous financial year by the intermediary service provider or the person concerned, as the case may be.
ANCOM may also impose administrative fines up to a maximum daily amount of 5% of the worldwide average daily turnover or worldwide average daily revenue of the intermediary service provider concerned in the previous financial year, calculated from the date specified in the sanctioning decision.
ANCOM's sanctioning decisions may be appealed to the Bucharest Court of Appeal within 30 days of notification.
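To see what these ceilings mean in practice, the short Python sketch below computes the maximum Commission fine and daily periodic penalty for a provider with an assumed, purely hypothetical worldwide turnover; the actual amounts are set case by case by the Commission or ANCOM.

```python
# Worked example of the fine ceilings described above, with an assumed
# worldwide annual turnover; all figures are hypothetical.
annual_turnover_eur = 2_000_000_000                # assumed turnover
avg_daily_turnover_eur = annual_turnover_eur / 365

max_fine = 0.06 * annual_turnover_eur              # up to 6% for breaches
max_daily_penalty = 0.05 * avg_daily_turnover_eur  # up to 5% per day late

print(f"maximum fine:          EUR {max_fine:,.0f}")           # 120,000,000
print(f"maximum daily penalty: EUR {max_daily_penalty:,.0f}")  # ~273,973
```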
XIII. Conclusions
The Digital Services Act brings clear and extensive EU-wide rules for the various digital services, establishing an increasing scale of obligations according to the type and size of intermediary service providers.
The DSA covers different categories of digital services, from simple transmissions and storage to online platforms and search engines, with rules tailored according to the responsibility and scale of impact of each type of service.
Intermediary service providers are responsible for moderating illegal content, harmful content and information incompatible with the general terms of use, where appropriate. They must take effective and proportionate measures to ensure a safe and ethical digital environment.
The act also aims to protect users' fundamental rights, including the right of free expression. Content moderation must be balanced, respecting users' rights and avoiding arbitrary interference.
Enforcement of the DSA is shared between the Commission and the Member States, with sanctions scaled to the gravity of infringements, including substantial fines and the possibility of temporary suspension of services in serious cases.
Overall, the Digital Services Act represents a significant effort to create a safer and more ethical digital environment, promoting online responsibility and protecting users' fundamental rights.
An article signed by Marius CHELARU, Managing Associate (mchelaru@stoica-asociatii.ro), STOICA & Asociații