
Digital Services Act: The Fondation Descartes' Position

Léa Giffard & Nikita Guedj
05/05/2021

Position of the Fondation Descartes on the Digital Services Act (DSA)

The Fondation Descartes welcomes the drafting of the DSA, which updates the e-Commerce Directive (2000). However, it calls for the reconsideration of certain aspects that appear to endanger freedom of expression on the Internet.

Online platforms, as private actors, are playing an increasingly pivotal role in the moderation of online content. They now regulate public expression on their pages according to their own guidelines through the use of both artificial intelligence and human review.
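As a purely illustrative aside, the combination of artificial intelligence and human review described above can be pictured as a two-stage pipeline: an automated classifier handles clear-cut cases and routes uncertain ones to human moderators. Every name and threshold below is hypothetical; no platform's actual system is described.

```python
# Minimal sketch of a hybrid moderation pipeline, assuming hypothetical
# thresholds; the keyword heuristic stands in for a real ML classifier.
from dataclasses import dataclass, field

@dataclass
class ModerationQueue:
    pending_human_review: list[str] = field(default_factory=list)

def violation_score(text: str) -> float:
    """Placeholder estimating the probability that text violates guidelines."""
    flagged_terms = {"example_banned_term"}  # stand-in vocabulary
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, 0.9 * hits)

def moderate(text: str, queue: ModerationQueue) -> str:
    score = violation_score(text)
    if score >= 0.9:    # high confidence: act automatically
        return "removed"
    if score >= 0.5:    # uncertain: defer to a human moderator
        queue.pending_human_review.append(text)
        return "pending_review"
    return "kept"
```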

Given the quasi-oligopolistic position of online platforms within the market of online public expression, they have de facto become regulators of the democratic debate. However, it is not desirable for private actors to fulfil this role by adhering to regulatory criteria that they themselves determine. In effect, this opens the door to arbitrary regulation and to the risk of over- or under-moderation.

Yet, as the volume of content to be moderated is substantial, only the platforms themselves possess the technical capacity to carry out this moderation. As it stands, it would be unrealistic to rely solely on the judicial system: the French justice system, for instance, is not equipped to handle the additional complaints related to abusive uses of freedom of expression that this would entail.

We believe that it is the responsibility of public authorities to determine the objectives and framework for the moderation of online public expression by online platforms. The latter would therefore be required to comply with these objectives and framework, under penalty of effective sanctions.

Contrary to what is proposed in the DSA, the Fondation Descartes considers that it is important to make national law the framework for moderating public expression on social networks. In France, for example, the law of 1881, which both guarantees and sets clear limits on freedom of expression, as well as the pertinent articles of criminal law (on terrorism, etc.), should constitute the framework for the moderation of social networks and the Internet according to the territorial application of French law. Platforms should be required to moderate content according to this legal framework and its jurisprudence.

In their moderation, platforms should in no case be less restrictive than the national law that determines the limits of freedom of expression. Nor should they be more restrictive by prohibiting expression that the law permits. The only additional limits they may impose are those that regulate the form of expression without disproportionately infringing on citizens’ freedom of expression – for example, requiring users to be civil in their exchanges, or prohibiting violent images or messages that are not necessary for information, debate, or the expression of an idea.
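The position just stated amounts to a two-bound decision rule: national law is both the floor and the ceiling of moderation, with form-only rules as the sole permitted addition. The sketch below is a hypothetical restatement of that logic; both predicates are placeholders for assessments that in reality require legal and editorial judgment.

```python
# Hypothetical restatement of the two-bound moderation rule argued for above.
# The two predicates are placeholders, standing in for human legal judgment.

def illegal_under_national_law(content: str) -> bool:
    """Stand-in for a legal assessment (e.g. under the French law of 1881)."""
    return False  # placeholder

def breaks_form_only_rules(content: str) -> bool:
    """Stand-in for form-only platform rules (civility, gratuitous violence)."""
    return False  # placeholder

def removal_permitted(content: str) -> bool:
    # Floor: content that is illegal under national law must be removed.
    if illegal_under_national_law(content):
        return True
    # Ceiling: legal content may be restricted only on grounds of form,
    # never because of the idea it expresses.
    return breaks_form_only_rules(content)
```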

The Recommendations of the Fondation Descartes

Amendments to the Recitals

1. On the objective of fighting hate speech

The Fondation Descartes considers that the architecture of online platforms, shaped by their business models, cannot be guided solely by commercial interests, but must also incorporate imperatives related to the public debate.

Online platforms must therefore do what is necessary to take these imperatives into consideration.

In this sense, the Fondation Descartes proposes that online platforms be given the objective of eliminating hate speech that constitutes an abuse of freedom of expression sanctioned as such by national law, and of preventing its virality.

Furthermore, the Fondation Descartes calls for hate speech to be defined in accordance with the national law of each Member State.

Amendment to Recitals 3 and 12:

(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. In order to behave responsibly, online platforms must do what is necessary to incorporate imperatives related to the public debate. Platforms must seek to enact measures that support this aim, as identified in the risk mitigation reports.

(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech as defined by the national law of each Member State or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.

2. On interoperability

The Fondation Descartes calls for the development of interoperability between platforms and for measures making it effective: users must be able to transition more easily from one platform to another.

The Fondation Descartes believes that “fostering interoperability” is not a precise enough term: what is needed is to encourage the development of technical measures that ensure interoperability is effective, in line with the “Good Samaritan” principle.

Amendment to Recital 4:

(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and rendering interoperability between platforms effective, such that users can transition more easily from one platform to another. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.

3. On the clarification of intermediary services

The Fondation Descartes proposes the inclusion of search engines among intermediary services, given the systemic risks (as defined in Article 26 of the DSA) that they entail.

Amendment to Recital 14:

(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,[1] such as emails or private messaging services, fall outside the scope of this Regulation. Search engines fall within the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.

I. Amendments to Chapter II – Liability of providers of intermediary services

4. On the voluntary own-initiative investigations

The Fondation Descartes calls for clarification of Article 6: will it have the indirect consequence of rendering platforms less accountable? The Fondation Descartes is concerned that this article would replace any obligation of result with a mere obligation of means.

II. Amendments to Chapter III – Due diligence obligations for a transparent and safe online environment

5. On legal representatives

The Fondation Descartes considers it important for very large online platforms to be made legally accountable. It recommends that each platform have a legal representative in every Member State of the European Union, and not in only one Member State, as the text currently provides.

Thus, very large platforms should be obliged to have a legal representative in each of the 27 countries of the European Union, whom the State, the courts or individuals could challenge if they believe that platforms are not fulfilling their moderating function in accordance with the objectives and framework imposed by the public authorities of each State.

The Fondation Descartes therefore calls for the addition of an article (Article 27a) to complement Article 11 which applies to very large platforms, as drafted below.

In addition, the Fondation Descartes wonders whether the dual responsibility described in paragraph 3 of Article 11 implies a reduction of platforms’ accountability. The Fondation Descartes calls for clarification of this article.

Moreover, the Fondation Descartes calls for clarification of the missions of the point of contact and of the legal representative, whose roles overlap (see Recitals 36 and 37).

New article (27a)

Article 27a
Legal representatives

In complement to Article 11, platforms that do not have an establishment within the Union, but that offer services within the Union, shall designate, in writing, a legal or natural person as their legal representative in each Member State in which the platform offers its services.

6. On terms and conditions

The Fondation Descartes considers that the terms and conditions of very large platforms cannot be significantly more restrictive of freedom of expression than national law.

The Fondation Descartes suggests establishing a common set of norms for all platforms, which should reflect national law according to its territorial application and the principles of the Charter of Fundamental Rights of the European Union.

This common base would take the form of a common set of terms and conditions of use established by the European Board for Digital Services.

Amendment to Article 12:

Article 12
 Terms and conditions

1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2. Providers of intermediary services shall act in a diligent, objective, and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, to the common set of terms and conditions established by the European Board for Digital Services, which includes the applicable fundamental rights of the recipients of the service as enshrined in the Charter.

7. On transparency reporting

The Fondation Descartes considers that the implementation of transparency obligations regarding recommendation and moderation algorithms remains vague. It calls for platforms to specify the inner mechanisms of their algorithms.

Amendment to Article 23:

Article 23
Transparency reporting obligations for providers of online platforms

1. In addition to the information referred to in Article 13, online platforms shall include in the reports referred to in that Article information on the following:

(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures;

(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;

(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, methods used (including up-to-date descriptions, in plain language understandable to all, of the mechanisms of the algorithms in use), indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied.

2. […]

8. On appeals to national regulators or an independent body

The Fondation Descartes considers that the regulation of online content should be handled at the national level.

As such, the Fondation Descartes proposes the establishment of moderation centres at the national level, in which platform moderators would be trained in the national law of the Member State as well as its jurisprudence (as it stands, the training of moderators of large online platforms is based on the community standards of each platform).

The role of these moderators would be to detect and remove content that is manifestly illegal according to national law and that violates the platform’s terms and conditions and to refer the most serious cases to a judge or to a regulatory authority (national regulators or an independent body associated with national regulators), depending on the nature of the alleged infraction. 

Consideration could also be given to the establishment of an independent, national mediation centre for moderation within each country of the European Union (the “independent body associated with national regulators”) which would bring together professional jurists, representatives of large online platforms and State representatives. Platform moderators could refer to this mediation centre, whose decisions would be prescriptive, in cases of doubt regarding the moderation of certain content.

Users of online platforms could also appeal to this mediation centre if they consider, after having exhausted the platform’s internal complaint-handling system, that a piece of their content has been improperly moderated by an online platform or, on the contrary, if they believe that some unmoderated content ought to be moderated. For users, the possibility of resorting to this mediation centre would be an optional alternative that would in no way impede their right to appeal directly to the courts.

Lastly, the decisions of this mediation centre for moderation could be appealed, as judges remain the ultimate protectors and arbiters of freedom of expression. The funding of this mediation centre for moderation would be fully provided by online platforms.
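Taken together, the preceding paragraphs describe a routing procedure for flagged content. The following sketch restates that workflow in schematic form; the route names and boolean flags are hypothetical labels for the cases the text distinguishes, not terms from the DSA.

```python
# Hypothetical routing of a flagged item per the workflow described above:
# doubtful cases go to the national mediation centre, manifestly illegal
# content is removed, and the most serious cases are referred to a judge
# or regulatory authority.
from enum import Enum

class Route(Enum):
    REMOVE = "remove"
    REFER_TO_JUDGE_OR_REGULATOR = "refer"
    REFER_TO_MEDIATION_CENTRE = "mediate"
    KEEP = "keep"

def route_flagged_content(manifestly_illegal: bool,
                          serious_case: bool,
                          in_doubt: bool) -> Route:
    if in_doubt:
        # The mediation centre issues prescriptive decisions in cases of doubt.
        return Route.REFER_TO_MEDIATION_CENTRE
    if manifestly_illegal:
        return (Route.REFER_TO_JUDGE_OR_REGULATOR if serious_case
                else Route.REMOVE)
    return Route.KEEP
```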

The notice and action mechanisms of Article 14 detrimentally confer on digital platforms the role of judge. The Fondation Descartes calls for this role to be shared among the justice system, the platforms and the regulatory authority.

The Fondation Descartes suggests the creation of a new article (Article 17a, for instance) imposing certain new obligations on online platforms, as drafted below:

New article (Article 17a):

Article 17a
Appeals to national regulators or an independent body

1. In conjunction with Article 5, platforms shall have access to national moderators trained in the national law of the Member State, who may seek advice from national regulators or from an independent body associated with national regulators, whose decisions would be prescriptive.

2. Users shall also be able to appeal to national regulatory authorities or to the independent body associated with national regulators, after having exhausted the platform’s internal complaint-handling system, if they wish to challenge a decision taken by the platform.

3. The independent body associated with national regulators would take the form of an independent, national mediation centre for moderation in each country of the European Union. Its organization will be defined by national implementation decrees.

9. On notice and action mechanisms

In the interest of guaranteeing freedom of expression, the Fondation Descartes is concerned that holding online platforms liable as soon as they are notified of illegal content could lead to the over-moderation of reported content.

It cannot be assumed that a notice of illegal content is justified and that the reported content is actually illegal. The platform must therefore be given the time necessary to assess the manifestly illegal character of the content prior to its removal. In this sense, the excessive or hasty removal of legal content is as harmful as the failure to remove illegal content within a reasonable time.

Indeed, the Fondation Descartes considers that the risk of over-moderation must be taken into account in the same way as the risk of under-moderation.

The Fondation Descartes calls for the clarification of the notion of having “knowledge of illegal activity or illegal content” (Article 5). It considers that a notice cannot constitute actual “knowledge”.

As a result, the Fondation Descartes believes that platforms’ liability for notified content should be incurred only once the notified content has been reviewed by the platform (and not as soon as platforms receive a notice of illegal content), in order to prevent platforms from unduly removing notified content for fear of being held legally accountable for it should it indeed prove illegal.

So as not to slow down the moderation process, the notice should be processed as quickly as possible according to the severity of the content and the reach of the information in question. Platforms will be held responsible if the notice is not processed within a reasonable time.

Amendment to Article 14:

Article 14
Notice and action mechanisms

1.  Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly and allow for the submission of notices exclusively by electronic means.

2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. Moderators present in each country assist in the moderation process. In cases of serious doubt, providers may call upon the national regulatory authority or the independent body associated with the national regulator of the country of destination (cf. Article 17a). To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:

(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content;

(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs and, where necessary, additional information enabling the identification of the illegal content;

(c) the name and electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; 

(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.

3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Articles 5 and 17a in respect of the specific item of information concerned. The responsibility of platforms with regard to notified content would be incurred only once the content in question has been reviewed by the platform within a reasonable time. The notice shall be processed as quickly as possible according to the severity of the content and the reach of the information.
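To make the four required elements concrete, here is a minimal data model that a platform might use to represent a notice under the amended Article 14(2); the class and field names are illustrative, not prescribed by the text.

```python
# Illustrative data model for a notice under the amended Article 14(2).
# Field names are hypothetical; the fields mirror points (a)-(d) above.
from dataclasses import dataclass

@dataclass
class Notice:
    reasons: str                  # (a) why the content is considered illegal
    urls: list[str]               # (b) exact URL(s) locating the content
    notifier_name: str | None     # (c) omitted for Directive 2011/93/EU offences
    notifier_email: str | None    # (c) idem
    good_faith_statement: bool    # (d) accuracy and completeness confirmed

    def is_substantiated(self) -> bool:
        """Precise enough for a diligent operator to assess the alleged illegality."""
        return bool(self.reasons) and bool(self.urls) and self.good_faith_statement
```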

10. On out-of-court dispute settlements

The Fondation Descartes calls for a clarification of the jurisdiction of the out-of-court dispute settlement body. Since disputes must be judged in accordance with national law and its mechanisms, national jurisdiction is required (even if the dispute settlement is conducted remotely).

Amendment to Article 18:

Article 18
Out-of-court dispute settlement

1. Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.

The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law.

2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body which is competent in the national territory of the Member State where the body has demonstrated that it meets all of the following conditions: […]

11. On trusted flaggers

The Fondation Descartes calls for the implementation of strict criteria for the accreditation of trusted flaggers, such as expertise and adherence to ethical rules.

Amendment to Article 19:

Article 19
Trusted flaggers

1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are processed and decided upon with priority and without delay.

2.  The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:

(a) it has particular multidisciplinary expertise and competence for the purposes of detecting, identifying and notifying illegal content;

(b) it represents non-partisan collective interests and is independent from any online platform;

(c) it adheres to ethical rules;

(d)  it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.

3.  […]

12. On decisions of account suspension

The Fondation Descartes considers that the decision to suspend an account that has frequently published illegal content can be entrusted to online platforms only if an appeal mechanism to the national regulatory authority or the independent body associated with national regulators is concurrently established.

Platforms will be held accountable for any abusive suspensions.

Amendment to Article 20:

Article 20
Measures and protection against misuse

1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.

2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:

(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;

(b) the relative proportion thereof in relation to the total number of items of information provided or notices submitted in the past year;

(c) the gravity of the misuse and its consequences;

(d) the intention of the recipient, individual, entity or complainant.

4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.

5. All decisions of platforms relating to the present Article shall be subject to an appeal mechanism before the national regulatory authority or the independent body associated with the national regulator. Platforms shall be held accountable for any abusive suspension.
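As a minimal illustration, the four circumstances that Article 20(3) requires platforms to weigh can be collected into a record such as the one below; the Regulation deliberately leaves their weighting to case-by-case judgment, so no scoring formula is implied, and all names are ours.

```python
# Illustrative record of the circumstances listed in Article 20(3)(a)-(d);
# the overall assessment remains case-by-case, not a computed score.
from dataclasses import dataclass

@dataclass
class MisuseCircumstances:
    absolute_count: int         # (a) illegal items or unfounded notices, past year
    relative_proportion: float  # (b) share of the user's total items or notices
    gravity: str                # (c) gravity of the misuse and its consequences
    apparent_intent: str        # (d) intention of the recipient or complainant
```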

13. On the vetting of researchers

The Fondation Descartes welcomes the opening of data access to researchers and stresses the need for the vetting procedure to be simple and fast.

Amendment to Article 31:

Article 31
 Data access and scrutiny

1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data and to recommendation algorithms that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.

2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). The procedure, if the requirements in paragraph 4 are met, shall be simple and fast.

3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate.

4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. Platforms shall have no oversight over the publications resulting from the data.

[…]
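The four vetting conditions of the amended Article 31(4) can be read as a simple checklist; the sketch below assumes hypothetical field names, and each boolean stands in for documentary evidence that a Digital Services Coordinator would assess.

```python
# Sketch of the vetting conditions in the amended Article 31(4); each flag
# stands in for evidence assessed by the Digital Services Coordinator.
from dataclasses import dataclass

@dataclass
class ResearcherApplication:
    academic_affiliation: bool      # affiliated with an academic institution
    commercially_independent: bool  # independent from commercial interests
    proven_expertise: bool          # track record in the risks or methods studied
    meets_data_security: bool       # can preserve security and confidentiality

    def is_vetted(self) -> bool:
        return all((self.academic_affiliation,
                    self.commercially_independent,
                    self.proven_expertise,
                    self.meets_data_security))
```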

14. On Digital Services Coordinators

The Fondation Descartes calls for the clarification of the status of Digital Services Coordinators.

This status must be independent, similar to that of independent administrative authorities under French law: administrative bodies that act on behalf of the State, possess regulatory and sanctioning powers, and operate independently of any commercial or political interest.

Amendment to Article 39:

Article 39
Requirements for Digital Services Coordinators

1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Digital Services Coordinators shall be statutorily independent from public authorities and the private sector. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks.

2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party.

3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law.

[1] Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.