Preserving the limited exemption from secondary liability
The E-Commerce Directive establishes limited exemptions from secondary liability for information society service providers (Internet intermediaries). This approach should be preserved in any revision of the current framework. Where policy requires Internet intermediaries to intervene to suppress content, this should be implemented through complementary statutory obligations (potentially carrying sanctions for non-compliance), and not by creating derogations from the liability protection in the E-Commerce Directive.
Freedom to provide lawful services
The freedom to provide lawful services, which is guaranteed by the Charter of Fundamental Rights and the Telecoms Framework Directive, should not be infringed. Therefore, Internet intermediaries should not be prohibited from offering certain types of otherwise legitimate services on the grounds that it is not technically possible or commercially feasible to apply content regulation obligations to those services. Rather, such obligations should only apply to the extent that they are feasible for the service in question. Furthermore, Internet intermediaries should not be subject to a priori licensing or approval.
Preserving anonymity on the Internet
There should be no general legal prohibition on the provision of services to anonymous users. The right of users to state a fact or an opinion anonymously should be protected, and Internet intermediaries should not be required to close their services to unidentified users. At the same time, Internet intermediaries should still be allowed to verify the identity of their users, depending on their business model and terms of service.
If the law introduces a requirement to verify some attribute of a user to prove entitlement to access a service (e.g. age verification, European or national identity verification), it must be possible to satisfy this requirement without identifying the user (e.g. via proof of age tokens, instead of copies of ID cards or passports).
Where necessary, any instrument introducing a requirement for such verification should also include a requirement for Member States to ensure that a mechanism for anonymous verification exists.
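By way of illustration only, the sketch below shows how an attribute attestation (here, age) could be issued and verified without identifying the user. The issuer, key handling, and token format are entirely hypothetical assumptions for this example; a real scheme would rely on asymmetric signatures from an accredited issuer rather than the shared HMAC secret used here for brevity.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical issuer secret; a real deployment would use an accredited
# issuer with asymmetric signatures (e.g. Ed25519), not a shared key.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_token(over_18: bool) -> str:
    """Issuer attests to an attribute (age) without embedding any identity."""
    claims = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(ISSUER_KEY, claims, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(claims + b"." + sig).decode()

def verify_age_token(token: str) -> bool:
    """The service checks the attestation; it learns only the attribute,
    never who the user is."""
    raw = base64.urlsafe_b64decode(token.encode())
    claims, _, sig = raw.rpartition(b".")
    expected = hmac.new(ISSUER_KEY, claims, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    return json.loads(claims).get("over_18", False)

token = issue_age_token(True)
assert verify_age_token(token)
```

The point of the design is that the verifying service receives only a signed yes/no claim, satisfying the verification requirement without collecting copies of ID cards or passports.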
Horizontal scope of the instrument
New statutory obligations to remove illegal content online should apply horizontally to any type of illegal content. This will better enable the creation of a balanced, consistent, sustainable and stable framework for the role of Internet intermediaries that takes into account the legitimate interests of all stakeholders. Differing procedures for different types of content should be justified by objective distinctions (for example, whether the nature and legal status of the content is objectively classifiable, or whether the alleged infringement concerns criminal law or private rights). Policy-makers should ensure legal coherence between any new vertical legislation and the existing vertical and horizontal laws. In order to avoid regulatory fragmentation, a horizontal approach should always be preferred to the extent possible.
Protecting SMEs and startups
Obligations on providers should not be such that market entry and market viability are only available to extremely highly capitalised entities. A vibrant Digital Single Market, open to SMEs and startups, should be preserved, as vigorous competition is a driver of innovation.
Maintain the existing categories of the E-Commerce Directive
Given the evolution of the Internet ecosystem, additional statutory obligations intended to tackle illegal content online more effectively should not apply to services covered by the existing definitions of the E-Commerce Directive. Neutral services (mere conduits, caching services, electronic communication services and pure hosting service providers), services provided in the layers of the Internet infrastructure (such as registries, registrars and the domain name system (DNS)), and adjacent services (such as payment services or distributed denial of service (DDoS) protection services) should remain protected by the limited exemption from secondary liability laid down in the E-Commerce Directive.
A new category for “online platforms”
A new category of “online platforms” should be added to the existing ones, in order to distinguish precisely between pure hosting service providers, who do not have control over the content that they host, and other algorithm-driven, consumer-facing service providers that have more control over the content they host and focus on user-generated content (such as social media platforms).
Such online platforms should be subject to additional statutory obligations for tackling illegal content, as compared to other ISPs. The precise definition of the category will determine what obligations can be imposed, in alignment with the capabilities of the services falling within it. Accordingly, a more precise definition would allow a broader range of obligations, while a less precise definition would allow fewer, as some obligations might not be applicable to every type of operator falling within a broader definition.
A balanced Good Samaritan clause
Internet intermediaries should benefit from a “Good Samaritan clause”, extending protection from liability to cases where they have actual knowledge of allegedly illicit content, provided they apply in good faith procedures designed to tackle such content and those procedures contain measures to preserve fundamental rights. In practice, this would extend liability protection both (1) when proactively searching for illicit content and (2) after any good faith decision that potentially illicit content does not qualify for removal. The introduction of such a clause would both enhance the suppression of genuinely illicit content (by removing the disincentive to search actively) and enhance the protection of fundamental rights (by removing the legal impediment to offering an appeals process).
Subsidiarity: importance of removal at source and limitations of access blocking
In the first instance, competent authorities should always take action against the provider of the illegal content itself (the user) or the online platform. Where possible, removal at source should always be preferred and prioritised. Only where the content provider or platform fails to act should the competent authority, as ultima ratio, request the access provider to intervene. As blocking at the level of the access provider is in principle neither effective nor proportionate, national legislation mandating it must satisfy certain criteria. Blocking should only be ordered by a court or a public authority, in full respect of fundamental rights safeguards, and accompanied by cost reimbursement for the affected Internet intermediaries.
Limits of the obligations
It is impossible for online platforms to tackle all illegal content shared by their users. No Internet intermediary should be subject to a duty that is impossible or commercially unreasonable for it to undertake. The manner in which Internet intermediaries fulfil their statutory obligations should not be prescribed in excessive detail, especially as to technical means, in order to preserve the framework's flexible, technology-neutral and future-proof nature. At the same time, statutory obligations must not require an absolute outcome or impose strict liability (e.g. ensuring the non-availability of illegal content), as such a duty is likely to be impossible to achieve and gives no adequate guidance on what measures the Internet intermediary could take to satisfy it.
Moreover, Internet intermediaries should not be subject to general monitoring obligations. Such an obligation would not respect important underlying fundamental rights of users, and would be neither proportionate nor respectful of the industry's freedom to conduct a business.
Due process must be respected
Judicial oversight should always be ensured in the context of how Internet intermediaries deal with illegal content, and Governments should maintain their role in upholding the Rule of Law. Competent authorities should only be permitted to send orders or requests for the purpose of suppressing content or restricting a user's access to the service on the basis of an identified infringement of law (civil or criminal); competent authorities should not be permitted to use the providers' Terms of Service as an alternative basis, without identifying an infringement of law. Given the importance of legal certainty, legislation should clearly describe the obligations according to the respective roles of each Internet intermediary and the instances in which liability is placed on each of them.
Users should always have the right to challenge decisions to suppress content or restrict their access to a service before their national Courts. Competent authorities should be accountable for their interventions, and so their identity should be disclosed to the user when they ask for the suppression of user content or service facilities (NB: this is distinct from covert investigations, e.g. when the authority seeks user account information).
Safeguards and fundamental rights
Any new obligations to act against content should be balanced with new obligations to treat the content provider (the end user) fairly. Fair treatment might be ensured procedurally, for instance through clarity as to the rules and restrictions, transparency as to which rule is said to have been broken, independence of decision-making, a right to be notified, a right of appeal, etc. If the decision on content takedown has been made by a court, the above safeguards could be dispensed with, as the court provides these protections itself.
In order to make fair treatment possible, Internet intermediaries should be granted a new protection from liability during the period in which they are taking reasonable steps to determine whether action against allegedly infringing content is justified. Where a fair treatment process has reasonably determined that there is no infringement (i.e. the allegation was unjustified), the ISP's original limited exemption from secondary liability should be restored, so that they are not exposed to liability should a further process overturn the original determination.
Regulation of complaint making
Provisions regulating the practice of making complaints to intermediaries should be introduced: those issuing notices should be placed under an obligation to flag content accurately and in good faith. Those who file fraudulent notices, in order to induce the intermediary to interfere with content published by a third party, or to induce the intermediary to restrict the third-party publisher (e.g. account cancellation), should be held accountable and liable for economic loss and other harm to both the end user and the Internet intermediary. Intermediaries should be permitted to ignore notices from individuals and organisations that persistently abuse “notice and takedown” procedures, and such notices should not count as conveying “actual knowledge”. Furthermore, provisions should describe the minimum quantity and quality of information that an Internet intermediary would need in order to respond to a complaint.
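The minimum-information standard described above could, purely for illustration, be checked mechanically before a notice is treated as conveying actual knowledge. The sketch below uses entirely hypothetical field names (they are not drawn from any existing instrument) to show how an intermediary might filter out notices lacking the required elements.

```python
# Hypothetical minimum elements a notice might be required to contain
# before it can convey "actual knowledge" to an intermediary; the field
# names are illustrative, not taken from any existing law.
REQUIRED_FIELDS = (
    "complainant_identity",   # who is complaining, and in what capacity
    "content_location",       # precise URL of the contested content
    "legal_basis",            # which law the content allegedly infringes
    "good_faith_statement",   # declaration of accuracy and good faith
)

def is_actionable_notice(notice: dict) -> bool:
    """A notice missing any required element need not be acted upon."""
    return all(notice.get(field) for field in REQUIRED_FIELDS)

complaint = {
    "complainant_identity": "Jane Doe, acting for ExampleRights Ltd",
    "content_location": "https://example.com/post/123",
    "legal_basis": "Alleged defamation under national civil law",
    "good_faith_statement": "I declare in good faith that this notice is accurate.",
}
assert is_actionable_notice(complaint)
assert not is_actionable_notice({"content_location": "https://example.com/post/123"})
```

Such a check gives the intermediary an objective, auditable basis for rejecting incomplete notices, which is what the legal-certainty argument above calls for.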
The role of trusted flaggers
"Trusted flaggers" must be private organisations, rather than public authorities. They can be useful in some circumstances, but it must be taken into account that some entities might have vested interests, with the risk of co-opting critical platforms for political, cultural or private economic ends. Accordingly, trusted flaggers should not be excluded from the regulation of complaint-making described above (and might indeed be subjected to more stringent requirements). Internet intermediaries should be free to choose their own trusted flaggers. They should never be forced to work with a particular flagger, nor should they be required to act automatically upon a trusted flagger's judgment instead of making their own assessment.
Proportionate sanctions
Sanctions against operators for non-compliance with any new obligations should be proportionate to the offence and the level of culpability. When determining the sanction, aggravating and mitigating factors, such as the size and capabilities of the intermediary, should be taken into account. Individual instances of non-compliance with a statutory duty should only give rise to a maximum penalty proportionate to that instance of non-compliance. If the operator systematically refuses to comply, a greater sanction that is sufficiently dissuasive may be justified, but this further aggravation should be proved, not assumed. Sanctions should only be imposed after verifying that the online platform has made its best efforts to comply with the obligations, rather than for failure to achieve an assumed result.
Compensation for public functions
When Internet intermediaries are required by policy-makers to carry out obligations equivalent to a public function, they should receive full compensation at market value out of ordinary taxation, as imposing such a public function constitutes a meaningful appropriation of private property. Such a situation should only be justifiable in exceptional cases: public functions should normally be carried out by public authorities held accountable through democratic supervision and through constraints aimed at the State, such as fundamental rights.
Sector specific taxes, levies and “regulatory fees”
Sector specific taxes, levies and "regulatory fees" should not be imposed. There should be no levies imposed on the Internet industry for the benefit of other industry sectors, whether justified by claims that the Internet is harming that other sector, or for any other reason. If new instruments create new duties for public authorities in relation to Internet content (or indeed create new authorities), these should be funded out of general taxation.
For more information or questions on EuroISPA's guiding principles for the Future of the EU Intermediary Liability Framework, please contact EuroISPA's Secretariat.