EU Commission issues guidance to online platforms for tackling illegal content online
On 28 September 2017 the European Commission published its Communication on ‘Tackling Illegal Content Online’ (“the Guidance”). The Guidance provides a set of guidelines and principles directed towards online platforms (“OPs”) and their role and responsibilities in dealing with illegal content.
The overarching aim of the Guidance, which is not legally binding, is to encourage good practices for preventing, detecting, removing and disabling access to illegal content, to increase transparency, and to ‘clarify’ OPs’ liability under the safe harbour of Article 14 of the E-commerce Directive when they take the proactive steps set out in the Guidance. The Guidance follows the Commission’s stated intention to encourage EU-wide ‘self-regulatory’ efforts by OPs in the context of its Digital Single Market strategy, and against a political backdrop in which OPs are coming under pressure to step up their efforts in this area.
The main elements of the Guidance are as follows:
OPs are encouraged to have the resources necessary to understand the legal frameworks in which they operate, so that they can make swift decisions about illegal content without needing a court order requiring its removal or blocking. To facilitate this, OPs should also cooperate closely with law enforcement and other relevant authorities, establishing points of contact and developing digital interfaces to ensure that they can be contacted quickly and can deal with removal requests expeditiously.
OPs are also encouraged to cooperate closely with ‘trusted flaggers’ (i.e., specialised entities with expertise in identifying illegal content such as Europol’s Internet Referral Unit) and create ‘fast-track’ channels through which trusted flaggers can provide notices. The Commission also proposes to explore EU-wide criteria for classification of a ‘trusted flagger’.
OPs should deploy easily accessible and user-friendly reporting mechanisms which enable the electronic submission of high quality notices (i.e., sufficiently precise and adequately substantiated) of illegal content they might be hosting. Users should not be compelled to identify themselves in these notices, although they should be provided with an opportunity to do so, which in some instances may be necessary (e.g., to determine the illegality of the content by asserting ownership of IPR).
OPs are strongly encouraged to take a proactive approach and “do their utmost” to detect and remove illegal content online, rather than simply reacting to notices they receive, and to use and develop automatic detection and filtering technologies. The Commission stresses that OPs should remove illegal content as fast as possible, especially where serious harm is at stake (e.g., where content incites the commission of terrorist acts). The Commission also proposes to analyse further the possibility of setting fixed timeframes for removal. Additionally, where an OP finds evidence of criminal activity in the course of removing illegal content, it should report that evidence to law enforcement authorities.
OPs should clearly explain their content removal policies in their terms of service, including information about what content is not permitted, the procedures for removing content and the procedures for objecting to removal decisions (including those triggered by trusted flaggers). The Commission also encourages OPs to publish, at least once a year, transparency reports providing detailed information about the number and types of notices received, the action taken, processing times, the source of each notification and any counter-notices, and it intends to explore the possibility of a standard form of reporting for this purpose.
OPs should give content providers the opportunity to contest any removal via a counter-notice. Where a counter-notice provides reasonable grounds to consider that the notified information or activity is not illegal, the OP should restore it without undue delay or allow the user to re-upload it (without prejudice to the OP’s terms of service).
OPs are encouraged to put in place measures to dissuade users from repeatedly uploading illegal content (e.g., suspending or terminating their accounts), as well as to use and develop automated technologies to prevent the re-appearance of illegal content.
While the safe harbour regime under the E-commerce Directive remains unchanged, the Guidance will give OPs considerable pause for thought.
Under Article 14 of the E-commerce Directive OPs are typically exempted from liability for hosting illegal content where they do not play an active role of such a kind as to give them knowledge or control over that content (see Google France C-236/08 to C-238/08). As the Guidance anticipates that OPs will now take a much more active role, they are more likely to obtain the knowledge/control that would result in them losing the exemption from liability in Article 14. To maintain the benefit of the hosting exemption, OPs will, upon obtaining actual knowledge or awareness of illegal content, need to act expeditiously to remove or disable access to it.
The Guidance, if followed to the letter, looks set to significantly increase the administrative burden on OPs and to place them in the challenging position of acting as arbiters over all manner of illegal activity, from statements which are defamatory, incite racial or religious hatred or promote terrorism, to infringements of IPR. Content promoting terrorism is plainly easier to identify than content which may be defamatory or may infringe IPR, yet the Commission appears to class all such acts as illegal activity which OPs should proactively filter, detect and remove. The Guidance also sits uncomfortably with Article 15 of the E-commerce Directive, which provides that EU Member States must not impose on OPs any ‘general obligation’ to monitor the content which users post on their services; yet that seems to be precisely what the Guidance is aimed at.
Some OPs might view the Guidance as the lesser of two evils. There has been much rhetoric from individual Member States, particularly in the wake of localised terrorist attacks, that firm legal regulation should be introduced at the national - rather than EU - level. The Commission will be mindful of any such developments and will no doubt be concerned to avoid a proliferation of potentially inconsistent national laws regulating OPs.