Important progress in one year, but some challenges remain!
Commission initiative with social media platforms and civil society shows progress
Brussels, 1 June 2017
One year ago, the European Commission and four major social media platforms announced a Code of Conduct on countering illegal online hate speech.
It included a series of commitments by Facebook, Twitter, YouTube and Microsoft to combat the spread of such content in Europe. An evaluation carried out by NGOs and public bodies in 24 Member States, released on the first anniversary of the Code of Conduct, shows that the companies have made significant progress in following up on their commitments.
Andrus Ansip, European Commission Vice President for the Digital Single Market, welcomed progress: “Working closely with the private sector and civil society to fight illegal hate speech brings results, and we will redouble our joint efforts. We are now working to ensure closer coordination between the different initiatives and forums that we have launched with online platforms. We will also bring more clarity to notice and action procedures to remove illegal content in an efficient way – while preserving freedom of speech, which is essential.”
Vĕra Jourová, EU Commissioner for Justice, Consumers and Gender Equality, said, “The results of our second evaluation of the Code of Conduct are encouraging. The companies are now removing twice as many cases of illegal hate speech, and at a faster rate, when compared to six months ago. This is an important step in the right direction and shows that a self-regulatory approach can work, if all actors do their part. At the same time, companies carry a great responsibility and need to make further progress to deliver on all the commitments. For me, it is also important that the IT companies provide better feedback to those who notified cases of illegal hate speech content.”
The European Union is founded on the values of respect for human dignity, freedom, democracy, equality, the rule of law and fundamental rights. The EU and its Member States, together with social media companies and other platforms, have a responsibility to act so that the internet does not become a safe haven for illegal hate speech and violence.
By signing the Code of Conduct, the IT companies committed in particular to reviewing the majority of valid notifications of illegal hate speech in less than 24 hours and to removing or disabling access to such content, if necessary, on the basis of national laws transposing European law. The Code also underlined the need to further discuss how to promote transparency and encourage counter and alternative narratives.
One year after its adoption, the Code of Conduct on countering illegal hate speech online has delivered some important progress, while some challenges remain:
* On average, in 59% of the cases the IT companies responded to notifications concerning illegal hate speech by removing the content. This is more than twice the level of 28% recorded six months earlier.
* The proportion of notifications reviewed within 24 hours improved from 40% to 51% over the same six-month period. Facebook is, however, the only company that fully achieves the target of reviewing the majority of notifications within the day.
* Compared with the situation six months ago, the IT companies have become better at treating notifications from citizens in the same way as those from organisations that use trusted reporter channels. Still, some differences persist, and overall removal rates remain lower when a notification originates from the public.
* Finally, the monitoring showed that while Facebook sends systematic feedback to users on how their notifications have been assessed, practices differ considerably among the IT companies. The quality of feedback explaining decisions is an area where further progress can be made.
Improvements in the handling of complaints from users and cooperation with civil society
Within the last year, the IT companies have strengthened their reporting systems and made it easier to report hate speech. They have trained their staff and they have increased their cooperation with civil society. The implementation of the Code of Conduct has strengthened and enlarged the IT companies’ network of trusted flaggers throughout Europe.
The increased cooperation with civil society organisations has led to a higher quality of notifications, more effective handling times and better results in terms of reactions to the notifications.
The Commission will continue to monitor the implementation of the Code of Conduct with the help of civil society organisations. Improvements are expected from the IT companies, in particular on the transparency of the criteria used to analyse flagged content and on feedback to users.
The Commission will take the results of this evaluation into account as part of the work announced in its mid-term review on the implementation of the Digital Single Market Strategy. The Commission will also continue its work to promote more efficient cooperation between the IT companies and national authorities.
The Framework Decision on Combating Racism and Xenophobia criminalises the public incitement to violence or hatred directed against a group of persons, or a member of such a group, defined by reference to race, colour, religion, descent or national or ethnic origin. Hate speech as defined in this Framework Decision is a criminal offence also when it occurs in the online world.
A recent European survey showed that 75% of those following or participating in debates online had come across episodes of abuse, threat or hate speech. Almost half of these respondents said that this deterred them from engaging in online discussions.
The EU and its Member States, together with social media companies and other platforms, share a collective responsibility to promote and facilitate freedom of expression throughout the online world. At the same time, all of these actors have a responsibility to ensure that the internet does not become a safe haven for violence and hatred.
To respond to the growing problem of illegal hate speech online, the European Commission and four major IT companies (Facebook, Microsoft, Twitter and YouTube) presented a “Code of conduct on countering illegal hate speech online” on 31 May 2016. On 7 December 2016, the Commission presented the results of a first monitoring exercise to evaluate the implementation of this Code of Conduct.
The mid-term review on the implementation of the Digital Single Market Strategy issued on 10 May 2017 confirmed the need to continue working towards minimum procedural requirements for the ‘notice and action’ procedures of online intermediaries, including as concerns quality criteria for notices, counter-notice procedures, reporting obligations, third-party consultation mechanisms and dispute resolution systems. In the same vein, the Commission’s proposal for a revision of the Audiovisual Media Services Directive contains strong provisions to oblige platforms to set in place a flagging system for audiovisual material containing hate speech online.
The Commission has set up several dialogues with online platforms within the Digital Single Market (e.g. EU Internet Forum, Code of Conduct on illegal online hate speech, and Memorandum of Understanding on the Sale of Counterfeit Goods over the Internet) and plans to coordinate these in a more efficient way to ensure the best possible results. These IT companies are also members of the “Alliance to better protect minors online”, a multi-stakeholder platform facilitated by the European Commission to provide a better and safer digital environment to tackle harmful content and behaviour.
These efforts, initiated by the Commission, also contribute to the action of G7 leaders, who have recently committed to supporting industry efforts and increasing engagement with civil society to combat online extremism.
Code of Conduct on countering illegal online hate speech
Brussels, 1 June 2017
What is the aim of this Code of Conduct?
Each of the IT companies (Facebook, Google, Twitter, Microsoft) that signed this Code of Conduct is committed to countering the spread of illegal hate speech online, and to having rules that ban the promotion of violence and hatred.
When they receive a request to remove content from their online platform, the IT companies will assess the request against their rules and community guidelines and, where applicable, national laws on combating racism and xenophobia. They then decide whether the content can be considered illegal online hate speech and whether it needs to be removed.
The aim of the Code is to make sure that requests to remove content are dealt with speedily. The companies have committed to reviewing the majority of these requests in less than 24 hours and to removing the content if necessary.
What is the definition of illegal hate speech?
Illegal hate speech is defined in EU law (Framework Decision on combating certain forms and expressions of racism and xenophobia by means of criminal law) as the public incitement to violence or hatred on the basis of certain characteristics, including race, colour, religion, descent and national or ethnic origin.
Will the Code of Conduct lead to censorship?
No. The Code of Conduct’s aim is to tackle online hate speech that is already illegal. The same rules apply both online and offline. Content that is illegal in the offline world should not be allowed to remain available in the online world.
The Code’s aim is also to defend the right to freedom of expression. The results of a 2016 European survey showed that 75% of those following or participating in online debates had come across episodes of abuse, threats or hate speech aimed at journalists. Nearly half of these people said that this deterred them from engaging in online discussions. These results show that illegal hate speech should be effectively removed from social media, as it might otherwise limit people’s right to express themselves freely.
Isn’t it for courts to decide what is illegal?
Yes, interpreting the law is and remains the responsibility of national courts. At the same time, IT companies have to act in line with national laws, in particular those transposing the Framework Decision on combating racism and xenophobia and the 2000 e-Commerce Directive.
When they receive a valid alert about content allegedly containing illegal hate speech, the IT companies have to assess it, not only against their rules and community guidelines, but, where necessary, against applicable national law (including that implementing EU law), which fully complies with the principle of freedom of expression.
Should one take down ‘I hate you’?
Not every offensive or controversial statement or content is illegal. As the European Court of Human Rights said, ‘freedom of expression … is applicable not only to “information” or “ideas” that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population’.
In the Code, both the IT Companies and the European Commission also stress the need to defend the right to freedom of expression.
Assessing what could be illegal hate speech includes taking into account criteria such as the purpose and context of the expression. The expression ‘I hate you’ would not appear to qualify as illegal hate speech, unless combined with other statements, for example threats of violence, that refer to race, colour, religion, descent or national or ethnic origin.
What prevents government abuse?
The Code of Conduct is a voluntary commitment made by Facebook, Twitter, YouTube and Microsoft. It is not a legal document and does not give governments the right to take down content.
The Code cannot be used to make these IT Companies take down content that does not count as illegal hate speech, or any type of speech that is protected by the right to freedom of expression set out in the EU Charter of Fundamental Rights.
How did the Commission evaluate the implementation of the Code of Conduct?
The Code of Conduct is evaluated through a monitoring exercise set up in collaboration with a network of civil society organisations located in different EU countries. Using a commonly agreed methodology, these organisations test how the IT companies apply the Code of Conduct in practice. They do this by regularly sending the four IT companies requests to remove content from their online platforms. The organisations participating in the monitoring exercise record how their requests are handled: how long it takes the IT companies to assess a request, how the IT companies respond to it, and the feedback they receive from the IT companies.
For more information: