Social media platforms hosting antisemitism? The EU can now fine them for that

While hate speech on social media is at an all-time high, with celebrities and influencers adding fuel to the flames, the European Union recently approved a new piece of legislation, along with a strict enforcement timeline, to crack down on online hate speech and offer protection from various forms of online harm and illegal activity. The Digital Services Act (DSA), which entered into force on November 16, 2022, is the EU's first and most comprehensive piece of legislation regulating online companies and platforms. Over the next few years, the world will be watching the EU to gauge the efficacy and impact of this legislation and, potentially, follow suit.

The Digital Services Act Overview

The Digital Services Act (DSA), passed in conjunction with the Digital Markets Act (DMA), serves two purposes: 

  1. To create a safer digital space in which the fundamental rights of all users of digital services are protected.
  2. To establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.

The DSA acts as a framework for online companies and platforms operating in the EU to create transparency and accountability, both of which have been sorely lacking. The guidelines set forth the measures these platforms must take in order to ensure a safe experience for all users, as well as to protect their rights as citizens. At its core, the DSA requires the monitoring for and removal of illegal goods, services, and content. Importantly, illegal content includes “illegal hate speech”, which is defined, in part, as “(a) public incitement to violence or hatred in respect of a group of persons or a member of such a group defined by reference to colour, race, religion or national or ethnic origin.” The DSA will also subject online platforms to independent audits aimed at preventing the spread of false information, attempts to manipulate elections, and attempts to harm women and children.

Along with preventing illegal content online, the DSA requires that all major platforms “set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms,” adding another layer of protection for users. Reporting will become an accessible tool, and trusted flaggers will be introduced in order to better vet illegal posts online. All of these mechanisms are aimed at empowering users and enabling them to take control of their online spaces.

What this means is that Jewish users in the European Union have the right not to be targeted with antisemitic rhetoric and hatred, and if reporting it to the platform results in inaction, they can take legal action.

What Makes the Digital Services Act Unique?  

The Digital Services Act creates heavy oversight of online platforms, especially larger companies operating in the online space. Under the DSA, the European Commission, together with national Digital Services Coordinators, will ensure that platforms comply with the new guidelines. “Very large online platforms”, defined as those with over 45 million monthly active users in the EU, must comply with even more stringent regulations because of the major influence they have on rapidly disseminating information online.

The Commission has the ability to enforce the regulations and, if need be, may impose fines of up to 6% of the global annual turnover (revenue) of a service provider. The Commission is not the only party with enforcement power: individual users who feel that they have been harmed will be able to seek compensation, in and out of court, for any damage caused by an infringement of the DSA. The addition of trusted flaggers and an intense vetting process for third parties adds a further layer of accountability.

For the first time ever, real financial consequences are at stake for major online platforms that fail to adhere to government regulations and stand by their community guidelines. They can no longer ignore the reports and complaints they receive without facing repercussions.

Standardized Regulations Make a Difference! 

Certain platforms, such as 4chan, are known to host users with fringe ideologies and hate. On those platforms, such speech is not only permitted, but also prevalent and even expected. But many haters and aggressors are not satisfied with being restricted to speaking with like-minded individuals. Some want to reach as many people as possible in order to troll and harass members of minority groups and radicalize new people to their cause. This is why they seek to abuse mainstream social media platforms, or “very large online platforms”. Recently, this became apparent during Elon Musk’s Twitter takeover, when haters, and antisemites especially, were eagerly anticipating and even coordinating the moment when they could begin spewing antisemitic vitriol on Twitter. The DSA standardizes the treatment and categorization of these platforms. Had something like the DSA been in place in the EU, the US, and other nations, the ugly spike of antisemitism that CyberWell found during the Twitter transition would likely not have happened, because Twitter would not have taken the legal and financial risk of playing around with the implementation of its community standards and policies.

Reporting Matters

An important component of the DSA, and one highly relevant for Jewish users, is the provision stating that “users will be empowered to report illegal content in an easy and effective way.” Many Jewish social media users are all too familiar with the “report this post” button on various platforms. In fact, one of CyberWell’s main strategies for improving the enforcement of social media platform policies involves rallying social media users to log in to our reporting platform and report the hate that we have collected. Unfortunately, most users are also familiar with a complete lack of response from the platforms regarding their reports, or the dreaded “this post did not violate our terms of service” response. It is rare (25% at best, more typically just 3-10%) that a reported antisemitic post ends up being removed, as numerous external reports and CyberWell’s own research show. This leads to reporting fatigue, where Jewish users feel like their reports don’t make a difference. But reporting matters! And if reporting is made easier, as the DSA requires, perhaps it will stave off reporting fatigue and empower Jewish users to keep taking action.

Why does reporting matter? Social media platforms rely heavily on users to report misconduct and only then review the reported content for potential removal. The more reports a post receives, the faster the platforms review that post. The current reporting model places the burden on the individual to report harmful content and creates a system where posts targeting minority groups and individuals stay up longer simply because fewer people report them. Unfortunately, this leaves individuals and minority communities in a situation where their reports often go ignored. Every day, hate directed against the general public, and against minority groups specifically, escapes the attention of social media review teams and remains online.

Demanding Accountability

But the DSA goes one step further – it actively works to take the onus off of the user to report hate speech and requires that social media companies monitor their own platforms, remove hateful content, and share this information transparently with researchers and the EU. We know that social media companies can monitor and remove content when they are motivated by the purse strings to do so (copyright infringement? Removed in a heartbeat). With the DSA overseeing very large platforms directly and making it possible to fine them up to 6% of their global turnover, we can expect to see social media companies actively monitoring their own platforms for hate and taking it down.

What Happens Next?

Now that the DSA has entered into force, things will be moving forward. By February 17, 2023, all companies operating online must have reported the number of active end users on their websites and platforms. The Commission will then designate which companies qualify as very large online platforms or search engines based on their user numbers, at which point those companies will have four months to comply with the various rules, regulations, and obligations that the DSA has created, including carrying out a risk assessment and providing it to the Commission. By February 17, 2024, the EU expects all online search engines and platforms to be fully compliant with the DSA.

CyberWell looks forward to seeing how social media platforms respond to the new regulations, and sincerely hopes that this is a step in the right direction. It is critical to protect minority communities from online harm. With online antisemitism on the rise globally and becoming normalized across mainstream social media, we hope that the DSA will light a fire under the social media companies to take action against the insidious, nuanced, and harmful antisemitic content that they host. 


