BRUSSELS – With the European Union (EU) investigating X – the social media platform owned by Elon Musk – over alleged breaches of content moderation rules, the billionaire, who styles himself as a champion of free speech, may end up paying billions in fines.

Also read: EU Warns Musk Against Promoting Hate Speech

Reason? The EU could calculate fines not just on the basis of X’s turnover but on the revenues of Musk’s entire business empire, including Tesla and SpaceX.

The probe concerns alleged violations of the Digital Services Act (DSA), landmark legislation introduced by the EU in October 2022. The investigation began in December last year, and a decision is expected within months, as regulators are still probing how the platform tackles the spread of illegal content and information manipulation.

But it is not just the EU: Musk has also been dealing with authorities in Brazil, where his refusal to remove accounts associated with hate speech became global news.

In August, X was shut down in Brazil after it did not comply with orders from the country’s top court related to hate speech moderation on the platform. But X representatives have recently begun publicly signaling an intention to address the court’s demands, even though the firm had previously said it would not meet them.

AUSTRALIA JOINING FORCES AGAINST DISINFORMATION:

Earlier this month, Australia warned internet platforms that they could face fines of up to 5% of their global revenue for failing to prevent the spread of misinformation and disinformation.

The legislation targets false content that harms election integrity or public health, calls for denouncing a group or injuring a person, or risks disrupting key infrastructure or emergency services.

Australian leaders have said that foreign tech platforms are overriding the country’s sovereignty, as X removed most of its content moderation tools after Musk’s takeover.

“Misinformation and disinformation pose a serious threat to the safety and wellbeing of Australians, as well as to our democracy, society and economy,” Communications Minister Michelle Rowland said in a statement.

The Australian Communications and Media Authority said it welcomed “legislation to provide it with a formal regulatory role to combat misinformation and disinformation on digital platforms”.

Musk, in response, likened the Australian government to “fascists”.

X vs TWITTER:

Regarded as one of the most prominent and influential social media platforms despite having fewer users than Facebook and others, X has become a controversial entity. Reason? Musk bought Twitter and renamed it X – a brand name echoing SpaceX, another of his flagship ventures.

But it didn’t stop there: Musk removed restrictions on hate speech and incitement to violence. He also created a source of direct income beyond advertising by charging users for the famous blue tick through paid account upgrades.

Meanwhile, Musk opened the floodgates for the far right by dismantling the platform’s previously effective content moderation mechanisms. It is an open secret that the world’s richest person personally propagates fake news and disinformation. From suggesting the possibility of civil war in the United Kingdom during the recent anti-immigrant riots to backing Trump, his association with far-right ideology has made him one of the main sources of disinformation.

Also read: Freedom of Inciting Violence: Billionaires Shaping The World

That’s why a spokesperson for the European Commission, which is handling the investigation, said in August that it could take X’s handling of harmful content related to the recent UK riots into account.

DSA vs DISINFORMATION?

  • The legislation notes “systemic risks on society and democracy”, and says it “fully harmonizes the rules applicable to intermediary services in the internal market with the objective of ensuring a safe, predictable and trusted online environment.”

But how? By “addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate.” However, it protects the “fundamental rights” enshrined in the Charter of Fundamental Rights of the European Union.

  • The DSA mentions the issue of organized propaganda and its effects.

“When recipients of the service are presented with advertisements based on targeting techniques optimized to match their interests and potentially appeal to their vulnerabilities, this can have particularly serious negative effects. In certain cases, manipulative techniques can negatively impact entire groups and amplify societal harms, for example by contributing to disinformation campaigns or by discriminating against certain groups.”

  • It also sets out broader guidelines for social media platforms.

“Providers of very large online platforms and of very large online search engines should, in particular, assess how the design and functioning of their service, as well as the intentional and, oftentimes, coordinated manipulation and use of their services, or the systemic infringement of their terms of service, contribute to such risks.”

“Such risks may arise, for example, through the inauthentic use of the service, such as the creation of fake accounts, the use of bots or deceptive use of a service, and other automated or partially automated behaviors, which may lead to the rapid and widespread dissemination to the public of information that is illegal content or incompatible with an online platform’s or online search engine’s terms and conditions and that contributes to disinformation campaigns.”

  • When it comes to politics and public security, it identifies manipulative techniques.

“Very large online platforms or very large online search engines should ensure public access to repositories of advertisements presented on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality.”

  • As the DSA talks about crisis response mechanism for very large online platforms and very large online search engines, it notes that the European Commission “may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment.”

“Such can be the case, for example, where online platforms are misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information.”