Deliberately misleading or false information spread to deceive is known as disinformation. It has been studied extensively in fields such as psychology, sociology, political science, and artificial intelligence.

Disinformation exploits cognitive biases, making people believe falsehoods that align with their views while rejecting contradictory facts.

Researchers have examined its spread, the cognitive biases that make people susceptible to it, and its societal consequences, and have developed countermeasures to reduce its impact. Understanding the science of disinformation is essential to cultivating an educated society that can differentiate between truth and sophistry.

Disinformation exploits the cognitive biases and psychological vulnerabilities inherent in human thinking. Confirmation bias is one of the most powerful: it makes people more likely to accept false information that fits their existing views than accurate information that does not. Moreover, the illusory truth effect shows that false information is more likely to be believed simply because it has been encountered repeatedly.

Emotional manipulation is another psychological mechanism. Disinformation is often constructed to provoke strong emotional responses, such as fear, anger, and outrage, that override rational analysis. When emotions are heightened, critical thinking diminishes, and individuals become more willing to accept and share false narratives. This is why disinformation campaigns rely on sensational headlines, emotionally charged language, and polarizing content.

In the digital age, disinformation spreads farther and faster than ever through social media and online platforms. Algorithms designed to maximize user engagement reward sensational content, and in doing so often amplify disinformation over factual reporting. Echo chambers and filter bubbles compound the problem by reinforcing existing biases and limiting exposure to differing points of view.

Furthermore, bots and artificial intelligence have been weaponized to spread disinformation on an unprecedented scale. Automated accounts can fabricate engagement to simulate trends, amplifying false narratives or ideological messages. Deepfake technology, which produces hyper-realistic manipulated audio and video, has further blurred the line between fact and fiction, making information increasingly difficult to verify.

Social media algorithms amplify disinformation, prioritizing sensational content over factual reporting, fueling polarization and echo chambers.

Disinformation is not merely an individual cognitive problem; it carries consequences for the wider society. It has been used as a tool of propaganda to sway elections, destabilize governments, and manipulate public opinion. State-sponsored disinformation campaigns have targeted populations to sow discord, create chaos and confusion, and weaken trust in institutions. Information warfare has become a tactic employed by both state and non-state actors to achieve geopolitical goals.

Beyond politics, disinformation carries grave public health consequences. The dissemination of false information about vaccines, medical treatments, and pandemics has put lives at risk, eroded public trust in scientific institutions, and fueled conspiracy theories.

Addressing disinformation demands a multi-pronged effort spanning education, technological innovation, policy and legislation, and personal responsibility. Media literacy programs are among the most effective responses: they help individuals critically examine information sources, recognize manipulation tactics, and verify facts before sharing content.

Social media platforms have also adopted fact-checking systems and labels for misleading information. Nevertheless, these measures are frequently challenged over fears of censorship and the erosion of free speech; curtailing disinformation without limiting free expression remains difficult.

Meanwhile, governments and international organizations have been developing regulatory frameworks on disinformation, particularly concerning political advertising, foreign influence operations, and the misuse of artificial intelligence to spread falsehoods. Legislative efforts seek to hold platforms and individuals accountable for spreading harmful disinformation while upholding democratic principles.

Disinformation is a tool of information warfare, used to manipulate public opinion, disrupt democracies, and weaken trust in institutions.

The methods of disinformation will continue to evolve alongside technology. Synthetic media, AI-generated propaganda, and more advanced psychological manipulation pose new challenges. As disinformation grows more sophisticated, countering it will demand equal sophistication: AI-assisted fact-checking, more transparent algorithms, and greater international cooperation against information warfare.

Additionally, the responsibility for mitigating disinformation does not fall exclusively on institutions and governments. Individuals must cultivate habits of critical thinking, digital skepticism, and responsible information sharing. Only through awareness and vigilance can the propagation of deceptive narratives be prevented.

Understanding disinformation requires grasping the psychology and technology involved and, most importantly in my view, the societal dynamics that sustain the spread of false information. For democratic societies to protect themselves and maintain an informed public, citizens must understand how disinformation works, why people fall for it, and what can be done to resist it.

Combating disinformation requires media literacy, AI-driven fact-checking, regulatory frameworks, and responsible information sharing.

Although technology has created the problem, it also offers solutions that, combined with proper education and policy measures, can reduce it. With misinformation spreading faster than truth, knowledge and critical thinking remain our best defenses.

Disclaimer: The opinions expressed in this article are solely those of the author. They do not represent the views, beliefs, or policies of the Stratheia.

Author

  • Dr. Ghulam Mujaddid

    Dr. Mujaddid is an Associate Professor at Muslim Youth University Rawalpindi. He holds three master's degrees and a PhD in Strategic Studies, and served as a commissioned officer in the Pakistan Air Force for 33 years.