OVERVIEW
The Pakistan Tehreek-e-Insaf (PTI) held a demonstration in Islamabad on November 24, 2024, calling for the release of former Prime Minister Imran Khan. The demonstration turned violent when protesters broke into the extremely sensitive Red Zone area of D-Chowk, forcing police to use tear gas to scatter the crowd. The event attracted significant attention because of the political stakes involved, as well as the extensive use of propaganda and false information to sway public opinion. False narratives about the demonstration were actively spread by prominent PTI figures and associated social media accounts.
Claims ranged from exaggerated turnout figures to manipulated visuals portraying police brutality and even AI-generated content aimed at maligning state institutions. These initiatives aimed to undermine national security and trust in democratic processes by misleading the people, inciting unrest, and portraying the state as totalitarian.
The deliberate dissemination of disinformation demonstrates how propaganda is increasingly being used as a political instrument to influence public opinion. These strategies pose serious threats to national interests and societal cohesiveness by exploiting the vulnerabilities of an increasingly digital public sphere. As Pakistan struggles with this growing wave of digital disinformation and misinformation, the trend underscores the urgent need for strong fact-checking procedures and critical media literacy to protect against manipulation.
In the succeeding paragraphs, several disinformation campaigns and claims run and supported by the key leadership and supporters of Pakistan Tehreek-e-Insaf, and used to propagate hatred against the state and armed forces, have been fact-checked.
DISINFORMATION CASE I – DEATH OF PTI WORKER
Claim
This section investigates the widely circulated fake news that PTI worker Tahir Abbas Tarar died at the hands of Rangers while praying atop a container.
Fact-Check
A video on social media claiming that Rangers dragged Tahir Abbas Tarar down from a container, causing his death, attracted mass outrage and accusations of civilian killings against state institutions. PTI’s key leadership and official social media accounts amplified this claim, alleging that Tahir Abbas Tarar was killed on the spot. The incident sparked hashtags against state institutions, and false reports of the abduction of Tahir’s family further intensified the narrative.
In reality, Tahir Abbas Tarar was not only alive but also met Khyber Pakhtunkhwa Chief Minister Ali Amin Gandapur, who visited to ask about his health. Pakistani news outlets also reported this visit.
Multiple social media posts confirmed that Tahir Abbas Tarar was alive, including one from the PTI MNA from Mandi Bahauddin.
Simultaneously, another claim widely circulated on social media platforms was that Tahir Abbas Tarar’s family members had been abducted. An Instagram user posted a video making similar claims.
On the contrary, during his interview and meeting with the KP Chief Minister, Tahir did not mention any abduction of his family members, which nullifies the false claims propagated on social media.
To verify the authenticity of the viral claim, the video alleging Tahir’s death was closely analysed frame by frame using reverse image and video verification techniques to identify the original source. Social media accounts that propagated the claim were identified, and their posting patterns were analysed to detect coordinated efforts; minimal sketches of both steps follow.
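As an illustration of the frame-by-frame step, the sketch below samples frames from a locally saved copy of the clip and computes perceptual hashes that can be compared with candidate source footage or submitted to reverse-search tools. This is a minimal example under stated assumptions, not the report’s actual tooling; the file name viral_clip.mp4 and the sampling interval are placeholders.

```python
# Minimal sketch: sample frames from a video and compute perceptual hashes.
# Assumes the viral clip has been saved locally as viral_clip.mp4.
import cv2                      # pip install opencv-python
import imagehash                # pip install ImageHash
from PIL import Image

def keyframe_hashes(video_path: str, every_n_frames: int = 30):
    """Return (frame_index, perceptual_hash) pairs sampled from the video."""
    cap = cv2.VideoCapture(video_path)
    hashes = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            # OpenCV decodes frames in BGR order; convert before handing to PIL.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            hashes.append((index, imagehash.phash(Image.fromarray(rgb))))
        index += 1
    cap.release()
    return hashes

if __name__ == "__main__":
    for idx, h in keyframe_hashes("viral_clip.mp4"):
        print(idx, h)   # hashes can be stored and compared across clips
```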
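The posting-pattern check can be sketched in a similar spirit: given a table of posts pushing the claim, windows in which many distinct accounts publish almost simultaneously are flagged for manual review as possible coordination. This is a hypothetical illustration; the file posts.csv, its column names, and the five-minute window size are assumptions, not the report’s methodology.

```python
# Hypothetical sketch: flag time windows with unusually many distinct accounts
# posting the same claim, a common indicator of coordinated amplification.
import pandas as pd

posts = pd.read_csv("posts.csv", parse_dates=["timestamp"])  # assumed columns: account, timestamp

# Bucket posts into 5-minute windows and count distinct accounts per window.
posts["window"] = posts["timestamp"].dt.floor("5min")
bursts = (
    posts.groupby("window")["account"]
    .nunique()
    .sort_values(ascending=False)
)

# Windows at the top of this ranking are candidates for manual review.
print(bursts.head(10))
```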
Verdict
The claim that Tahir Abbas Tarar was killed by Rangers during a protest is false, as evidence shows he survived and was treated for minor injuries. The video was shared out of context to provoke public outrage.
Conclusion
The incident highlights the destructive power of disinformation, which exploits delicate situations to attack institutions, polarize society, provoke unrest, and reduce public confidence. It emphasizes the need to refute these myths to maintain social harmony and prevent political manipulation of popular sentiment.
DISINFORMATION CASE II – VIDEO FROM INDIA FALSELY ATTRIBUTED TO PTI PROTEST
Claim
A video circulated online after Pakistan Tehreek-e-Insaf’s protest rally, claiming to show hundreds of vehicles moving towards Islamabad.
Fact-Check
Reverse image searches were performed on key frames from the viral video, which led to its original source. It was found that the video widely circulated by PTI accounts is originally from a political rally in India organised by the All India Majlis-e-Ittehadul Muslimeen (AIMIM) and was initially posted on Instagram in September.
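As a hedged illustration of how such a match can be confirmed once a candidate source surfaces, the sketch below compares perceptual hashes of a frame from the viral clip and a frame from the suspected original; a small Hamming distance indicates the same footage even after recompression. The file names are placeholders.

```python
# Minimal sketch: confirm that a viral frame matches a candidate source frame.
import imagehash            # pip install ImageHash
from PIL import Image

viral = imagehash.phash(Image.open("viral_frame.jpg"))
candidate = imagehash.phash(Image.open("aimim_rally_frame.jpg"))

distance = viral - candidate    # Hamming distance between the two 64-bit hashes
print(f"Hash distance: {distance}")
# Distances of roughly 0-10 usually indicate the same footage despite
# recompression or minor cropping; larger values suggest no match.
```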
Verdict
The claim that the viral video depicted the Islamabad protest on November 25, 2024, is False. The video was traced to an AIMIM rally in India and has no connection to the PTI protests.
Conclusion
Public perception can be distorted by irrelevant content that is mistakenly linked to an event, which fuels disinformation. Verifying content is essential during sensitive times to avoid manipulation.
DISINFORMATION CASE III – VIRAL VIDEO FROM EAST TIMOR FALSELY CLAIMED TO SHOW PTI PROTESTERS MARCHING TO ISLAMABAD
Claim
A video circulating online claimed to depict thousands of demonstrators marching towards Islamabad in support of jailed former Prime Minister Imran Khan.
Fact-Check
A reverse image search was conducted on key frames, which revealed that the video is from a Christian event held by Pope Francis in East Timor in September 2024, where thousands of worshippers were seen exiting the venue. The footage has no connection to protests in Pakistan.
Verdict
The claim that the viral video shows PTI supporters marching to Islamabad is False.
Conclusion
The incident underscores the need for critical evaluation of viral content, particularly during politically charged times. To combat such misinformation, it is crucial to rely on credible sources and verify content before sharing.
DISINFORMATION CASE IV – AI-GENERATED BLOOD-COVERED IMAGE OF ISLAMABAD ALLEGEDLY PORTRAYING AN UNCOUNTED NUMBER OF DEATHS
Claim
Pakistani journalist @ImranRiazKhan shared an image on X of a blood-covered thoroughfare in Islamabad, claiming it depicted authorities washing the site after an alleged PTI protest massacre. The post sparked public outrage and allegations of state violence against dissenters. However, the image was later confirmed to be AI-generated, lacking authenticity and not aligning with credible accounts or visual evidence.
Another social media user shared the same picture with the following caption: “At least more than 100 people have been martyred. Now the bodies are being hidden. Want to bring on record that 33 dead bodies were brought to only one hospital. Another 27. The rest of the numbers are coming.
Hitler Asim Munir Should Resign
#IslamabadMassacre”
Fact-Check
This claim was fact-checked to counter disinformation and assess the accuracy of the viral image. The spread of AI-generated visuals poses significant risks of inciting public unrest and creating a false perception of events.
Verification Steps:
· Reverse image searches were conducted to trace the origin of the image.
· Metadata analysis revealed the image’s creation using AI-based tools (a sketch of this check follows the list).
· Comparisons with news reports, live footage, and verified photos from D-Chowk were conducted.
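A minimal sketch of the metadata check is given below. Many AI image tools write a generator or software tag into EXIF fields or PNG text chunks; the absence of such a tag proves nothing (metadata is easily stripped), but its presence is a strong signal. The file name suspect_image.png is a placeholder, and this is an illustration rather than the report’s actual workflow.

```python
# Minimal sketch: inspect image metadata for traces of AI generation tools.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("suspect_image.png")   # placeholder file name

# PNG text chunks, if any (Pillow exposes them on PNG images via .text)
for key, value in getattr(img, "text", {}).items():
    print("PNG chunk:", key, "=", value)

# EXIF fields, if present
exif = img.getexif()
for tag_id, value in exif.items():
    print("EXIF:", TAGS.get(tag_id, tag_id), "=", value)
```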
Verdict
The claim that the image shows blood-covered streets in Islamabad after a crackdown on PTI protesters is AI-Generated Disinformation. The image was AI-generated, displaying typical signs of artificial creation, including inconsistent textures and unnatural details.
Conclusion
The case underscores the dangers of AI-generated content spreading misinformation, emphasizing the need to verify the authenticity of viral content during political turmoil to prevent manipulation of public opinion. Even information coming from seemingly trusted figures can be deceptive.
DISINFORMATION CASE V – MISLEADING IMAGE CLAIMING TO SHOW DEAD BODIES AFTER ISLAMABAD PROTEST
Claim
A social media user, @Sufisal, shared an image claiming dead bodies were loaded into an ambulance after a security operation against Pakistan Tehreek-e-Insaf protesters in Islamabad.
Fact-Check
A reverse image search revealed that the original post was made on November 1, 2024, by the journalist Vanessa Beeley, who was covering the Gaza conflict.
Verdict
The claim that the image shows fatalities from the Islamabad protests on November 26 is False and Misleading. The image is from Gaza, taken after an Israeli airstrike on a Palestinian refugee camp. It was shared on social media weeks before the Islamabad protests.
Conclusion
This incident underscores the dangers of misinformation during sensitive political times. The misuse of images from unrelated conflicts to promote false narratives can manipulate public emotions and incite unnecessary unrest.
DISINFORMATION CASE VI – MISLEADING VIDEO ALLEGEDLY DEPICTING THE ISLAMABAD PROTEST’S AFTERMATH
Claim
A 25-second video claiming to depict a security operation against Pakistan Tehreek-i-Insaf protesters in Islamabad went viral, causing public outrage and allegations of state-sponsored violence.
Fact-Check
A reverse video search was performed to trace the origin of the clip, and verified news reports from 2019 covering the Nankana Sahib incident were consulted. Pakistani mainstream media had reported clashes between two groups in Nankana Sahib in 2019, which resulted in the deaths of four people.
Verdict
The claim that the video shows the aftermath of the Islamabad protests is Misleading. The video is from a violent clash in Nankana Sahib, Punjab, in 2019. It has no connection to the November 2024 protests in Islamabad.
Conclusion
This case highlights the prevalence of misattributed and outdated content used to manipulate public opinion during politically sensitive times. The deliberate use of an unrelated video to spread disinformation underscores the importance of media literacy.
DISINFORMATION CASE VII – AI-GENERATED IMAGE MISATTRIBUTED TO ISLAMABAD PROTEST
Claim
The Pakistan Tehreek-e-Insaf (PTI) shared an image on Instagram claiming to show a violent crackdown on PTI protesters in Islamabad, causing outrage and condemnation.
Fact-Check
To verify its authenticity, the shared image was compared with real photographs of Jinnah Avenue in Islamabad, revealing significant inconsistencies. The road structure, building alignment, and overall layout do not match the actual location. Moreover, the building rooftops on both sides exhibit distorted angles and irregular construction, characteristic of AI errors.
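One way to make the location comparison concrete, sketched below under assumptions, is to resize the shared image and a reference photograph of Jinnah Avenue to the same dimensions and compute a structural similarity score; a genuine photo of the same spot scores markedly higher than an AI rendering with mismatched road and building geometry. The file names are placeholders, and the report itself relied on visual comparison rather than this metric.

```python
# Hedged sketch: quantify how poorly the shared image matches a real reference photo.
import cv2                                          # pip install opencv-python
from skimage.metrics import structural_similarity   # pip install scikit-image

shared = cv2.imread("shared_image.jpg", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("jinnah_avenue_reference.jpg", cv2.IMREAD_GRAYSCALE)

# Resize the reference to the shared image's dimensions for a like-for-like score.
reference = cv2.resize(reference, (shared.shape[1], shared.shape[0]))

score, _ = structural_similarity(shared, reference, full=True)
print(f"SSIM: {score:.3f}")   # a low score supports the "not this location" finding
```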
Verdict
The claim that the image shows bloodshed on Jinnah Avenue during the Islamabad protests is AI-Generated Disinformation. The image is AI-generated and bears no connection to the protests or any real-life event.
Conclusion
The use of AI-generated imagery to fabricate evidence of violence undermines the credibility of legitimate protests and misleads the public. Such disinformation can provoke unnecessary fear and hostility, further destabilising an already tense political climate.
DISINFORMATION CASE VIII – OLD VIDEO OF DOCTOR CRITICIZING STATE ACTIONS
Claim
A video of a doctor visibly emotional while speaking about government actions was widely shared on X (formerly Twitter). The post falsely connected the video to the crackdown on PTI protesters in Islamabad, claiming the doctor was lamenting alleged state brutality. With over 282,000 views and 2,100 reposts, the video spread rapidly across various platforms.
Fact-Check
A reverse image search traced the video to a YouTube post from April 18, 2021, by a TLP-affiliated channel. The video was filmed during violent clashes between the government and TLP supporters at Lahore’s Rehmat-ul-Lil Alameen Mosque, related to the group’s demand for the expulsion of the French ambassador.
The doctor in the video criticized the then Prime Minister Imran Khan, directly saying, “This is not the state of Madina. Imran Khan, do not misuse [Holy Prophet] Muhammad’s (PBUH) name.” The remarks clearly indicate that the video predates the PTI protests and is unrelated to the recent crackdown.
The video was originally uploaded to YouTube more than three years earlier and had only 148 views at the time of reporting.
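For readers who want to reproduce the dating step, the hedged sketch below shows one way to pull the upload date and view count of a YouTube post programmatically with the yt-dlp library, without downloading the video. The URL is a placeholder, not the actual source link.

```python
# Minimal sketch: read a YouTube video's upload date and view count with yt-dlp.
from yt_dlp import YoutubeDL    # pip install yt-dlp

url = "https://www.youtube.com/watch?v=PLACEHOLDER"   # placeholder URL

with YoutubeDL({"skip_download": True, "quiet": True}) as ydl:
    info = ydl.extract_info(url, download=False)

print("Uploaded:", info.get("upload_date"))   # e.g. '20210418'
print("Views:", info.get("view_count"))
```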
Verdict
The claim that the video of the doctor relates to the PTI protest crackdown in Islamabad is Misleading. The footage is from April 2021 and concerns a completely unrelated incident involving TLP protests in Lahore.
Conclusion
Disinformation, such as misattributing older, unrelated content to current events, can exacerbate societal tensions and mislead the public. The video in question was weaponised to falsely depict state brutality against PTI protesters, although it had no connection to the recent political unrest.
DISINFORMATION CASE IX – MISLEADING CLAIMS OF CASUALTIES AND USE OF FORCE DURING PTI PROTEST
Claim
PTI leadership and social media spread unconfirmed death tolls, claiming security forces killed up to 400 civilians. These accusations were fuelled by emotional descriptions of state brutality, but eyewitness reports and hospital statements refuted the claims, stating that no shots were fired and that non-lethal tactics were used.
A social media post claiming casualties as high as 400 gained more than 687,000 views and was shared more than 20,000 times.
Fact-Check
Another post, from political figure Dr. Shahbaz Gill, claiming 100 casualties was shared more than 15,000 times. This post alone had more than one million views.
The official Pakistan Tehreek-e-Insaf account also misled the general public regarding the number of casualties.
Confusion increased when a video from Panjgur of a car being fired upon was mistakenly linked to the Islamabad demonstration.
In reality, this video was recorded in the Panjgur district of Balochistan on the night of October 29, 2024, following an assassination attempt by a terrorist organization on a Baloch citizen, Zahir Shambezai. The video was initially posted on TikTok on October 29.
Statements from the Pakistan Institute of Medical Sciences (PIMS) and the Federal Government Polyclinic were reviewed to ascertain the number and cause of injuries reported during the protest. Official statements from both hospitals refuted the false claims of casualties and of direct fire on protesters.
Verdict
The claim that 400 or 100 civilians were killed by direct gunfire from security forces during the protest is Disinformation. Hospital reports from PIMS and the Federal Government Polyclinic confirmed that no gunshot injuries or deaths were recorded. Additionally:
- The video from Panjgur was proven unrelated to the Islamabad protest and was linked to a separate incident.
- No evidence of direct fire on civilians was found. Security forces relied exclusively on tear-gas shelling to manage the protest.
Conclusion
The circulation of unverified claims about casualties and the use of lethal force underscores the dangers of disinformation in politically charged environments. These narratives, aimed at undermining trust in state institutions, reflect the growing weaponization of fake news for political agendas.
DISINFORMATION CASE X – OLD VIDEO CLAIMED TO SHOW PTI WORKERS SWIMMING TO ISLAMABAD DUE TO ROAD BLOCKAGES
Claim
A video circulating on social media allegedly showed PTI workers swimming across a river to evade roadblocks and reach Islamabad for the party’s protest, but fact-checking revealed that it is unrelated to the PTI protest and dates back to September 2023.
Fact-Check
The video actually shows a drug-awareness initiative conducted under the PTI flag at an event in Swabi, Pakistan, in September 2023. The individual leading the activity, Zahid, made political statements supporting PTI, including calls for Imran Khan’s release and criticism of inflation. The video’s original context was stripped away, misleading viewers.
Verdict
The claim that the video shows PTI workers swimming across a river to bypass roadblocks and reach Islamabad for the November 2024 protest is False. The footage is from a drug awareness activity conducted in Swabi in September 2023.
Conclusion
The video was repurposed to amplify the narrative of widespread and unconventional efforts by PTI supporters during the November 2024 protests. However, its original context debunks such claims, revealing the video as unrelated to the current political scenario.
DISINFORMATION CASE XI – AI-GENERATED VIDEO MAKING FALSE CLAIMS ABOUT PAKISTAN ARMY CHIEF’S VISIT TO SAUDI ARABIA
Claim
In November 2024, Pakistan Army Chief General Syed Asim Munir allegedly touched Saudi Crown Prince Mohammed bin Salman’s knee in a video widely shared on social media platforms TikTok and X, which was later confirmed to be AI-generated content.
Fact-Check
The video was initially uploaded by a TikTok account named “Bilal AI,” explicitly labelled as AI-generated and satire. The account’s bio mentions that it creates AI-generated content.
Despite the clear labelling on TikTok, the video gained traction on other social media platforms, amassing over 370,000 views. The official visit, which occurred on November 6, focused on discussions related to regional peace, defence, security cooperation, and enhancing bilateral relations. Both sides emphasised strategic cooperation during the meeting.
Verdict
The claim that Pakistan’s Army Chief General Syed Asim Munir touched Saudi Crown Prince Mohammed bin Salman’s knee during their meeting is AI-Generated. The video circulating this claim is an AI-generated creation, as verified by its original uploader.
Conclusion
The incident highlights the dangers of AI-generated media in spreading misinformation, particularly when used to malign public figures or undermine diplomatic relations. The video, initially satirical, was later weaponized to stir controversy, harming reputations and destabilizing public trust.
DISINFORMATION CASE XII – PTI FALSE CLAIMS ABOUT PRESENCE OF SNIPERS DURING PROTEST
Claim
Following PTI protests, videos emerged online claiming snipers were targeting protesters. Qasim Suri, a PTI figure, posted a video of five people on a rooftop labelled as snipers. Another user shared a separate video claiming snipers were on Centaurus Mall. However, these claims are false and misleading.
Another X user claimed, “This video was also made by me; snipers were operating from each building, not one building
#Islamabad_Massacre #IslamabadMassacre”
Fact-Check
Keyframe review and zoom-in analysis were undertaken to identify key features, with close examination of body posture, clothing, equipment, and behaviour (a sketch of the zoom-in step appears after the analysis below). Standard characteristics of sniper deployment in urban conflict zones were also researched to understand typical patterns of operation.
Video Content Review:
● In the first video, the individuals on the rooftop appear to be civilians, not armed personnel.
● One individual is visibly older (approximately 50 years of age) and holding a smartphone, likely recording the protests below.
● None of the individuals display military uniforms, tactical gear, or weapons such as sniper rifles or rangefinders.
● In the second video of the Centaurus Mall, the figures are similarly unarmed, casually dressed, and engaged in observation or recording.
Snipers are highly trained, discreet operatives who blend into their surroundings, often using ghillie suits or urban cover. They use concealed vantage points and carry long-range rifles with scopes, with spotters using binoculars or rangefinders. They do not operate in civilian attire, especially in high-stakes environments. No rangefinders, which often appear as small binoculars or handheld gadgets, are visible in either video.
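The zoom-in step referenced above can be sketched as cropping a region of interest from a key frame and upscaling it so clothing, posture, and hand-held objects can be examined more closely. The coordinates and file names below are placeholder values chosen for illustration, not taken from the actual analysis.

```python
# Minimal sketch: crop and upscale a rooftop region of interest for inspection.
import cv2   # pip install opencv-python

frame = cv2.imread("rooftop_frame.jpg")   # placeholder key frame

# Region of interest around the figures (x, y, width, height) - placeholder values.
x, y, w, h = 620, 140, 200, 160
roi = frame[y:y + h, x:x + w]

# Upscale 4x with Lanczos interpolation to preserve edge detail.
zoomed = cv2.resize(roi, None, fx=4, fy=4, interpolation=cv2.INTER_LANCZOS4)
cv2.imwrite("rooftop_zoom.png", zoomed)
```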
Verdict
The claim that snipers were stationed on rooftops during the PTI protests in Islamabad is Disinformation. The individuals in the videos were civilians.
Conclusion
The spread of such misinformation can have serious consequences, including increased hostility towards state institutions and unwarranted panic among protestors. It is essential for the public to critically analyse such claims and rely on verified information.
DISINFORMATION CASE XIII – MISLEADING VIDEO CLAIMING STATE VIOLENCE AGAINST PTI PROTESTERS
Claim
A video depicting a broken windshield and two allegedly dead men was widely circulated on X during the PTI protests in Islamabad. The video was backed by PTI supporters and a fan account of journalist Imran Riaz Khan. However, an investigation revealed that the video had unreliable origins, was not filmed at D-Chowk, and contained unrelated cinematic elements. The tweet was shared more than 8,000 times and had more than 251,800 views.
Fact-Check
The exact origin of the video could not be verified. However, its location does not match D-Chowk, where the main protests occurred. The cinematic quality of the video, with deliberate camera movements and angles, suggests it may have been staged or unrelated to real events. Unlike genuine protest footage, the video lacked the chaotic and spontaneous elements typically seen in such scenarios. The video was not captured by CCTV but was shared by an Imran Riaz Khan fan account, known for amplifying pro-PTI narratives.
Verdict
The claim that the video shows state-led killings of PTI protesters is False and Misleading. The video is not from the protests at D-Chowk and appears to be a movie scene shared out of context to align with the anti-state narrative.
Conclusion
The circulation of this video highlights the importance of verifying content before dissemination, especially during politically charged events. Such misleading visuals can incite unrest and damage trust in institutions.
Lessons and Call to Action
The speed at which false information spreads on delicate political occasions, such as the PTI protests, highlights the risks that fake news poses in influencing public opinion and raising tensions. Misinformation operations use technology and human emotion to sow disorder, from AI-generated photos of blood-soaked streets to irrelevant videos falsely linked to protest violence. The use of repurposed content from other events (such as media from Gaza or previous TLP protests), doctored images, and claims of snipers attacking people demonstrate how narratives may be twisted to undermine public confidence in state institutions, mislead the public, and spark unrest. This emphasises how important it is to be vigilant and consume media responsibly during turbulent times.
AI-generated content has added a new dimension to misinformation, making it more believable and more difficult to identify. Fact-checking efforts are often outpaced by the rapid spread of fake news, undermining democratic processes and causing fear and divisiveness. Strong systems are needed to combat online misinformation and teach people how to distinguish reliable sources from fake ones. To prevent the spread of lies, citizens, journalists, and online media must work together to promote a culture of critical enquiry. Verifying sources is a fundamental step in ensuring a better-informed and more resilient society.