
Wartime deepfakes are the new face of propaganda. Can we still trust our eyes?

New study tries to make sense of the evolving world of deepfake misinformation in wartime news.

by Tibi Puiu
October 25, 2023
in Future, News, Psychology
Edited and reviewed by Zoe Gordon
Credit: AI-generated, DALL-E 3.

Deepfakes — videos and voice recordings that have been manipulated by AI to impersonate real people — are the next iteration in online misinformation. This highly convincing and realistic doctored footage can be easily abused to impersonate politicians and celebrities, extract money from gullible people in elaborate hoaxes and cons, and socially target women with pornographic deepfakes.

This is still a novel technology, and we've yet to see the full scope of its impact on society. In a new study, researchers at University College Cork in Ireland have explored the implications of deepfakes during wartime, as seen in the Russo-Ukrainian conflict. The findings bring to light concerns about trust, misinformation, and the very nature of truth.

The new propaganda frontier

Deepfakes are advanced digital forgeries created using artificial intelligence, particularly deep learning techniques. By training on vast amounts of data, these algorithms can generate eerily realistic video or audio recordings of real people saying or doing things they never actually did. The term “deep” refers to the deep neural networks used in their creation. While the technology has impressive, legitimate applications — such as in filmmaking, video game design, and voice synthesis — it also poses ethical and security concerns.
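
For readers curious about the mechanics, below is a minimal sketch, in Python with PyTorch, of the shared-encoder, dual-decoder autoencoder design that early face-swap tools popularized. Everything in it, from the 64x64 image size to the layer widths to the random placeholder "faces", is an illustrative assumption rather than any particular tool's code, and modern deepfake systems are considerably more sophisticated.

# Minimal sketch (illustrative assumptions throughout): the shared-encoder /
# dual-decoder autoencoder idea behind early face-swap deepfakes.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB face crops (assumed size)

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(IMG, 512), nn.ReLU(), nn.Linear(512, 128))

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG), nn.Sigmoid())

    def forward(self, z):
        return self.net(z)

encoder = Encoder()    # shared: learns face structure common to both people
decoder_a = Decoder()  # learns to reconstruct person A
decoder_b = Decoder()  # learns to reconstruct person B

params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
optimizer = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.MSELoss()

faces_a = torch.rand(16, IMG)  # placeholder batches; real training uses many face crops
faces_b = torch.rand(16, IMG)

for step in range(3):  # a handful of steps just to show the training loop
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()

# The swap: encode a frame of person A, then decode it with B's decoder,
# producing a B-looking face that copies A's pose and expression.
fake_b = decoder_b(encoder(faces_a[:1]))

Because the encoder is shared, it learns pose and expression largely independently of identity, while each decoder fills in one person's appearance, which is what makes the swap possible.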

There is widespread anxiety among AI ethicists that the technology could make it increasingly difficult to tell what’s real among a glut of convincing fake news. And out of all possible scenarios, it is during times of war that deepfakes are perhaps the most concerning.

In early 2022, not long after Russia launched its full-scale invasion of Ukraine, a fake video of Ukrainian President Volodymyr Zelensky started circulating on social media and hacked Ukrainian news websites. The video showed Zelensky appearing to tell his soldiers to lay down their arms and surrender.

A deepfake of Ukrainian President Volodymyr Zelensky calling on his soldiers to lay down their weapons was reportedly uploaded to a hacked Ukrainian news website today, per @Shayan86 pic.twitter.com/tXLrYECGY4

— Mikael Thalen (@MikaelThalen) March 16, 2022

Later that year, the mayors of several European capitals were embarrassingly duped into holding video calls with a deepfake of their counterpart in Kyiv, Vitali Klitschko. Around 15 minutes into the video conferences, the fake but convincing Klitschko started claiming that Ukrainian refugees were cheating the German state out of social benefits and appealed to the mayors to send Ukrainian refugees back for military service.

“There were no signs that the video conference call wasn’t being held with a real person,” the office of the mayor of Berlin, Franziska Giffey, said in a statement.

But it isn't just Russia that is weaponizing deepfakes for propaganda purposes. Two can play that game. In June 2023, Ukrainian hackers broadcast a fake emergency message on several Russian radio and television stations, showing a nervous President Vladimir Putin declaring martial law after Ukrainian troops crossed the border into Russian territory. And early in the war, Ukraine used a combination of video game footage and deepfake images to manufacture the myth of the 'Ghost of Kyiv', a supposed ace pilot who downed more than 40 Russian jets before dying heroically in battle.

In #Russia, several radio stations and even local TV networks appear to have been hacked to broadcast a deep fake address allegedly by president Putin.

This fake address announced mass mobilisation and introduced martial law in border regions.

1/2 pic.twitter.com/Z79Jqjil6W

— Alex Kokcharov (@AlexKokcharov) June 5, 2023

Researchers from University College Cork set out to explore the impact of deepfakes in wartime scenarios. It is the first study of its kind. By analyzing nearly 5,000 tweets from X (previously known as Twitter) during the first seven months of 2022, the team sought to understand public reactions to these digital deceptions.

“This research is important because there is very limited empirical research on how mis/disinformation deepfakes are impacting social media already. When we talk about deepfakes we often choose to focus on the future harm/benefits rather than looking at how deepfakes are impacting our online spaces now. Our research shows how the potential for deepfakes in conflict has been in some ways realized during the Russo-Ukrainian war,” John Twomey, a psychologist at University College Cork and lead author of the study, told ZME Science.

The team qualitatively assessed the tweets using thematic analysis, tagging each tweet and looking for commonalities between them. "This method suits analyzing real-world textual data as it can be used to gain a good critical idea of what they contain," Twomey said.
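
For readers unfamiliar with thematic analysis, the toy Python example below illustrates the general tag-and-count spirit of such qualitative coding: each tweet gets one or more researcher-assigned theme tags, and the analyst then examines which themes dominate and which tend to appear together. The tweets and tag names here are invented for illustration and are not the study's actual coding scheme.

# Toy illustration (not the study's pipeline): tally researcher-assigned theme
# tags and their co-occurrences across a set of hand-coded tweets.
from collections import Counter
from itertools import combinations

# Hypothetical hand-coded data: each tweet carries one or more theme tags.
coded_tweets = [
    {"id": 1, "tags": {"accuses_real_video_of_being_fake", "distrust_of_media"}},
    {"id": 2, "tags": {"shares_zelensky_deepfake", "warns_others"}},
    {"id": 3, "tags": {"distrust_of_media", "humour"}},
]

tag_counts = Counter(tag for tweet in coded_tweets for tag in tweet["tags"])
pair_counts = Counter(
    pair
    for tweet in coded_tweets
    for pair in combinations(sorted(tweet["tags"]), 2)
)

print(tag_counts.most_common())   # most frequent themes
print(pair_counts.most_common())  # themes that tend to co-occur

In the study itself, the tagging was done by researchers reading each tweet; the counting and comparison step is simply where patterns, such as real footage being accused of being a deepfake, become visible.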

As they delved deeper, it became clear that onlookers of the Russo-Ukrainian War were having a lot of trouble telling reality from fiction. But the biggest problem wasn't that people were getting duped, although this is also a concern. Instead, the general impression is that people feel they can no longer trust their eyes, which means their suspicions and doubts now extend to legitimate media.

“Our research shows that it is easier and more common for deepfakes to be used to sow doubt. For example, by falsely accusing media of being a deepfake to challenge its authenticity. Though that in no means diminishes the possibility for deepfakes to be used to deceive people and the negative consequences of that.”

‘(Deep)fake news!’

University College Cork researchers examining deepfake videos. Credit: University College Cork, Image by Max Bell.

As the study uncovered, the mere possibility of deepfakes made many doubt the authenticity of actual footage from the conflict. This can turn into a huge crisis of trust in an already shaky media landscape. Even before deepfakes, some people were questioning the facts around events that unquestionably happened, such as the Holocaust, the moon landing, and 9/11 — despite ample video proof. Deepfakes not only alter our trust in video and audio evidence but threaten to revise history itself to suit a nefarious agent’s agenda.

“I’ve certainly become more worried about two things. Firstly, the harms of falsely accusing real content of being AI-generated. Secondly, the worries of deepfakes becoming a buzzword used to discount real videos. Our research shows that there are already conspiracy theories accusing real videos of politicians as being deepfaked,” Twomey said.

Improving awareness about deepfakes will prove increasingly important to safeguard our democracy. However, there’s a twist. The study revealed a paradoxical effect: while raising awareness can help educate the public about deepfakes, it might also erode trust in legitimate videos. As the number of people who are aware deepfakes exist increases, so will the number of false accusations and suspicions surrounding legitimate media.

This kind of unhealthy skepticism, where genuine content is discounted as artificial, is a new and important challenge that we’ll have to grapple with for years to come. As deepfakes become more sophisticated, the onus falls on us, the consumers of news, to navigate the tricky waters of misinformation and seek the truth.

“Not everything is fake but it is a good thing to know what a deepfake is and how to treat suspected deepfakes,” Twomey said.

The findings appeared in the journal PLOS ONE.

Quick guide on how to spot deepfakes

  • Inconsistent Lighting and Shadows: Look for unnatural lighting on the subject’s face or inconsistent shadows. Real videos have consistent lighting, while deepfakes may struggle with this detail.
  • Facial Distortions: Pay attention to the eyes, mouth, and hairline. Deepfakes might produce glitches or blurring in these areas.
  • Audio-Visual Mismatch: The movement of the lips might not sync perfectly with the audio. Any delay or inconsistency can be a red flag.
  • Blinking Patterns: People naturally blink regularly. Deepfakes, especially earlier versions, might not replicate this behavior accurately (a rough blink-counting sketch follows this list).
  • Background Noise: Listen for unnatural background sounds or inconsistencies in audio quality.
  • Emotional Inconsistency: The facial expressions might not match the emotion conveyed by the voice or the context of the conversation.
  • Digital Artifacts: Look for pixelation, unusual patterns, or other digital artifacts that seem out of place.
  • Source Verification: Always check the source of the video or audio. If it’s not from a reputable source, be skeptical.
  • Deepfake Detection Tools: Utilize available software and online platforms that are designed to detect deepfakes. These tools analyze videos for inconsistencies that the human eye might miss.
  • Trust Your Gut: If something feels off about the video or audio, it might be worth investigating further.
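
As a concrete, deliberately crude illustration of the blinking heuristic above, the Python sketch below estimates blink frequency from per-frame eye landmarks using the eye aspect ratio (EAR). Extracting those landmarks from video is assumed to be handled by a separate face-landmark library and is replaced here with placeholder data; no real detector is this simple, and an unusually low blink count is at best a weak signal worth investigating further.

# Rough heuristic sketch, not a real deepfake detector: count blinks from
# per-frame eye landmarks via the eye aspect ratio (EAR).
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye; a low EAR suggests a closed eye."""
    eye = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_thresh=0.2):
    """Count closed-eye episodes in a sequence of per-frame EAR values."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < closed_thresh and not closed:
            blinks, closed = blinks + 1, True
        elif ear >= closed_thresh:
            closed = False
    return blinks

# EAR for a hypothetical open eye (landmark coordinates are made up).
open_eye = [(0, 0), (1, 0.5), (3, 0.5), (4, 0), (3, -0.5), (1, -0.5)]
print(round(eye_aspect_ratio(open_eye), 2))  # ~0.25

# Placeholder: 30 seconds of video at 30 fps with two simulated blinks.
ears = [0.3] * 900
ears[100:105] = [0.1] * 5
ears[600:605] = [0.1] * 5
print(f"{count_blinks(ears)} blinks in 30 s")  # people typically blink roughly 15-20
                                               # times per minute, so far fewer can be
                                               # one (weak) red flag

Heuristics like this are fragile, which is exactly why the checklist above leans on multiple cues and on verifying the source rather than on any single technical tell.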
