Preventing Inaccurate Reporting on Ukraine


Summary

Preventing inaccurate reporting on Ukraine means stopping the spread of false or misleading information about the conflict, a task made harder by the rise of digital disinformation campaigns and deepfakes. The goal is a resilient information environment built by combining fast fact-checking, promotion of trustworthy sources, and strong, truthful narratives that counter false claims.

  • Use verification tools: Make a habit of checking images, articles, and social media posts with reliable fact-checking and AI-detection platforms before sharing information.
  • Strengthen trusted narratives: Support and share stories from credible sources to help build public trust and make it harder for false reports to take hold.
  • Monitor information channels: Keep an eye on popular platforms and networks where disinformation is commonly spread, so you can quickly spot and address misleading content.
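The first tip is scriptable: Google's Fact Check Tools API (the service behind its Fact Check Explorer) exposes a `claims:search` endpoint that returns published fact-checks for a text query. A minimal sketch in Python, assuming you have an API key; the query and key shown in the usage comment are placeholders:

```python
import json
import urllib.parse
import urllib.request

FACT_CHECK_ENDPOINT = "https://factchecktools.googleapis.com/v1alpha1/claims:search"


def build_claim_search_url(query: str, api_key: str, language: str = "en") -> str:
    """Build a claims:search request URL for the Fact Check Tools API."""
    params = urllib.parse.urlencode(
        {"query": query, "languageCode": language, "key": api_key}
    )
    return f"{FACT_CHECK_ENDPOINT}?{params}"


def search_claims(query: str, api_key: str) -> list:
    """Fetch published fact-checks matching the query (requires network access)."""
    with urllib.request.urlopen(build_claim_search_url(query, api_key)) as resp:
        payload = json.load(resp)
    # Each claim carries its text plus claimReview entries (publisher, rating).
    return payload.get("claims", [])


# Usage (needs a real API key from the Google Cloud Console):
# for claim in search_claims("Zelenska Bugatti", "YOUR_API_KEY"):
#     review = claim.get("claimReview", [{}])[0]
#     print(claim.get("text"), "->", review.get("textualRating"))
```

A wrapper like this can sit in a sharing workflow or a monitoring script, so a claim is checked against existing fact-checks before it is amplified.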
Summarized by AI based on LinkedIn member posts
  • View profile for Marie-Doha Besancenot

    Senior advisor for Strategic Communications, Cabinet of 🇫🇷 Foreign Minister; #IHEDN, 78e PolDef

    38,465 followers

    RAND's report on wartime #disinformation: applying lessons learned from #Ukraine to other contexts. 92 pages, 3 chapters, 12 lessons learned.

    🪖 Before the war: shaping operations.
    🔹 2014-22: building government and civil society institutions to counter adversary disinformation
    🔹 Steps to stop the flow of Russian propaganda targeting the country
    🔹 Intelligence-driven "prebunking" that informed international audiences about planned Russian operations

    🪖 During the war: countering false narratives across the 3 theaters of the information war.

    🧰 12 lessons learned:
    🔸 Prepare and plan for the 3 theaters of information war: look for innovative ways to reach and communicate with populations in totalitarian countries; rally international institutions to effectively prebunk adversary campaigns targeting the rest of the world; support a broader array of institutions residing in host nations.
    🔸 Build critical host-nation institutions in advance of and during conflict.
    🔸 Build and maintain capacity to counter disinformation: assess your own doctrine, training, and wargaming efforts to ensure they can counter disinformation during conflict. Ensure that institutions and psychological-operations forces retain their capability.
    🔸 Invest in and work with civil society.
    🔸 Build and maintain trust to effectively dispel adversary narratives.
    🔸 Work with and empower local and military influencers: promote online voices that help support national security objectives.
    🔸 Build the #resilience of troops, so frontline soldiers do not become targets of adversary campaigns that undermine their will to fight. Develop a mandatory media-literacy education campaign to help deployed and garrison personnel recognize malign influence attempts and foster safer online behavior.
    🔸 Do not allow coordination to sacrifice speed in responding: the Ukrainian experience highlights the value of a loosely coordinated, redundant network response involving multiple actors who both monitor media and communicate key narratives.
    🔸 Be prepared to take risks: accept that government communicators outsource their efforts to creative and agile civil-society institutions. Allow communicators to quickly create unique, humorous, and engaging content.
    🔸 Plan on resourcing and executing 3 critical counter-disinformation tools: debunking (fact-checking), prebunking, and the promulgation of proactive information narratives. Ensure all 3 are integrated in military theaters of operation.
    🔸 Be prepared to build the capacity of key institutions: in future contingency operations, consider likely adversary targets for propaganda and disinformation and evaluate the ability of local institutions to respond effectively.
    🔸 Recognize the risk of waning support over time: as time engaged in conflict increases, the influence of messaging decreases and adversary disinformation narratives may become more influential. Wargame these risks and consider incorporating them in war plans.
    👏🏼 Todd Helmus Khrystyna Holynska

  • View profile for Aidan Raney

    CEO/Founder of Farnsworth Intelligence | CPO/Co-Founder @ Alerts Bar | OSINT Expert, Content Creator, and Consultant | Vice Chair @ Wisconsin Governor’s Juvenile Justice Commission

    14,096 followers

    "What is the cost of lies?" An incredibly poignant quote from HBO's 2019 'Chernobyl', and it couldn't be more relevant. We live in a golden age of disinformation, with online media the perfect delivery mechanism for bringing fake news straight to your fingertips. Here's a list of online tools to help you fact-check and identify disinformation on the internet!

    Snopes, https://www.snopes.com/ - A reliable, independent fact-checking website publishing articles that address rumours surrounding public issues in the USA.
    DFRLab, https://dfrlab.org/ - A product of the Atlantic Council, DFRLab has published over 1,000 articles covering disinformation, connective technologies, and related topics tied to current global political and conflict events (Ukraine, the Middle East, etc.).
    FactCheck.org, https://www.factcheck.org/ - A non-partisan website monitoring the accuracy of statements made by US politicians, publishing articles and Q&As on topical areas of American politics.
    AskNews, https://asknews.app/en - An AI-powered news platform sourcing articles from over 50,000 sources in 14 languages. Designed to surface unbiased and diverse content, with free and paid tiers. Used for fact-checking, situation monitoring, and global risk analysis.
    Information Laundromat, https://lnkd.in/g3QgFXmA - An open-source tool providing two services, Content Similarity Search and Metadata Similarity Search, designed to identify links between sources, content repurposing, and similar articles.
    InVID, https://lnkd.in/gnR7Nnz8 - A browser plugin for verifying photos and videos and flagging false content on social networks, providing contextual information, reverse image searches, metadata extraction, and more.
    Image Whisperer, https://lnkd.in/g39rGt5d - An AI-powered image verification tool designed to flag AI-generated content and provide context about a submitted image.
    Hive Moderation, https://lnkd.in/gbECarpJ - An AI-powered content moderation tool that filters illegal, harmful, or inappropriate content out of online communities.
    Hiya Deepfake Voice Detector, https://lnkd.in/gS6ArSsq - A free browser extension that helps verify whether audio is AI-generated.
    Google Fact Check Explorer, https://lnkd.in/gMtRF_7D - A free tool that lets users search fact-checks from many fact-checking sources by entering keywords or images.

    A key point to take from this post: trust, but verify. Not everything is as it seems...
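Several of the image tools above build on reverse image search, which you can also reach directly. A small sketch that constructs reverse-search links for a publicly hosted image; note these URL patterns are the providers' commonly used ones, are not formal APIs, and may change:

```python
import urllib.parse

# Commonly used reverse-image-search URL patterns (subject to change by providers).
REVERSE_SEARCH_PATTERNS = {
    "google": "https://www.google.com/searchbyimage?image_url={url}",
    "yandex": "https://yandex.com/images/search?rpt=imageview&url={url}",
    "tineye": "https://tineye.com/search?url={url}",
}


def reverse_search_links(image_url: str) -> dict:
    """Return one reverse-image-search link per engine for a public image URL."""
    quoted = urllib.parse.quote(image_url, safe="")  # escape :/? for embedding
    return {
        engine: pattern.format(url=quoted)
        for engine, pattern in REVERSE_SEARCH_PATTERNS.items()
    }


# Usage: open each of reverse_search_links("https://example.com/photo.jpg").values()
# in a browser and compare where and when the image appeared before.
```

Checking more than one engine matters: an image scrubbed from one index often still surfaces in another, revealing the original context of a recycled photo.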

  • View profile for Frank Wolf

    I am the co-founder of Staffbase. As Chief Strategy Officer, I focus on future trends for communication. I recently published my second book, "The Narrative Age".

    6,352 followers

    How can we fight misinformation? I believe we overrate the role of intervention and underrate prevention. Let's look at a deepfake that targeted Ukraine's First Lady last week.

    What Happened? 🤔 A deepfake video falsely claimed that Ukraine's First Lady, Olena Zelenska, bought a $4.8 million Bugatti in Paris. The claim originated with a social media post from a fake Bugatti employee's Instagram profile, which had only four posts, including a purported invoice for the sale. The video was then amplified by Russian disinformation networks across platforms like X, Telegram, and TikTok, even making it onto major news portals like MSN. Bugatti's official partner in France quickly issued a statement denying the purchase and calling out the "disinformation campaign."

    The Problem 🚨 If you hit the right narrative, spreading fake news is far too easy. The AI video was not even good: the man's head moves while his torso stays almost completely still, and it contains cuts, a strange accent, and unnatural mouth movements. Yet many people won't even watch the video; they'll see its mere existence as proof enough to justify their beliefs: "They even have a Bugatti employee on record!" Reports debunking the story are out now, but given the fast pace of the news cycle and our limited attention spans, this intervention comes too late and likely won't reach as many people as the initial lie. The story took off because it fits a narrative many are eager to spin: the Ukrainian elite is corrupt, and Western aid for the war against Russia isn't reaching the frontline but is being wasted on luxury.

    The Broader Issue 🌐 In this environment, relying solely on intervention is always too little, too late. Yes, we need fact-checking. Yes, tools like Community Notes on X are great for building transparency. But we also need something else: strong, truthful narratives that can resist misinformation and fake news. This is what I mean by prevention.

    Building Trust & Strong Narratives 🛡️ When you trust someone, you give them the benefit of the doubt. If that person gets in trouble, you listen to their side of the story before jumping to conclusions. According to a study by Ipsos, 59% of people who strongly trust a company would give it the benefit of the doubt in a crisis; for those who feel neutral, that figure drops to just 10%. Prevention means building trust and strong narratives. Because I believe the Zelenskys are exemplary leaders serving their country, my initial reaction was to doubt the Bugatti story. Of course, if there were more evidence, I might have changed my mind. But in this case, my narrative led me to doubt the fake news. To fight misinformation, we need both intervention and prevention: we must build trust and strong narratives to create a resilient defense against the spread of false information. I'd love to hear any similar examples of narrative and trust you can recall. #Misinformation #Leadership #Narratives

  • View profile for Ruslan Trad

    Researcher on information operations, disinformation narratives, security-related topics, and hybrid warfare from Syria to Eurasia. Editor, Security and Defense, Capital. Non-Resident Fellow at DFRLab.

    3,523 followers

    GLOBSEC's "Global Offensive: Mapping the Sources Behind the Pravda Network" report reveals a sophisticated, expanding disinformation ecosystem spreading pro-Kremlin narratives, adding a necessary layer to previous research on the topic.

    Comprising over 87 localized subdomains, the network functions as a continuous propaganda machine, primarily aimed at manipulating AI and language models by flooding digital spaces with content. The study analyzed over 4.3 million articles from more than 8,000 sources. Telegram is the main distribution channel (up to 75% of content), with channels boasting 250 million subscribers; Russian websites, including state media like TASS and RT, contribute nearly 20%, and Facebook is also utilized. Article output sharply increased in 2023-2024, prioritizing quantity over quality (85% of articles were published in under a minute). Serbia, the US, Ukraine, Moldova, and Italy are key targets, alongside the CEE region and Africa. Network analysis shows interconnected sources with distinct local Telegram clusters; examples from Czechia, Slovakia, Hungary, and Poland highlight local channels (e.g., RuskiStatek, UKR LEAKS_pl) spreading pro-Russian and anti-Ukrainian propaganda.

    ⭕ Recommendations include enhanced monitoring, interdisciplinary research, frameworks to prevent AI data manipulation, and sanctions against those responsible for disinformation. The report notes that blocking domains won't stop the network, as it recycles content from existing pro-Kremlin sources.

    🔗 For more details: https://lnkd.in/dSPhV_8e
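The report's finding that 85% of articles appeared in under a minute points to automated republication, and that cadence is itself a detectable signal for the monitoring it recommends. A hedged sketch of one such heuristic, illustrative only and not the report's actual methodology: flag sources whose median interval between consecutive posts falls below a threshold.

```python
from datetime import datetime
from statistics import median


def flag_automated_sources(posts, threshold_seconds=60.0, min_posts=5):
    """Flag sources whose median gap between consecutive posts is below a threshold.

    `posts` is an iterable of (source, timestamp) pairs with datetime timestamps.
    A tiny median gap across many posts suggests automated republication, echoing
    the report's observation that most articles appeared in under a minute.
    """
    by_source = {}
    for source, ts in posts:
        by_source.setdefault(source, []).append(ts)

    flagged = set()
    for source, stamps in by_source.items():
        if len(stamps) < min_posts:
            continue  # too few posts to judge reliably
        stamps.sort()
        gaps = [(b - a).total_seconds() for a, b in zip(stamps, stamps[1:])]
        if median(gaps) < threshold_seconds:
            flagged.add(source)
    return flagged


# Usage: feed (source, datetime) pairs scraped from monitored channels, e.g.
# flag_automated_sources([("pravda-cz", datetime(2024, 1, 1, 12, 0, 5)), ...])
```

Using the median rather than the mean keeps one long overnight gap from masking an otherwise machine-like posting rhythm; a real pipeline would combine this with content-similarity checks like those the Information Laundromat performs.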
