Disinformation Maneuver: Serving Lies with a Side of Truth

This blog is about disinformation: lies served to us dressed up as truth.

As technology develops rapidly, the possibilities for disinformation have expanded significantly, giving rise to new forms of influence on human behavior. Yet humans still cope with unfamiliar informational and technological threats in the ways evolution taught them. This blog delves into the intricate ways disinformation shapes our brains, emotions, and overall well-being. We’ll explore the cunning mechanisms by which disinformation deceives our minds, while also offering insights into how we can fortify ourselves against its harmful effects.

There are various algorithms for turning lies into disinformation. Some can be identified, studied, and comprehended, while others are not visible at all, making it difficult for individuals to determine what has influenced their emotions or behavior. In this sense, disinformation is comparable to an iceberg: most of its mass is hidden beneath the surface, it presents itself as truth, and even after multiple readings or viewings it “does not radiate lies or danger.” Instead, it works on emotions, and that is the point.

Disinformation can have a subtle yet intentional impact on the human body and information-processing system. Remarkably, even fictitious, imagined information can profoundly affect our mood and mindset. Consider dreams: our eyes are closed and images are perceived only by the imagination, yet they leave emotional imprints and visual impressions. If a person condenses disinformation into a single image, enriching it with imaginary patterns, it constructs a “pseudo-reality” that affects the person emotionally in the same way as a real-life experience.

Human anatomy cannot be ignored here. The brain, shaped over millions of years of evolution, governs our responses to both novel and familiar information and to emotional triggers; it regulates memory retention and categorization, orchestrates the formation of new memories, and oversees the deletion of obsolete or unwanted material. In the realm of disinformation, however, some methods of manipulating information veer into unethical territory, posing potential harm to individuals’ mental and emotional well-being.

The central aim of disinformation is to penetrate the unconscious mind, a feat often facilitated during moments of crisis and emotional vulnerability. In such circumstances, individuals may shift their information-consumption patterns. It is crucial to recognize that people perceive, process, and retain negative and positive information in distinct ways. Negative emotions such as anger, sadness, and disgust exert a more profound impact on the human psyche than their positive counterparts; disinformation is therefore more effective when it is tied to negative emotions. Often fueled by hate speech, disinformation becomes a potent tool, particularly in times of war and crisis, exacerbating polarization and fracturing societal cohesion. To counter it effectively, it is paramount to foster unity among disparate groups and individuals, rallying them around shared values and demonstrating solidarity against the threat.

“Resilience filter”

The main challenge is that we expose ourselves to well-disguised disinformation, which often infiltrates our unconscious minds unchecked, establishing a stronghold that is difficult to dismantle. Manipulation plays a central role in this process. The brain naturally seeks to control incoming information by categorizing it into familiar patterns and selectively memorizing it; familiar information is perceived as safe and passes through its filters with ease. When confronted with illogical, confusing, or blatantly false information, the mind grapples with the confusion and ultimately relinquishes control, allowing disinformation to permeate unchecked.

This confusion is the key to disinformation, because our body cannot physiologically “control confusion” for long. The brain surrenders to disinformation simply to relieve tension and calm itself. It then lowers its barrier of control, “opening the door” to disinformation and “swallowing” it unprocessed. Confusion can be induced by something as small as a sentence built from contradictory information: again, the mind struggles to find the logic and gives up. It is at this moment that the received material leaves a trace on our unconscious mind and emotions.

Shocking messages are often used to confuse the mind. The bigger the lie, the more likely it is to be remembered, making a person more vulnerable to manipulation. Even if we do not believe the information, we may still remember it. Once embedded in the unconscious and memorized, a falsehood can persistently influence perception, creating barriers that hinder acceptance of the truth when it is presented later.

Disinformation thrives in the absence of knowledge, exploiting gaps in understanding to propagate falsehoods unchecked. When people are well informed about a topic, they are better equipped to identify and filter out disinformation; it is therefore crucial to consider not only what the human brain knows but also what it does not know. Where knowledge is missing, the brain is simply not prepared to respond. For instance, if we lack knowledge of the history of Ukraine, we may be susceptible to Russia’s false narratives and remember any lie, since nothing stands in the way of disinformation wrapped in truth. Likewise, during the 2020–21 Covid pandemic, amid stress and crisis, numerous conspiracy theories about the virus spread; individuals who knew nothing about the virus and how it was transmitted were, logically, more susceptible to remembering those conspiracy stories.

Metaphorically speaking, disinformation is a “virus” that, in the absence of immunity, can harm and destroy the body (mental and physical health). This virus does its harmful work after entering the body. It has its own structure, knows how to move, find vulnerable groups, replicate itself, leave confusion, etc. Awareness is the best defense against the “virus” of disinformation. 

How did they feed us a lie? 

Distorting the truth is not the only goal of disinformation; it also uses the truth for its own purposes. Integration with truth is disinformation’s natural state. The paradox is that disinformation uses as much truth as possible to spread lies: for disinformation to take hold in the human brain, it needs an entourage of truth. If a piece of information is 95% true and only a small part of it is fiction, the human brain will remember the whole as 100% true. This technique has been used before and is still used today. Let’s take a closer look at how it works.

The brain goes through three stages while integrating truth and manipulation: first, it receives information it recognizes as true, which feels familiar and is associated with safety and physiological comfort; next, it absorbs the lie; finally, it receives another flow of truth, which again provides comfort and safety.

  1. Truth (causing a sense of safety and comfort) 
  2. Lie (causing confusion and suspicion) 
  3. Truth (causing a sense of safety and comfort)

This combination makes it easy to “sneak a lie” past the brain. Disinformation affects us even when we have doubts. Even when individuals realize they are being lied to, the brain faces a very complex job: it must immediately leave the “zone of truth, safety, and comfort” and react accordingly. It is prevented from doing so because, while still processing the lie, it already hears the continuation of the story, the truth again, which brings it back to a state of safety and physiological comfort. This happens very quickly. The brain, wanting to solve the problem as easily as possible, prefers to “skip over” the threatening lie, protect its sense of comfort, and process the message not piece by piece but as a whole package. As a result, it remembers the entire message, including the lie, as the truth.

Let’s try an experiment while reading this blog. Imagine that I am addressing you directly:

“I am pleased that you are reading this blog and I thank you for doing so. Disinformation is the topic of this blog, as you already know. Some of you may be reading this material on your phone or laptop, some of you may be in Georgia, others abroad. I am writing this blog from Brussels, and it is raining here today, like in Tbilisi. That’s where I decided to publish this blog and now you are reading it.”

It seems that there is nothing that our mind needs to filter out. There is nothing dangerous.

When faced with unfamiliar information, it is prudent to consider the possibility that it might be disinformation. Critical thinking is the first and essential step in safeguarding ourselves, and it is preferable to address confusion by asking questions and seeking clarification.

Integration with time and memory 

Our brain does not function like a video camera that can be rewound to review what it has seen or heard. Instead, memory, which is constantly being constructed, selectively records information such as images, sounds, and emotions. Often we are unaware of what is stored in our minds; as you read this blog, you are creating a new memory. Disinformation also tries to integrate with time and memory, acting imperceptibly and gradually, in what media theory calls the “drip-drip effect”: it repeats a small lie numerous times, across different contexts and timeframes, effectively embedding it into our neural networks. Hence, initially we are subjected to falsehoods, and subsequently, whether intentionally or inadvertently, we may propagate disinformation ourselves. This interplay between human nature and manipulation significantly complicates efforts to combat this harmful phenomenon.

On two disinformation algorithms 

Johnson-Cartee and Copeland (2004) [1] discuss various propaganda techniques. My focus is on propaganda techniques and algorithms targeting emotions, specifically fear. I refer to this as the “spiral of fear” of disinformation. The following two schemes are offered:

  1. If X happens, then Y will happen, and that is terrible.
  2. If X and Y happen, that is already terrible; and if Z then appears, it will be the same or even worse.

Insert the statements of the Russian propaganda journalist Margarita Simonyan into these algorithms and you will see how systematically and widely they are used in practice.

How can we protect ourselves from disinformation?

How can we teach our brains to respond to new threats? The challenge is that we cannot eradicate disinformation as a phenomenon; it will always exist. Unfortunately, debunking disinformation that has already spread has no retroactive power: the damage has been done. The only solution is to proactively build information resilience and immunity.

The media serves as a daily source of lifelong non-formal education, playing a crucial role in fostering information resilience. Being well-informed contributes to our emotional and mental well-being, instilling a sense of security.

Given the targeted nature of Russian propaganda in post-Soviet countries, it is crucial to cultivate information resilience in its historical context through education. To counteract the falsification of history, we must proactively strengthen the teaching of world history, elucidating the true nature of Russia and the Soviet Union, their characters and narratives. And because Russia continues to seek influence over our state, media language and vocabulary deserve careful consideration when covering the occupied territories. For instance, rather than referring simply to Abkhazia and Tskhinvali, it is crucial to portray the situation accurately with terms such as “occupied Abkhazia” and “occupied Tskhinvali.” This ensures clarity and truthfulness in reporting and counters attempts to distort or manipulate the narrative. Numerous other topics warrant similar attention.

Did You Just Swallow a Lie?

I would like to ask you to recall the weather in Tbilisi and Brussels when this blog was written. Do you remember that it “was raining”? If so, be sure that it was a small lie wrapped in truth, and you memorized it as another safe but unverified piece of information. You might wonder why it is necessary to verify such trivial information. Remember, though, that even seemingly insignificant details can serve as the foundation for future disinformation. If someone later claims they could not attend a meeting because of the rain, or a speaker arrives late citing traffic caused by bad weather, that initial detail lends their story credibility and makes the logical chain more convincing. The content and scope of the information may change, but the structure remains the same.

Therefore, both the key to disinformation and the protection against it lie within the individual and the invisible player called the media. The best defense against the influence of disinformation is awareness, critical thinking, and asking simple questions: What? Who? Where? When? How? Why? These simple journalistic questions, known as the 5W+1H, cultivate greater information resilience and ultimately shield us from manipulation.

This blog has been produced under the series of “History Keepers” in the frame of the project “Solidarity Journalism for Peace and Security” funded by the European Union, within its Eastern Partnership Civil Society Fellowship Programme. Its contents are the sole responsibility of the author and do not necessarily reflect the views of the European Union.

[1] Karen S. Johnson-Cartee and Gary A. Copeland, Strategic Political Communications: Rethinking Social Influence, Persuasion, and Propaganda, Rowman & Littlefield Publishers, 2004.
