RashidaH

Navigating the Illusion of Consensus: A Guide for the Digital Age

It is hard to navigate today's whirlwind of information. For a young college student on TikTok, X, and Instagram, having an eye for truth is no longer enough; we also need to understand how technology and social factors shape our sense of reality. If we could truly live by the phrase "We got this!", we would not be so overwhelmed by the chaos of the news cycle.

Instead, we can learn to stay skeptical about how algorithms and repetition shape our perception of knowledge, and to understand how the mind processes misinformation. If we understand the psychology behind misinformation, we can regain agency and engage more actively in our democracy. This post examines how technology and social forces affect our perception of truth.

The Cognitive Trap of the Illusory Truth Effect

One of the most common reasons people believe false information is the illusory truth effect: repeated statements feel more true than statements heard for the first time. Repetition allows us to process information more fluently, and we rely on that ease of processing as a cue for truth, often accepting what merely feels familiar (Udry & Barber, 2024).

In some cases, repetition makes even the most unbelievable claims seem more plausible, even when they contradict our prior knowledge (Udry & Barber, 2024). And because we are susceptible to repeated inaccurate information, we also tend to share it: a claim starts to feel true simply because it has circulated through a social network. This "vicious circle" keeps disinformation spreading despite its lack of evidence (Vellani et al., 2023).
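The "vicious circle" can be sketched as a toy simulation. Every number here is an invented assumption for illustration, not a figure from the cited studies: each repeated exposure nudges an agent's belief upward (a stand-in for processing fluency), and agents whose belief crosses a threshold re-share the claim, exposing their neighbours yet again.

```python
import random

random.seed(1)

NUM_AGENTS = 20
BELIEF_PER_EXPOSURE = 0.15  # assumed fluency boost per repetition
SHARE_THRESHOLD = 0.5       # assumed belief level at which an agent shares

belief = [0.0] * NUM_AGENTS
shared = [False] * NUM_AGENTS
shared[0] = True  # one agent posts the claim first

# Toy social network: each agent is linked to three random others.
neighbours = {i: random.sample(range(NUM_AGENTS), 3) for i in range(NUM_AGENTS)}

for _ in range(10):  # ten feed-refresh rounds
    for agent in [i for i in range(NUM_AGENTS) if shared[i]]:
        for n in neighbours[agent]:
            # Each repeated exposure makes the claim feel a bit more true.
            belief[n] = min(1.0, belief[n] + BELIEF_PER_EXPOSURE)
    for i in range(NUM_AGENTS):
        if belief[i] >= SHARE_THRESHOLD:
            shared[i] = True  # believers become sharers, closing the circle

print(f"{sum(shared)} of {NUM_AGENTS} agents ended up sharing the claim")
```

No single agent ever checks the evidence; belief grows from repetition alone, which is the point of the sketch.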

Distinguishing Between Truth and Consensus

A related trap is the illusion of consensus, in which one assumes that a particular opinion or theory is more widely held than it actually is. This is what happens in digital echo chambers, where people mostly encounter posts from others who share their views.

This is contrasted with the false consensus effect, in which people overestimate how many others share their own opinions. The two terms are often confused, but the illusion of consensus refers to the perceived popularity of an idea within a group, not its repetition (Alper et al., 2023).

These dynamics are exploited in digital media, where algorithms amplify sensational content and can create a "false balance" that hides the actual consensus. When students see high engagement metrics on a piece of content, such as likes, shares, and approving comments, those signals can inflate the message's perceived trustworthiness by implying that a large group of users endorses it. This social proof can be used to manipulate their perception of what others believe, and perceived consensus shapes belief among experts and laypeople alike (Alper et al., 2023; Traberg, 2025).
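A quick back-of-the-envelope sketch shows how an echo chamber inflates perceived consensus. The figures are invented for illustration (30% actual support in the population, a cluster where 90% agree): sampling a feed from a like-minded cluster rather than from the whole population makes an opinion look far more popular than it is.

```python
import random

random.seed(7)

# 1 = agrees with the opinion, 0 = disagrees (all figures are made up).
population = [1] * 300 + [0] * 700    # 30% of everyone agrees
echo_chamber = [1] * 90 + [0] * 10    # but 90% of the user's cluster agrees

# The feed shows 50 posts, all drawn from the user's own cluster.
feed = random.choices(echo_chamber, k=50)

actual = sum(population) / len(population)
perceived = sum(feed) / len(feed)
print(f"actual support: {actual:.0%}, support visible in the feed: {perceived:.0%}")
```

The gap between the two percentages is the illusion of consensus: nothing in the feed is fake, yet the sample is so skewed that the minority view looks like a majority.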

The Influence of Source Cues and Social Proof

The social environment in which information is consumed affects how it is perceived. For example, sources perceived as credible, or as similar to us, such as those representing our own political party, are more likely to be judged reliable. This partisan source bias suggests that shared identity acts as a mental shortcut, leading us to accept more misinformation from "our side" and less information from our opponents.

Such biases can become so extreme that people ignore objective accuracy and believe politically similar sources even when their content is questionable (Traberg, 2025).

While similarity to a source increases vulnerability to its misinformation, it does not boost trust in factual information to the same degree; its influence also depends on the source's objective credibility. If a source is seen as lacking journalistic integrity, the effect of political similarity weakens, suggesting that credibility acts as a primary filter for trust.

The problem is that these signals are often obscured or forged by malicious actors, who use verification badges or official-looking layouts to masquerade as reputable news outlets.

Reclaiming Calmness through Awareness and Inoculation

For this purpose, researchers have developed the idea of "psychological inoculation." Like a medical vaccine, a psychological inoculation exposes people to a weakened version of a manipulation strategy, such as emotional appeals or conspiracy logic, equipping them with the tools to recognize and counter it.

Gamification has proven effective at helping students recognize the "fingerprints" of manipulation and become more resilient to misinformation from both inside and outside their group. Gamified media-literacy games have also been shown to build students' resilience and confidence in the digital world (Traberg et al., 2024).

Simply changing how we attend to information can change how true it feels. For example, when we engage in a task that demands attention, such as reading a sentence carefully rather than scrolling past it, we are less likely to rely on processing fluency as proof that a claim is true (Ly et al., 2024).

We should also remember that we are susceptible to influences we do not notice: the third-person effect, in which we presume that those around us are more easily influenced than we are, can leave us overconfident about our own judgment. By cultivating a healthy dose of "skeptical calmness" and asking why a piece of information seems true, students can see through the mirage and navigate the information age with confidence (Udry & Barber, 2024).

This figure visualizes three social clusters: the large red component represents a dense digital echo chamber, where frequent repetition can trigger the "illusory truth effect" among its members, while the isolated green and blue clusters show how fragmented information networks create an "illusion of consensus," leading people in the larger group to overestimate how widely their opinions are held.

References

Alper, S., Yelbuz, B. E., & Konukoglu, K. (2023). Perceived expert and laypeople consensus predict belief in local conspiracy theories in a non-WEIRD culture: Evidence from Turkey. Judgment and Decision Making, 18(e35), 1-18. https://doi.org/10.1017/jdm.2023.33

Ly, D. P., Bernstein, D. M., & Newman, E. J. (2024). An ongoing secondary task can reduce the illusory truth effect. Frontiers in Psychology, 14(1215432). https://doi.org/10.3389/fpsyg.2023.1215432

Traberg, C. S. (2025). The social cognition of misinformation and implications for psychological interventions [Doctoral dissertation, University of Cambridge]. https://doi.org/10.17863/CAM.106452

Traberg, C. S., Harjani, T., Roozenbeek, J., & van der Linden, S. (2024). The persuasive effects of social cues and source effects on misinformation susceptibility. Scientific Reports, 14(4205). https://doi.org/10.1038/s41598-024-54030-y

Udry, J., & Barber, S. J. (2024). The illusory truth effect: A review of how repetition increases belief in misinformation. Current Opinion in Psychology, 56(101736). https://doi.org/10.1016/j.copsyc.2023.101736

Vellani, V., Zheng, S., Ercelik, D., & Sharot, T. (2023). The illusory truth effect leads to the spread of misinformation. Cognition, 236(105421). https://doi.org/10.1016/j.cognition.2023.105421