Propaganda Technique of Enemy Dehumanization: Sociocommunicative Dimension

2025. Pp. 78–90.
Zaporizhia National University

Objective. The study explores the impact of the propaganda technique of enemy dehumanization on the Russian audience within the context of the Russia-Ukraine conflict, with a focus on sociocommunicative aspects. The research examines the nature of fake messages spread through Russian Telegram channels, their influence on emotional, cognitive, and behavioral perception, and the classification of user reactions. The analysis helps to understand the mechanisms through which hostility is formed via dehumanization, evaluate audience engagement, and develop strategies to counter propaganda narratives.

Methodology. The research employs a comprehensive approach, including content analysis of fake messages, cognitive and emotional analysis of audience reactions, and quantitative assessment of interactions (shares, comments, likes). Content analysis identified key elements of dehumanization, such as manipulative language, visual materials, and emotionally charged narratives. Cognitive analysis revealed how the audience perceives these messages, highlighting primary emotions such as anger, fear, and doubt, while quantitative methods measured audience engagement.
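As an illustration only (the article does not publish its scoring procedure), the quantitative assessment of interactions described above can be sketched as a short script. The channel names, interaction counts, and the engagement formula below are hypothetical, not data from the study:

```python
# Hypothetical sketch of the quantitative engagement assessment:
# for each message we record shares, comments, and likes,
# then compute an engagement rate relative to the post's reach.

from dataclasses import dataclass


@dataclass
class Message:
    channel: str   # Telegram channel (hypothetical name)
    views: int     # audience reach of the post
    shares: int
    comments: int
    likes: int

    def engagement_rate(self) -> float:
        """Total interactions divided by views (assumed metric)."""
        return (self.shares + self.comments + self.likes) / self.views


# Illustrative, invented numbers.
posts = [
    Message("channel_a", views=50_000, shares=1_200, comments=800, likes=3_000),
    Message("channel_b", views=20_000, shares=150, comments=90, likes=400),
]

# Rank posts by engagement, most viral first.
for p in sorted(posts, key=Message.engagement_rate, reverse=True):
    print(f"{p.channel}: {p.engagement_rate():.1%}")
# channel_a: 10.0%
# channel_b: 3.2%
```

Such a per-post score is one plausible way to operationalize the "engagement level" criterion used later in the reaction typology.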

Findings. The study identified core characteristics of dehumanization propaganda, including lexical and visual structures and manipulative narratives used in Russian Telegram channels. The research categorized audience reactions based on their tone, engagement level, and interaction style. It assessed how dehumanization propaganda fosters hostility toward the Ukrainian Armed Forces and the factors contributing to the viral spread of such messages. The results provide a foundation for recommendations on countering manipulative content and improving media literacy.

Novelty. This study offers a comprehensive analysis of the impact of enemy dehumanization on the Russian audience from a sociocommunicative perspective. For the first time, it examines the relationship between emotional, cognitive, and behavioral responses to fake messages in Telegram channels, allowing the development of a reaction typology. New criteria for analysis are introduced, including comment tone, engagement level, and content virality. The study also identifies key mechanisms for spreading manipulative narratives through lexical and visual dehumanization techniques that reinforce cognitive biases and evoke negative emotions.

Practical Significance. The research findings can be used to develop strategies for countering propaganda and digital manipulation. The proposed typology of audience reactions and fake message analysis criteria can be applied to create automated systems for detecting manipulative content on social media. The results also contribute to media literacy programs aimed at enhancing critical thinking skills. Additionally, they can support government and non-governmental initiatives in shaping information security strategies, combating dehumanization narratives, and fostering trust in credible information sources.
