TYPOLOGY OF AGENTS AND THEIR INFLUENCE ON THE SPREAD OF DISINFORMATION IN SOCIAL NETWORKS
DOI: https://doi.org/10.28925/2663-4023.2025.28.862

Keywords: information security; agent-based simulation modeling; social networks; disinformation; information operations.

Abstract
Numerous recent studies have analyzed the role of social networks in various types of communication, particularly in the spread of informational influences such as disinformation, fake news, and targeted information campaigns. Such phenomena pose a threat to the information security of any state. The X network (formerly Twitter) is currently being actively used to influence both European and Ukrainian social media users. Given the importance of this issue, this study analyzes the algorithm of content dissemination while taking into account different types of users (so-called “agents”). Four types of agents are examined and their key characteristics are identified; these characteristics are likely to determine an agent's level of influence during an information operation and therefore serve as the main criterion for building a simulation model of content distribution on social networks.
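To make the modeling approach concrete, below is a minimal illustrative sketch in Python of an agent-based simulation of content spread on a follower network. The four agent types, their follower ranges, resharing probabilities, and the spread rule are assumptions introduced here for illustration only; the abstract does not specify the paper's actual agent types or parameters.

# A minimal illustrative sketch (not the article's model): agent-based simulation
# of content spread on a follower network. Agent types and all parameters below
# are assumptions for illustration.
import random
from dataclasses import dataclass

# Hypothetical agent types: (follower count range, probability of resharing).
AGENT_TYPES = {
    "ordinary_user": {"followers": (10, 200), "share_prob": 0.05},
    "influencer":    {"followers": (1_000, 50_000), "share_prob": 0.10},
    "bot":           {"followers": (50, 500), "share_prob": 0.90},
    "organizer":     {"followers": (100, 2_000), "share_prob": 0.60},
}

@dataclass
class Agent:
    agent_id: int
    kind: str
    share_prob: float

def build_population(n: int, type_shares: dict[str, float], seed: int = 42) -> list[Agent]:
    """Create n agents, drawing each agent's type according to the given shares."""
    rng = random.Random(seed)
    kinds = rng.choices(list(type_shares), weights=list(type_shares.values()), k=n)
    return [Agent(i, k, AGENT_TYPES[k]["share_prob"]) for i, k in enumerate(kinds)]

def build_followers(agents: list[Agent], seed: int = 42) -> dict[int, list[int]]:
    """Assign each agent a random follower set sized by its type's follower range."""
    rng = random.Random(seed)
    ids = [a.agent_id for a in agents]
    followers = {}
    for a in agents:
        lo, hi = AGENT_TYPES[a.kind]["followers"]
        k = min(rng.randint(lo, hi), len(ids) - 1)
        followers[a.agent_id] = rng.sample(ids, k)
    return followers

def simulate(agents: list[Agent], followers: dict[int, list[int]], seed_agent: int, steps: int = 10) -> int:
    """Spread one piece of content: exposed agents reshare with their type's probability."""
    rng = random.Random(0)
    exposed = {seed_agent}
    sharers = {seed_agent}
    for _ in range(steps):
        new_sharers = set()
        for s in sharers:
            for f in followers[s]:
                if f not in exposed:
                    exposed.add(f)
                    if rng.random() < agents[f].share_prob:
                        new_sharers.add(f)
        if not new_sharers:
            break
        sharers = new_sharers
    return len(exposed)

if __name__ == "__main__":
    population = build_population(2_000, {"ordinary_user": 0.85, "influencer": 0.03,
                                          "bot": 0.10, "organizer": 0.02})
    graph = build_followers(population)
    reach = simulate(population, graph, seed_agent=0)
    print(f"Agents exposed to the content: {reach}")

In a sketch of this kind, varying the share of each agent type in the population shows how a small fraction of highly active, well-connected accounts can dominate the overall reach of a piece of content, which is the intuition behind treating agent characteristics as the main criterion of the simulation model.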
License
Copyright (c) 2025 Ольга Васильєва

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.