
Political Science
Social networks, disinformation and diplomacy: a dynamic model for a current problem
A. G. Rincón, S. B. Moreno, et al.
Dive into the intriguing world of disinformation propagation in social networks! This research, conducted by Alfredo Guzmán Rincón, Sandra Barragán Moreno, Belén Rodríguez-Canovas, Ruby Lorena Carrillo Barbosa, and David Ricardo Africano Franco, uncovers how bots, trolls, and echo chambers play a pivotal role in shaping public opinion. Discover the dynamics at play and learn how a portion of the susceptible population remains engaged despite the onslaught of false information.
~3 min • Beginner • English
Introduction
The study investigates how social media enables the rapid spread of diplomatically motivated disinformation and seeks to understand its dynamics from an international relations perspective. With social media increasingly used by states and diplomatic actors to reach and influence foreign publics, some governments have allegedly leveraged these platforms to disseminate false content to advance foreign policy goals. Against this backdrop, the paper asks: how do elements of disinformation derived from social media diplomacy strategies interact to affect a susceptible population? Additional questions address the roles of bots and trolls, the impact of delaying disinformation activation, and the effects of algorithm-driven echo chambers. The purpose is to build a simulation model that integrates diplomacy-specific mechanisms (e.g., paid promotion, bots, trolls, organic reach) to provide insights into system behavior and potential intervention points.
Literature Review
Theoretical framing distinguishes disinformation—intentional, organized communication that lacks truth (by commission or omission) and aims to deceive—from propaganda, which seeks longer-term control. Historically used in statecraft, disinformation has evolved with digital platforms, where amplification, speed, and difficulty of attribution increase impact. Prior work documents state-level uses (notably Russia and China) to influence elections, polarize discourse, and undermine media credibility. Rumor and misinformation spread have been modeled with SIR/SEIR-type frameworks and extensions (SIRaRu, complex networks), but most models omit social media dynamics specific to diplomacy, such as organic/paid/invitational reach, bots, and trolls, and often assume unbounded target populations. Social media introduces algorithmic amplification, echo chambers, and engagement dynamics that can entrench exposure to falsehoods. The review highlights gaps: difficulty of attribution to state actors, scarcity of declassified data, and the need for diplomacy-focused models that integrate strategic planning, resource deployment (e.g., paid promotion), and coordinated inauthentic behavior (bots/trolls).
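For reference, the SIR-type backbone that these rumor and misinformation models extend can be written (in standard epidemic notation, not the paper's own symbols) as

\frac{dS}{dt} = -\beta \frac{SI}{N}, \qquad \frac{dI}{dt} = \beta \frac{SI}{N} - \gamma I, \qquad \frac{dR}{dt} = \gamma I,

where S, I, and R are the susceptible, spreading, and stifler compartments, β the transmission rate, γ the removal rate, and N the fixed population. Diplomacy-focused extensions of the kind developed here replace the single transmission pathway with multiple reach channels (organic, paid, invitational) and add separate account stocks for bots and trolls.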
Methodology
Design: A system dynamics computational model was developed to simulate disinformation propagation on social media as a diplomacy strategy. The approach follows Bala et al. (2017), Forrester (2013), and Sterman (2012), motivated by the system's non-linear, multi-causal, and time-lagged behavior.

Stocks and flows: The model builds on SIR-type structures with seven stocks: five measured in people (Target population PO, Susceptible PS, Disinformed PD, Informed Pin, Unsubscribed PU) and two in accounts (Bots B, Trolls T). Flows are regulated by variables including organic/paid/invitational reach, campaign effectiveness, engagement, echo-chamber effects, and bot/troll lifecycle rates.

Variables: Key variables (with abbreviations) include invitation rate (i), invitation effectiveness (ei), multiple organic reach rates (tao_n), paid impressions (CPM) and campaign count (cd), campaign effectiveness (ec), resusceptibility/unsubscribe rates (tr, td), bot/troll contact rates (tcb, tct), disinformation delay (rd), engagement level (ne) and echo chamber (ce), bot/troll activation/deactivation and creation/removal rates (tab, tdb, tcpt, tet), and outreach by bots/trolls (ab, at).

Assumptions: (1) the target population size is fixed; (2) the paid campaign count cd is the same for the susceptibility-adoption and disinformation phases; (3) bots and trolls grow exponentially per their rates (not resource-bounded in the base model).

Mathematical formulation: Differential equations specify the dynamics of PO, PS, PD, Pin, and PU, with delay functions for the onset of disinformation and growth/decay equations for B and T. The echo-chamber effect ce is a graphical function of engagement ne.

Initial parameters: PO=1,000,000; PS=1; PD=0; Pin=0; PU=0; B=1; T=10. Representative rates include i=5%, ei=10%, CPM=1,000 impressions/day, cd=10, ec=15%, tcb=20%, tct=40%, rd=70 days, ne=15%, tab=3%, tdb=0.1%, tcpt=3%, tet=0.08%, td=8%.

Validation: Structural validation was performed via sensitivity analysis (±10% on ec, i, rd, tab, tdb, ne, tcpt, tet; cd in 0–20; rd in 65–85 days; 100 scenarios, uniform distributions). Results showed numerical sensitivity but consistent qualitative behavior across stocks, indicating robustness of the modeled structure.

Simulation setup: Implemented in Stella Architect v3.3 over a 0–180 day horizon with Δt=0.1 day and Euler integration; SPSS v25 was used for normality tests (Kolmogorov–Smirnov) and non-parametric comparisons (Wilcoxon) of median stock levels between baseline and modified-parameter runs.

Experiments: Five deterministic scenarios were run, modifying one parameter at a time:
• Sim-1: cd=0 (remove paid promotion)
• Sim-2: B=0 (no bots)
• Sim-3: T=0 (no trolls)
• Sim-4: rd=30 (earlier activation)
• Sim-5: ne at 5% and 40% (echo-chamber intensity)
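Since the paper's full equations are not reproduced in this summary, the following is a minimal sketch of how such a stock-flow model integrates under the reported setup (Euler method, Δt=0.1 day, 0–180 days). The flow expressions and the omission of the Informed stock Pin are simplifying assumptions for illustration, not the authors' formulation:

# Minimal Euler-integration sketch of a simplified stock-flow structure.
# Flow equations are illustrative assumptions; the paper's formulation
# (echo-chamber graphical function, resusceptibility, Pin dynamics) is richer.

DT, T_END = 0.1, 180.0  # step and horizon, as reported

# Initial stocks (from the paper)
PO, PS, PD, PU = 1_000_000.0, 1.0, 0.0, 0.0
B, T = 1.0, 10.0

# Representative rates (from the paper)
i, ei = 0.05, 0.10               # invitation rate and effectiveness
cpm, cd, ec = 1000.0, 10, 0.15   # paid impressions/day, campaigns, effectiveness
tcb, tct = 0.20, 0.40            # bot and troll contact rates
rd = 70.0                        # disinformation activation delay (days)
tab, tdb = 0.03, 0.001           # bot activation and deactivation rates
tcpt, tet = 0.03, 0.0008         # troll creation and removal rates
td = 0.08                        # unsubscribe rate

t = 0.0
while t < T_END:
    recruit = min(PO, i * ei * PS + cpm * cd * ec / 1000.0)  # PO -> PS (assumed form)
    active = 1.0 if t >= rd else 0.0                         # delayed disinformation onset
    disinform = active * min(PS, tcb * B + tct * T)          # PS -> PD (assumed form)
    unsubscribe = td * PD                                    # PD -> PU
    PO -= recruit * DT
    PS += (recruit - disinform) * DT
    PD += (disinform - unsubscribe) * DT
    PU += unsubscribe * DT
    B += (tab - tdb) * B * DT    # exponential bot growth (model assumption 3)
    T += (tcpt - tet) * T * DT   # exponential troll growth
    t += DT

print(f"t=180: PO={PO:,.0f}  PS={PS:,.0f}  PD={PD:,.0f}  PU={PU:,.0f}  B={B:,.0f}  T={T:,.0f}")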
Key Findings
Baseline (180 days):
• Target population (PO) decreased by 84.3%; the final PS reached 691,722 people.
• The agent spread disinformation to 267,275 people overall; the informed population (Pin) reached 148,117; only 11,779 unsubscribed (PU).
• Average daily impact after activation was 1,476 people; the peak daily increase was 3,335 at day 176.
• Paid reach disinformed a constant 1,200 people/day; other mechanisms contributed less individually, while bot-driven outreach was relatively larger by t=180.
• Stocks for bots and trolls exhibited exponential growth per their activation/deactivation parameters.

Sensitivity envelopes (95% CI at t=180) suggested PD between 227 and 203,000; Pin between 138 and 69,200; and PU between 1,060 and 76,000, with consistent qualitative behavior across runs.

Comparative simulations (Wilcoxon tests, p<0.001 unless noted):
• Sim-1 (cd=0): Removing paid promotion significantly reduced PO-to-PS conversion and downstream PD, Pin, and PU across the board; PS fell by 38.02% at t=180, indicating paid promotion is a critical lever in diplomacy-driven disinformation.
• Sim-2 (B=0): The absence of bots significantly lowered PD, Pin, and PU relative to baseline; the overall impact was smaller than removing paid reach but still substantial.
• Sim-3 (T=0): The absence of trolls also significantly reduced PD, Pin, and PU, with a magnitude similar to but slightly less than removing bots.
• Sim-4 (rd=30): Earlier activation increased PS and PU, while PD and Pin changed minimally (on the order of tenths of a percent), indicating timing mainly affects susceptibility and disengagement rather than final misinformation/information totals.
• Sim-5 (echo chamber ne): Lower engagement (ne=5%) modestly reduced PD, Pin, and PU vs. baseline; higher engagement (ne=40%) increased PD (e.g., +3,648 at t=180) and reduced Pin (−307) and PU (−1,538), consistent with echo chambers reinforcing misinformation and hindering correction/unsubscription.

Overall, paid promotion had the largest system-wide effect, followed by bots and trolls; echo chambers amplified misinformation prevalence; early activation primarily increased susceptibility and unsubscription but had limited effect on PD/Pin totals.
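A sketch of the statistical comparison behind these results, using scipy in place of the paper's SPSS v25 (the substitution and the placeholder series are assumptions; in the study, each series would hold a stock's levels from baseline and modified-parameter runs):

# Kolmogorov-Smirnov normality check, then a Wilcoxon signed-rank test
# on paired stock levels. The series below are hypothetical placeholders.
import numpy as np
from scipy.stats import kstest, wilcoxon

rng = np.random.default_rng(0)
baseline_PD = rng.normal(100_000.0, 5_000.0, 180)  # hypothetical daily PD levels
sim1_PD = baseline_PD * 0.62                       # hypothetical Sim-1-style reduction

# Normality check: K-S against a standard normal after standardizing
z = (baseline_PD - baseline_PD.mean()) / baseline_PD.std()
print(kstest(z, "norm"))

# Non-parametric paired comparison of stock levels between runs
stat, p = wilcoxon(baseline_PD, sim1_PD)
print(f"Wilcoxon W={stat:.1f}, p={p:.3g}")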
Discussion
The model addresses how diplomacy-specific disinformation elements interact on social media to affect a susceptible population. By bounding the target population and integrating organic/paid/invitational reach with coordinated inauthentic behavior (bots, trolls), delays, and engagement-driven echo chambers, the framework captures macro-level international dynamics beyond local rumor models. Findings indicate that paid promotion is a key driver of both audience acquisition (PS) and downstream misinformation exposure (PD). Bots and trolls significantly amplify dissemination but, under the simulated parameters, have less impact than paid promotion. Earlier activation expands susceptibility and increases unsubscription with limited change to misinformation/information totals, suggesting timing strategies may be optimized primarily for reach rather than conversion. Echo chambers, operationalized via engagement, increase misinformed prevalence and suppress correction and disengagement, aligning with prior evidence of selective exposure and reinforcement in social media contexts. For policy and platform governance, results underscore the importance of regulating or monitoring paid amplification of political/diplomatic content, enhancing detection and removal of coordinated inauthentic behavior, and mitigating engagement-driven reinforcement (e.g., algorithmic downranking of misleading content). For international relations, the model provides a tool to explore scenario planning, assess leverage points, and inform counter-disinformation strategies.
Conclusion
This study proposes a diplomacy-focused system dynamics model of disinformation propagation on social media that integrates target population constraints, outreach modes (organic, paid, invitation), coordinated inauthentic behavior (bots, trolls), activation delays, and echo-chamber effects. The model advances beyond traditional rumor-spread frameworks by reflecting strategic, resource-enabled actions typical of state actors. Simulations show that paid promotion exerts the strongest influence on system behavior, with bots and trolls serving as significant but secondary amplifiers; echo chambers increase misinformation prevalence; and earlier activation chiefly increases susceptibility and unsubscription. The model is adaptable across platforms and can be extended with layered or array-based structures. Future research should incorporate stochastic elements and multi-parameter perturbations, expand validation techniques, and update model components as platforms and tactics evolve, thereby enhancing robustness and applicability for scholars and policymakers in international relations.
Limitations
Simulations altered one parameter at a time under ceteris paribus assumptions; multi-parameter changes could yield different dynamics. The model is deterministic, omitting stochastic variability and uncertainties inherent in real-world systems. Validation focused on structural sensitivity; additional validation techniques from system dynamics could further test robustness. Some parameter values were estimated from reports and literature, with limited declassified data and attribution challenges in diplomacy contexts. The model reflects current tactics; new platform features or disinformation mechanisms may require extensions.
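One way to address the single-parameter limitation is to perturb several parameters jointly. A minimal sketch, assuming a run_model wrapper around the stock-flow simulation (a hypothetical helper, not part of the paper):

# Multi-parameter Monte Carlo extension: jointly perturbing several rates
# with uniform +/-10% draws, mirroring the paper's one-at-a-time
# sensitivity bands. run_model is hypothetical.
import random

BASE = {"ec": 0.15, "i": 0.05, "rd": 70.0, "tab": 0.03, "ne": 0.15}

def perturb(params, spread=0.10):
    """Return a copy of params with each value drawn uniformly within +/-spread."""
    return {k: v * random.uniform(1.0 - spread, 1.0 + spread) for k, v in params.items()}

scenarios = [perturb(BASE) for _ in range(100)]  # 100 scenarios, as in the validation
# results = [run_model(s) for s in scenarios]    # hypothetical batch evaluation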