===== Bibliography =====
A’Beckett, L. (2013). Strategies to Discredit Opponents: Russian Presentations of Events in Countries of the Former Soviet Union. Psychology of Language and Communication, 17(2), 133-156. https://doi.org/10.2478/plc-2013-0009
  
Alieva, I., Moffitt, J. D., & Carley, K. M. (2022). How disinformation operations against Russian opposition leader Alexei Navalny influence the international audience on Twitter. Social Network Analysis and Mining, 12(1), 1-13.
  
Doroshenko, L., & Lukito, J. (2021). Trollfare: Russia’s Disinformation Campaign During Military Conflict in Ukraine. International Journal of Communication, 15, 28.

European External Action Service. (2023). EEAS Report on Foreign Information Manipulation and Interference Threats. {{ :eeas-threatreport-february2023-02.pdf |}}
  
Fallis, D. (2009). A conceptual analysis of disinformation.
  
Guimaraes, A., Balalau, O., Terolli, E., & Weikum, G. (2019, July). Analyzing the traits and anomalies of political discussions on Reddit. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 13, pp. 205-213).

Hale, H. E., Shevel, O., & Onuch, O. (2018). Believing facts in the fog of war: identity, media and hot cognition in Ukraine’s 2014 Odesa tragedy. Geopolitics, 23(4), 851-881. {{ :believing_facts_in_the_fog_of_war_identi.pdf |}}
  
Holland, M. (2006). The propagation and power of communist security services dezinformatsiya. International Journal of Intelligence and CounterIntelligence, 19(1), 1-31.
  
Kilkenny, E. (2021). Russian Disinformation–The Technological Force Multiplier. Global Insight: A Journal of Critical Human Science and Culture, 2.

King, F. (2018). Reflexive control and disinformation in Putin’s wars.

Kling, J., Toepfl, F., Thurman, N., & Fletcher, R. (2022). Mapping the website and mobile app audiences of Russia’s foreign communication outlets, RT and Sputnik, across 21 countries. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-110
  
Kollanyi, B., Howard, P. N., & Woolley, S. C. (2016). Bots and automation over Twitter during the first US presidential debate. Comprop data memo, 1, 1-4.
  
Lanoszka, A. (2019). Disinformation in international politics. European Journal of International Security, 4(2), 227-248.

Laruelle, M., & Limonier, K. (2021). Beyond “hybrid warfare”: a digital exploration of Russia’s entrepreneurs of influence. Post-Soviet Affairs, 37(4), 318-335. {{ ::laurellelimonie2021influencer.pdf |}}
  
Lewis, R., & Marwick, A. (2017). Taking the red pill: Ideological motivations for spreading online disinformation. Understanding and addressing the disinformation ecosystem, 18-22.
  
Walker, S., Mercea, D., & Bastos, M. (2019). The disinformation landscape and the lockdown of social platforms. Information, Communication & Society, 22(11), 1531-1543.

Watling, J., Danylyuk, O. V., & Reynolds, N. (2023). Preliminary Lessons from Russia’s Unconventional Operations During the Russo-Ukrainian War, February 2022–February 2023.
  
Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society, 22(11), 1646-1664.