===== Bibliography =====
A’Beckett, L. (2013). Strategies to Discredit Opponents: Russian Presentations of Events in Countries of the Former Soviet Union. Psychology of Language and Communication, 17(2), 133-156. https://doi.org/10.2478/plc-2013-0009

Alieva, I., Moffitt, J. D., & Carley, K. M. (2022). How disinformation operations against Russian opposition leader Alexei Navalny influence the international audience on Twitter. Social Network Analysis and Mining, 12(1), 1-13.
  
Kilkenny, E. (2021). Russian Disinformation–The Technological Force Multiplier. Global Insight: A Journal of Critical Human Science and Culture, 2.

King, F. (2018). Reflexive control and disinformation in Putin’s wars.

Kling, J., Toepfl, F., Thurman, N., & Fletcher, R. (2022). Mapping the website and mobile app audiences of Russia’s foreign communication outlets, RT and Sputnik, across 21 countries. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-110
  
Kollanyi, B., Howard, P. N., & Woolley, S. C. (2016). Bots and automation over Twitter during the first US presidential debate. Comprop Data Memo, 1, 1-4.
  
Laruelle, M., & Limonier, K. (2021). Beyond “hybrid warfare”: a digital exploration of Russia’s entrepreneurs of influence. Post-Soviet Affairs, 37(4), 318-335. {{ ::laurellelimonie2021influencer.pdf |}}
  
Lewis, R., & Marwick, A. (2017). Taking the red pill: Ideological motivations for spreading online disinformation. Understanding and Addressing the Disinformation Ecosystem, 18-22.
  
Walker, S., Mercea, D., & Bastos, M. (2019). The disinformation landscape and the lockdown of social platforms. Information, Communication & Society, 22(11), 1531-1543.

Watling, J., Danylyuk, O. V., & Reynolds, N. (2023). Preliminary Lessons from Russia’s Unconventional Operations During the Russo-Ukrainian War, February 2022–February 2023.
  
Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society, 22(11), 1646-1664.
disinformation.txt · Last modified: 2023/08/06 13:58 by jgmac1106