===== Bibliography =====
A’Beckett, L. (2013). Strategies to discredit opponents: Russian presentations of events in countries of the former Soviet Union. Psychology of Language and Communication, 17(2), 133-156. https://doi.org/10.2478/plc-2013-0009

Agursky, M. (1989). Soviet disinformation and forgeries. International Journal on World Peace, 6(1), 13-30. http://www.jstor.org/stable/20751319
  
Alieva, I., Moffitt, J. D., & Carley, K. M. (2022). How disinformation operations against Russian opposition leader Alexei Navalny influence the international audience on Twitter. Social Network Analysis and Mining, 12(1), 1-13.
  
Golovchenko, Y., Buntain, C., Eady, G., Brown, M. A., & Tucker, J. A. (2020). Cross-platform state propaganda: Russian trolls on Twitter and YouTube during the 2016 US presidential election. The International Journal of Press/Politics, 25(3), 357-389.

Grčar, M., Cherepnalkoski, D., Mozetič, I., & Kralj Novak, P. (2017). Stance and influence of Twitter users regarding the Brexit referendum. Computational Social Networks, 4, 1-25.
  
Guimaraes, A., Balalau, O., Terolli, E., & Weikum, G. (2019, July). Analyzing the traits and anomalies of political discussions on Reddit. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 13, pp. 205-213).
  
Hale, H. E., Shevel, O., & Onuch, O. (2018). Believing facts in the fog of war: identity, media and hot cognition in Ukraine’s 2014 Odesa tragedy. Geopolitics, 23(4), 851-881. {{ :believing_facts_in_the_fog_of_war_identi.pdf |}}

Hatch, B. (2019). The future of strategic information and cyber-enabled information operations. Journal of Strategic Security, 12(4), 69-89.
  
Holland, M. (2006). The propagation and power of communist security services dezinformatsiya. International Journal of Intelligence and CounterIntelligence, 19(1), 1-31.
  
Jankowicz, N. (2020). How to lose the information war: Russia, fake news, and the future of conflict. Bloomsbury Publishing.
  
Karlova, N. A., & Lee, J. H. (2011). Notes from the underground city of disinformation: A conceptual investigation. Proceedings of the American Society for Information Science and Technology, 48(1), 1-9.
  
Kilkenny, E. (2021). Russian disinformation: The technological force multiplier. Global Insight: A Journal of Critical Human Science and Culture, 2.

King, F. (2018). Reflexive control and disinformation in Putin’s wars.

Kling, J., Toepfl, F., Thurman, N., & Fletcher, R. (2022). Mapping the website and mobile app audiences of Russia’s foreign communication outlets, RT and Sputnik, across 21 countries. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-110
  
Kollanyi, B., Howard, P. N., & Woolley, S. C. (2016). Bots and automation over Twitter during the first US presidential debate. Comprop Data Memo, 1, 1-4.
  
Lanoszka, A. (2019). Disinformation in international politics. European Journal of International Security, 4(2), 227-248.
  
Laruelle, M., & Limonier, K. (2021). Beyond “hybrid warfare”: a digital exploration of Russia’s entrepreneurs of influence. Post-Soviet Affairs, 37(4), 318-335. {{ ::laurellelimonie2021influencer.pdf |}}
  
Lewis, R., & Marwick, A. (2017). Taking the red pill: Ideological motivations for spreading online disinformation. Understanding and Addressing the Disinformation Ecosystem, 18-22.
  
Lemke, T., & Habegger, M. W. (2022). Foreign interference and social media networks: A relational approach to studying contemporary Russian disinformation. Journal of Global Security Studies, 7(2), ogac004.

Llewellyn, C., Cram, L., Favero, A., & Hill, R. L. (2018, May). Russian troll hunting in a Brexit Twitter archive. In Proceedings of the 18th ACM/IEEE Joint Conference on Digital Libraries (pp. 361-362).
  
Libicki, M. C. (1995). What is information warfare? National Defense University, Institute for National Strategic Studies, Washington, DC.
  
McFaul, M., & Kass, B. (2019). Understanding Putin’s intentions and actions in the 2016 US presidential election. Securing American Elections, 1.

McGaughey, E. (2018). Could Brexit be void? King's Law Journal, 29(3), 331-343.
  
McKay, S., & Tenove, C. (2021). Disinformation as a threat to deliberative democracy. Political Research Quarterly, 74(3), 703-717.
  
Mejias, U. A., & Vokuev, N. E. (2017). Disinformation and the media: the case of Russia and Ukraine. Media, Culture & Society, 39(7), 1027-1042.

Narayanan, V., Howard, P. N., Kollanyi, B., & Elswah, M. (2017). Russian involvement and junk news during Brexit. The Computational Propaganda Project. Algorithms, Automation and Digital Politics. https://comprop.oii.ox.ac.uk/research/working-papers/russia-and-brexit

North, S., Piwek, L., & Joinson, A. (2021). Battle for Britain: Analyzing events as drivers of political tribalism in Twitter discussions of Brexit. Policy & Internet, 13(2), 185-208.
  
Ong, J. C., & Cabañes, J. V. A. (2018). Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines.
  
Pacheco, D., Flammini, A., & Menczer, F. (2020, April). Unveiling coordinated groups behind White Helmets disinformation. In Companion Proceedings of the Web Conference 2020 (pp. 611-616).

Panov, A. Comparative analysis of the discourse on Brexit in the context of Estonian and Russian language media.
  
Prier, J. (2017). The command of the trend: Social media as a weapon in the information age. School of Advanced Air and Space Studies, Air University, Maxwell AFB, United States.
  
Romerstein, H. (2001). Disinformation as a KGB weapon in the Cold War. Journal of Intelligence History, 1(1), 54-67.

Rosa, J. M., & Ruiz, C. J. (2020). Reason vs. emotion in the Brexit campaign: How key political actors and their followers used Twitter. First Monday.
  
Sanchez, L. (2021). Bolstering the democratic resilience of the Alliance against disinformation and propaganda. NATO Parliamentary Assembly. https://www.nato-pa.int/document/013-cds-21-edemocratic-resilience-against-disinformation-and-propaganda-report-sanchez

Semchuk, Z., & Petryk, I. (2019). Brexit: Causes and consequences. Scientific Notes of Lviv University of Business and Law, 21, 94-98.
  
Shires, J. (2021, May). Windmills of the mind: Higher-order forms of disinformation in international politics. In 2021 13th International Conference on Cyber Conflict (CyCon) (pp. 257-273). IEEE.

Sousa, S., & Bates, N. (2021). Factors influencing content credibility in Facebook’s news feed: Inside view on the United Kingdom (UK) post-Brexit. Human-Intelligent Systems Integration, 3, 69-78.
  
Spangher, A., Ranade, G., Nushi, B., Fourney, A., & Horvitz, E. (2018). Analysis of strategy and spread of Russia-sponsored content in the US in 2017. arXiv preprint arXiv:1810.10033.
  
Walker, S., Mercea, D., & Bastos, M. (2019). The disinformation landscape and the lockdown of social platforms. Information, Communication & Society, 22(11), 1531-1543.

Watling, J., Danylyuk, O. V., & Reynolds, N. (2023). Preliminary lessons from Russia’s unconventional operations during the Russo-Ukrainian War, February 2022–February 2023.
  
Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society, 22(11), 1646-1664.
disinformation.1684890655.txt.gz · Last modified: 2023/05/24 01:10 by jgmac1106