disinformation

Bibliography

A’Beckett, L. (2013). Strategies to Discredit Opponents: Russian Presentations of Events in Countries of the Former Soviet Union. Psychology of Language and Communication, 17(2), 133-156. https://doi.org/10.2478/plc-2013-0009

Agursky, M. (1989). Soviet disinformation and forgeries. International Journal on World Peace, 6(1), 13–30. http://www.jstor.org/stable/20751319

Alieva, I., Moffitt, J. D., & Carley, K. M. (2022). How disinformation operations against Russian opposition leader Alexei Navalny influence the international audience on Twitter. Social Network Analysis and Mining, 12(1), 1-13.

Arif, A., Stewart, L. G., & Starbird, K. (2018). Acting the part: Examining information operations within #BlackLivesMatter discourse. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1-27.

Atkinson, C. (2018). Hybrid warfare and societal resilience implications for democratic governance. Information & Security, 39(1), 63-76.

Badawy, A., Addawood, A., Lerman, K., & Ferrara, E. (2019). Characterizing the 2016 Russian IRA influence campaign. Social Network Analysis and Mining, 9(1), 1-11.

Bargaoanu, A., & Radu, L. (2018a). Fake News or Disinformation 2.0 – Some Insights into Romanians' Digital Behaviour. Romanian Journal of European Affairs, 18, 24.

Bastos, M., Mercea, D., & Baronchelli, A. (2018b). The geographic embedding of online echo chambers: Evidence from the Brexit campaign. PloS one, 13(11), e0206841.

Bastos, M., & Mercea, D. (2018c). The public accountability of social platforms: Lessons from a study on bots and trolls in the Brexit campaign. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376(2128), 20180003.

Bastos, M., & Farkas, J. (2019a). “Donald Trump is my President!”: The Internet Research Agency propaganda machine. Social Media + Society, 5(3), 2056305119865466.

Bastos, M. T., & Mercea, D. (2019b). The Brexit botnet and user-generated hyperpartisan news. Social science computer review, 37(1), 38-54.

Bastos, M. (2021). This account doesn’t exist: Tweet decay and the politics of deletion in the Brexit debate. American Behavioral Scientist, 65(5), 757-773.

Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.

Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European journal of communication, 33(2), 122-139.

Berkowitz, B. (2000). Information warfare: Time to prepare. Issues in Science and Technology, 17(2), 37-44.

Berliner, D. C. (1992). Educational Reform in an Era of Disinformation.

Bjola, C. (2018). Propaganda as reflexive control: The digital dimension. In Countering Online Propaganda and Extremism (pp. 11-27). Routledge.

Boghardt, T. (2009). Soviet Bloc intelligence and its AIDS disinformation campaign. Studies in intelligence, 53(4), 1-24.

Bouwmeester, H. (2017). Lo and Behold: Let the Truth Be Told—Russian Deception Warfare in Crimea and Ukraine and the Return of ‘Maskirovka’ and ‘Reflexive Control Theory’. In Netherlands Annual Review of Military Studies 2017 (pp. 125-153). TMC Asser Press, The Hague.

Boyte, K. J. (2017). An analysis of the social-media technology, tactics, and narratives used to control perception in the propaganda war over Ukraine. Journal of Information Warfare, 16(1), 88-111.

Bradshaw, S., & Howard, P. (2017). Troops, trolls and troublemakers: A global inventory of organized social media manipulation.

Bradshaw, S., & Howard, P. N. (2018). Challenging truth and trust: A global inventory of organized social media manipulation. The computational propaganda project, 1, 1-26.

Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation.

Cervi, L., & Andrade, A. C. (2019). Post-truth and disinformation: Using discourse analysis to understand the creation of emotional and rival narratives in Brexit. Revista ComHumanitas, 10(2), 125-149.

Chamberlain, P. R. (2010). Twitter as a Vector for Disinformation. Journal of Information Warfare, 9(1), 11-17.

Chen, E., & Ferrara, E. (2022). Tweets in time of conflict: A public dataset tracking the Twitter discourse on the war between Ukraine and Russia. arXiv preprint arXiv:2203.07488.

Collings, D., & Rohozinski, R. (2009). Bullets and Blogs: New media and the warfighter. Army War College, Carlisle Barracks, PA: Center for Strategic Leadership.

Darczewska, J. (2015). The devil is in the details. Information warfare in the light of Russia's military doctrine. Ośrodek Studiów Wschodnich im. Marka Karpia.

Dawson, A., & Innes, M. (2019). How Russia's internet research agency built its disinformation campaign. The Political Quarterly, 90(2), 245-256.

de Cock Buning, M. (2018). A multi-dimensional approach to disinformation: Report of the independent high level group on fake news and online disinformation. Publications Office of the European Union.

Doroshenko, L., & Lukito, J. (2021). Trollfare: Russia’s Disinformation Campaign During Military Conflict in Ukraine. International Journal of Communication, 15, 28.

European External Action Service. (2023). EEAS Report on Foreign Information Manipulation and Interference Threats.

Fallis, D. (2009). A conceptual analysis of disinformation.

Fallis, D. (2014). The varieties of disinformation. The philosophy of information quality, 135-161.

Fallis, D. (2015). What is disinformation?. Library trends, 63(3), 401-426.

Ferrara, E., Chang, H., Chen, E., Muric, G., & Patel, J. (2020). Characterizing social media manipulation in the 2020 US presidential election. First Monday.

Farkas, J., & Bastos, M. (2018, July). IRA propaganda on Twitter: Stoking antagonism and tweeting local news. In Proceedings of the 9th International Conference on social media and society (pp. 281-285).

Flores-Saviaga, C., Keegan, B., & Savage, S. (2018, June). Mobilizing the Trump train: Understanding collective action in a political trolling community. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 12, No. 1).

François, C., Nimmo, B., & Eib, C. S. (2019). The IRA copypasta campaign. Graphika, October.

Franke, U. (2015). War by non-military means: Understanding Russian information warfare.

Giles, K. (2016). Handbook of Russian information warfare.

Giles, K. (2016). The next phase of Russian information warfare (Vol. 20). Riga: NATO Strategic Communications Centre of Excellence.

Golovchenko, Y., Buntain, C., Eady, G., Brown, M. A., & Tucker, J. A. (2020). Cross-platform state propaganda: Russian trolls on Twitter and YouTube during the 2016 US presidential election. The International Journal of Press/Politics, 25(3), 357-389.

Grčar, M., Cherepnalkoski, D., Mozetič, I., & Kralj Novak, P. (2017). Stance and influence of Twitter users regarding the Brexit referendum. Computational social networks, 4, 1-25.

Guimaraes, A., Balalau, O., Terolli, E., & Weikum, G. (2019, July). Analyzing the traits and anomalies of political discussions on reddit. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 13, pp. 205-213).

Hale, H. E., Shevel, O., & Onuch, O. (2018). Believing facts in the fog of war: identity, media and hot cognition in Ukraine’s 2014 Odesa tragedy. Geopolitics, 23(4), 851-881.

Hatch, B. (2019). The future of strategic information and cyber-enabled information operations. Journal of Strategic Security, 12(4), 69-89.

Holland, M. (2006). The propagation and power of communist security services dezinformatsiya. International Journal of Intelligence and CounterIntelligence, 19(1), 1-31.

Hosaka, S. (2019). Putin the ‘Peacemaker’?—Russian Reflexive Control During the 2014 August Invasion of Ukraine. The Journal of Slavic Military Studies, 32(3), 324-346.

Howard, P. N., Ganesh, B., Liotsiou, D., Kelly, J., & François, C. (2019). The IRA, social media and political polarization in the United States, 2012-2018.

Huhtinen, A. M., Kotilainen, N., Särmä, S., & Streng, M. (2021). Information influence in hybrid environment: reflexive control as an analytical tool for understanding warfare in social media. In Research Anthology on Fake News, Political Warfare, and Combatting the Spread of Misinformation (pp. 243-259). IGI Global.

Humprecht, E. (2019). Where ‘fake news’ flourishes: a comparison across four Western democracies. Information, Communication & Society, 22(13), 1973-1988.

Jankowicz, N. (2020). How to lose the information war: Russia, fake news, and the future of conflict. Bloomsbury Publishing.

Karlova, N. A., & Lee, J. H. (2011). Notes from the underground city of disinformation: A conceptual investigation. Proceedings of the American Society for Information Science and Technology, 48(1), 1-9.

Kasapoglu, C. (2015). Russia's Renewed Military Thinking: Non-Linear Warfare and Reflexive Control. NATO Defense College, Research Division.

Kilkenny, E. (2021). Russian Disinformation–The Technological Force Multiplier. Global Insight: A Journal of Critical Human Science and Culture, 2.

King, F. (2018). Reflexive control and disinformation in Putin’s wars.

Kling, J., Toepfl, F., Thurman, N., & Fletcher, R. (2022). Mapping the website and mobile app audiences of Russia’s foreign communication outlets, RT and Sputnik, across 21 countries. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-110

Kollanyi, B., Howard, P. N., & Woolley, S. C. (2016). Bots and automation over Twitter during the first US presidential debate. Comprop data memo, 1, 1-4.

Kux, D. (1985). Soviet active measures and disinformation: overview and assessment. The US Army War College Quarterly: Parameters, 15(1), 17.

Lanoszka, A. (2019). Disinformation in international politics. European journal of international security, 4(2), 227-248.

Laruelle, M., & Limonier, K. (2021). Beyond “hybrid warfare”: a digital exploration of Russia’s entrepreneurs of influence. Post-Soviet Affairs, 37(4), 318-335.

Lewis, R., & Marwick, A. (2017). Taking the red pill: Ideological motivations for spreading online disinformation. Understanding and addressing the disinformation ecosystem, 18-22.

Lemke, T., & Habegger, M. W. (2022). Foreign Interference and Social Media Networks: A Relational Approach to Studying Contemporary Russian Disinformation. Journal of Global Security Studies, 7(2), ogac004.

Llewellyn, C., Cram, L., Favero, A., & Hill, R. L. (2018, May). Russian troll hunting in a Brexit Twitter archive. In Proceedings of the 18th ACM/IEEE Joint Conference on Digital Libraries (pp. 361-362).

Libicki, M. C. (1995). What is information warfare? National Defense University, Washington, DC: Institute for National Strategic Studies.

Linvill, D. L., Warren, P. L., & Moore, A. E. (2022). Talking to Trolls—How Users Respond to a Coordinated Information Operation and Why They’re So Supportive. Journal of Computer-Mediated Communication, 27(1), zmab022.

Lukito, J. (2020). Coordinating a multi-platform disinformation campaign: Internet Research Agency activity on three US social media platforms, 2015 to 2017. Political Communication, 37(2), 238-255.

Marwick, A., & Lewis, R. (2017). Media manipulation and disinformation online. New York: Data & Society Research Institute, 7-19.

McFaul, M., & Kass, B. (2019). Understanding Putin’s intentions and actions in the 2016 US presidential election. Securing American Elections, 1.

McGaughey, E. (2018). Could Brexit be void? King's Law Journal, 29(3), 331-343.

McKay, S., & Tenove, C. (2021). Disinformation as a threat to deliberative democracy. Political Research Quarterly, 74(3), 703-717.

Meister, S. (2018). Understanding Russian Communication Strategy: Case Studies of Serbia and Estonia (p. 60). DEU.

Mejias, U. A., & Vokuev, N. E. (2017). Disinformation and the media: the case of Russia and Ukraine. Media, culture & society, 39(7), 1027-1042.

Narayanan, V., Howard, P. N., Kollanyi, B., & Elswah, M. (2017). Russian involvement and junk news during Brexit. The Computational Propaganda Project: Algorithms, automation and digital politics. https://comprop.oii.ox.ac.uk/research/working-papers/russia-and-brexit

North, S., Piwek, L., & Joinson, A. (2021). Battle for Britain: Analyzing events as drivers of political tribalism in Twitter discussions of Brexit. Policy & Internet, 13(2), 185-208.

Ong, J. C., & Cabañes, J. V. A. (2018). Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines.

Pacheco, D., Flammini, A., & Menczer, F. (2020, April). Unveiling coordinated groups behind White Helmets disinformation. In Companion Proceedings of the Web Conference 2020 (pp. 611-616).

Panov, A. Comparative analysis of the discourse on Brexit in the context of Estonian and Russian language media.

Prier, J. (2017). The command of the trend: Social media as a weapon in the information age. School of Advanced Air and Space Studies, Air University, Maxwell AFB, United States.

Puschmann, C., Ausserhofer, J., Maan, N., & Hametner, M. (2016, April). Information laundering and counter-publics: The news sources of islamophobic groups on Twitter. In Tenth International AAAI Conference on Web and Social Media.

Pynnöniemi, K. P., & Rácz, A. (2016). Fog of Falsehood: Russian Strategy of Deception and the Conflict in Ukraine (p. 320).

Recuero, R., Soares, F. B., & Gruzd, A. (2020, May). Hyperpartisanship, disinformation and political conversations on Twitter: The Brazilian presidential election of 2018. In Proceedings of the international AAAI conference on Web and social media (Vol. 14, pp. 569-578).

Rid, T. (2020). Active measures: The secret history of disinformation and political warfare. Farrar, Straus and Giroux.

Rini, R. (2021). Weaponized skepticism: An analysis of social media deception as applied political epistemology.

Rini, R. (2019). Social media disinformation and the security threat to democratic legitimacy. NATO Association of Canada: Disinformation and Digital Democracies in the 21st Century.

Romerstein, H. (2001). Disinformation as a KGB Weapon in the Cold War. Journal of Intelligence History, 1(1), 54-67.

Rosa, J. M., & Ruiz, C. J. (2020). Reason vs. emotion in the Brexit campaign: How key political actors and their followers used Twitter. First Monday.

Sanchez, L. (2021). Bolstering the Democratic Resilience of the Alliance Against Disinformation and Propaganda. NATO Parliamentary Assembly. https://www.nato-pa.int/document/013-cds-21-edemocratic-resilience-against-disinformation-and-propaganda-report-sanchez

Semchuk, Z., & Petryk, I. (2019). Brexit: Causes and consequences. Scientific notes of Lviv University of Business and Law, 21, 94-98.

Shires, J. (2021, May). Windmills of the Mind: Higher-Order Forms of Disinformation in International Politics. In 2021 13th International Conference on Cyber Conflict (CyCon) (pp. 257-273). IEEE.

Sousa, S., & Bates, N. (2021). Factors influencing content credibility in Facebook’s news feed: Inside view on the United Kingdom (UK) Post-Brexit. Human-Intelligent Systems Integration, 3, 69-78.

Spangher, A., Ranade, G., Nushi, B., Fourney, A., & Horvitz, E. (2018). Analysis of Strategy and Spread of Russia-sponsored Content in the US in 2017. arXiv preprint arXiv:1810.10033.

Starbird, K., Arif, A., & Wilson, T. (2018). Understanding the structure and dynamics of disinformation in the online information ecosystem. University of Washington, Seattle, United States.

Suk, J., Lukito, J., Su, M. H., Kim, S. J., Tong, C., Sun, Z., & Sarma, P. (2022). Do I sound American?: How message attributes of Internet Research Agency (IRA) disinformation relate to Twitter engagement. Computational Communication Research, 4(2), 590-628.

Sultan, O. (2019). Tackling Disinformation, Online Terrorism, and Cyber Risks into the 2020s. The Cyber Defense Review, 4(1), 43-60.

Tanchak, P. N. (2017). The invisible front: Russia, trolls, and the information war against Ukraine. Revolution and war in contemporary Ukraine: The challenge of change, 161, 253.

Teperik, D., Denisa-Liepniece, S., Bankauskaitė, D., & Kullamaa, K. (2022). Resilience Against Disinformation: A New Baltic Way to Follow?

Thomas, T. L. (1996). Deterring information warfare: a new strategic challenge. The US Army War College Quarterly: Parameters, 26(4), 12.

Thomas, T. L. (1996). Russian Views on Information-Based Warfare. Airpower Journal, Air University, Maxwell AFB, AL.

Thomas, T. L. (1998). Dialectical versus empirical thinking: Ten key elements of the Russian understanding of information operations. The Journal of Slavic Military Studies, 11(1), 40-62.

Thomas, T. L. (2000). The Russian view of information war. The Russian Armed Forces at the Dawn of the Millennium, 335.

Thomas, T. (2004). Russia's reflexive control theory and the military. Journal of Slavic Military Studies, 17(2), 237-256.

Till, C. (2021). Propaganda through ‘reflexive control’ and the mediated construction of reality. New Media & Society, 23(6), 1362-1378.

Unver, A. (2022). Securitization of Disinformation in NATO Lexicon: A Computational Text Analysis. Available at SSRN.

Veebel, V. (2015). Russian propaganda, disinformation, and Estonia’s experience. Foreign Policy Research Institute.

Walker, S., Mercea, D., & Bastos, M. (2019). The disinformation landscape and the lockdown of social platforms. Information, Communication & Society, 22(11), 1531-1543.

Watling, J., Danylyuk, O. V., & Reynolds, N. (2023). Preliminary Lessons from Russia’s Unconventional Operations During the Russo-Ukrainian War, February 2022–February 2023.

Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society, 22(11), 1646-1664.

Yang, A. (2019). Reflexive Control and Cognitive Vulnerability in the 2016 US Presidential Election. Journal of Information Warfare, 18(3), 99-122.

Zannettou, S., Caulfield, T., Setzer, W., Sirivianos, M., Stringhini, G., & Blackburn, J. (2019, June). Who let the trolls out? Towards understanding state-sponsored trolls. In Proceedings of the 10th ACM Conference on Web Science (pp. 353-362).

Zannettou, S., Caulfield, T., De Cristofaro, E., Sirivianos, M., Stringhini, G., & Blackburn, J. (2019, May). Disinformation warfare: Understanding state-sponsored trolls on Twitter and their influence on the web. In Companion Proceedings of the 2019 World Wide Web Conference (pp. 218-226).

Zhang, Y., Lukito, J., Su, M. H., Suk, J., Xia, Y., Kim, S. J., … & Wells, C. (2021). Assembling the networks and audiences of disinformation: How successful Russian IRA Twitter accounts built their followings, 2015–2017. Journal of Communication, 71(2), 305-331.
