Research Article
DOI: 10.1145/3400806.3400817

Coordinated Link Sharing Behavior as a Signal to Surface Sources of Problematic Information on Facebook

Published: 22 July 2020

Abstract

Despite widespread concern over the role played by disinformation during recent electoral processes, the intrinsic elusiveness of the subject hinders efforts aimed at estimating its prevalence and effects. While there has been a proliferation of attempts to define, understand and fight the spread of problematic information in contemporary media ecosystems, most of these attempts focus on detecting false content and/or bad actors. For instance, several existing studies rely on lists of problematic content or news media sources compiled by fact-checkers. However, these lists may quickly become obsolete, leading to unreliable estimates. Using media manipulation as a frame, along with a revised version of the original definition of “coordinated inauthentic behavior”, we argue in this paper for a wider ecological focus. Leveraging a method designed to detect “coordinated link sharing behavior” (CLSB), we introduce and assess an approach aimed at creating, and keeping updated, lists of potentially problematic sources by analyzing the URLs shared on Facebook by public groups, pages, and verified profiles. We show that CLSB is consistently associated with a higher risk of encountering problematic news sources across three different datasets of news stories and can thus be used as a signal to support manual and automatic detection of problematic information.
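To make the signal concrete: CLSB detection rests on identifying pages, groups, and verified profiles that repeatedly share the same URLs within an unusually short time of one another (the authors distribute a full implementation as the CooRnet R package). The snippet below is only a minimal, hypothetical Python sketch of that idea, not the paper's method: the sample share records, the 60-second coordination window, and the two-URL repetition threshold are illustrative assumptions rather than values estimated in the study.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical share records: (entity_id, url, unix_timestamp). In the paper the
# shares come from CrowdTangle; these values are made up for illustration.
shares = [
    ("page_A", "http://example.com/story1", 1000),
    ("page_B", "http://example.com/story1", 1012),
    ("page_C", "http://example.com/story1", 5000),
    ("page_A", "http://example.com/story2", 2000),
    ("page_B", "http://example.com/story2", 2030),
]

COORDINATION_WINDOW = 60  # seconds; illustrative, not the paper's estimated interval
MIN_REPETITIONS = 2       # a pair must co-share at least this many distinct URLs

# Group shares of the same URL together.
shares_by_url = defaultdict(list)
for entity, url, ts in shares:
    shares_by_url[url].append((entity, ts))

# For each pair of entities, count how many distinct URLs they both shared
# within the coordination window of one another.
pair_counts = defaultdict(int)
for url, posts in shares_by_url.items():
    near_simultaneous_pairs = set()
    for (e1, t1), (e2, t2) in combinations(posts, 2):
        if e1 != e2 and abs(t1 - t2) <= COORDINATION_WINDOW:
            near_simultaneous_pairs.add(tuple(sorted((e1, e2))))
    for pair in near_simultaneous_pairs:
        pair_counts[pair] += 1

# Entities that repeatedly co-share the same links almost simultaneously are
# candidates for membership in a coordinated network.
coordinated = {pair: n for pair, n in pair_counts.items() if n >= MIN_REPETITIONS}
print(coordinated)  # {('page_A', 'page_B'): 2}
```

In the full method the coordination window is derived from the observed distribution of share times rather than fixed in advance, and flagged entities are further grouped into networks; the sketch only illustrates why repeated, near-simultaneous co-sharing of the same links works as a signal.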



Information & Contributors

Information

Published In

SMSociety'20: International Conference on Social Media and Society
July 2020
317 pages
ISBN:9781450376884
DOI:10.1145/3400806
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 July 2020


Author Tags

  1. CrowdTangle
  2. Facebook
  3. coordinated inauthentic behavior
  4. disinformation

Qualifiers

  • Research-article
  • Research
  • Refereed limited


Conference

SMSociety'20

Acceptance Rates

Overall Acceptance Rate 78 of 189 submissions, 41%


Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months): 41
  • Downloads (Last 6 weeks): 3
Reflects downloads up to 05 Mar 2025


Citations

Cited By

  • (2025) Computational Analysis of Communicative Acts for Understanding Crisis News Comment Discourses. Social Networks Analysis and Mining, 226-242. DOI: 10.1007/978-3-031-78538-2_20. Online publication date: 25-Jan-2025.
  • (2024) Bias and Polarization in the Qatargate Scandal: A Social Media Perspective. Social Media + Society, 10(4). DOI: 10.1177/20563051241306323. Online publication date: 16-Dec-2024.
  • (2024) Multifaceted online coordinated behavior in the 2020 US presidential election. EPJ Data Science, 13(1). DOI: 10.1140/epjds/s13688-024-00467-0. Online publication date: 19-Apr-2024.
  • (2023) Computational Communication Methods for Examining Problematic News-Sharing Practices on Facebook at Scale. Social Media + Society, 9(3). DOI: 10.1177/20563051231196880. Online publication date: 26-Sep-2023.
  • (2023) A Workflow to Detect, Monitor, and Update Lists of Coordinated Social Media Accounts Across Time: The Case of the 2022 Italian Election. Social Media + Society, 9(3). DOI: 10.1177/20563051231196866. Online publication date: 9-Sep-2023.
  • (2023) Coordinated Botnet Detection in Social Networks via Clustering Analysis. Proceedings of the 52nd International Conference on Parallel Processing Workshops, 192-196. DOI: 10.1145/3605731.3608959. Online publication date: 7-Aug-2023.
  • (2023) A language framework for modeling social media account behavior. EPJ Data Science, 12(1). DOI: 10.1140/epjds/s13688-023-00410-9. Online publication date: 23-Aug-2023.
  • (2022) Rapid Sharing of Islamophobic Hate on Facebook: The Case of the Tablighi Jamaat Controversy. Social Media + Society, 8(4). DOI: 10.1177/20563051221129151. Online publication date: 4-Nov-2022.
  • (2022) Cross-platform spread: vaccine-related content, sources, and conspiracy theories in YouTube videos shared in early Twitter COVID-19 conversations. Human Vaccines & Immunotherapeutics, 18(1), 1-13. DOI: 10.1080/21645515.2021.2003647. Online publication date: 21-Jan-2022.
  • (2022) How coordinated link sharing behavior and partisans' narrative framing fan the spread of COVID-19 misinformation and conspiracy theories. Social Network Analysis and Mining, 12(1). DOI: 10.1007/s13278-022-00948-y. Online publication date: 20-Aug-2022.
