Carley Helms New Center To Fight Online Disinformation
$5M Knight Foundation Investment Creates Center for Informed Democracy and Social Cybersecurity
Bots, trolls, state-run propaganda, information warfare and hate speech are some of the most pervasive ways that societal discourse is being warped in the modern era. Carnegie Mellon University today announced the creation of a new research center dedicated to the study of online disinformation and its effects on democracy, funded by a $5 million investment from the John S. and James L. Knight Foundation. The new center will bring together researchers from within the institution and across the country.
The Center for Informed Democracy and Social Cybersecurity (IDeaS) will study how disinformation is spread through online channels, such as social media, and address how to counter its effects to preserve and build an informed citizenry. Directed by Kathleen M. Carley, professor in the School of Computer Science's Institute for Software Research, the center will take a multidisciplinary approach, engaging researchers from across the university to examine and develop responses to both technological and human facets of the issue. Douglas Sicker, head of Engineering and Public Policy in the College of Engineering, and David Danks, head of Philosophy in the Dietrich College of Humanities and Social Sciences, will be co-directors.
"Addressing the complex issues posed by online disinformation requires robust collaboration from experts spanning sociology and economics, technology and communication theory," said J. Michael McQuade, vice president for research at Carnegie Mellon. "This initiative aligns perfectly with the vibrant collaborative ecosystem we have built at CMU to solve challenges such as these, which are among the most pressing faced by our society. We are thrilled to partner with Knight Foundation on this critical issue."
The center will have three main objectives: expand research across the full spectrum of relevant topics, build an interconnected community among the more than 2,000 people working in the field, and educate journalists and policymakers. Its research will focus on topics such as how to better recognize disinformation online, how to identify who is spreading it, how to inoculate groups against it and how to counter it.
"We are in the middle of a social media war. It's being conducted across Twitter and Facebook, websites, Reddit, pick your favorite media." — Kathleen M. Carley
The six-year investment will fund Knight Fellows, graduate students who will deepen their research in related areas, and will sponsor an annual conference of scientists, practitioners, journalists and policymakers to discuss research and public policy. The grant is part of a broader Knight Foundation initiative committing nearly $50 million to research on technology's impact on democracy, with cross-disciplinary funding going to 11 universities and other research and advocacy organizations.
"We are in the middle of a social media war. It's being conducted across Twitter and Facebook, websites, Reddit, pick your favorite media. It is about to get worse with the increased sophistication of bots, memes and the beginning of deep fakes," Carley said. "Other countries and non-state actors use these tools to impact and shape what you read and who you talk to on social media. The United States doesn't have the tools or the policies it needs to respond. In IDeaS, our goal is to change this."
While the public is exposed to many kinds of disinformation, Carley notes that the most insidious kind is the one the broader public cannot easily recognize, such as innuendo and logical fallacies. These are increasingly disseminated by large, orchestrated campaigns designed to influence group behavior rather than simply to convey information. For example, in recent elections in the United States, Sweden, Germany, the United Kingdom and other countries, bots spread disinformation to groups on both sides of contentious issues to polarize them against each other. As a result, people in the exposed groups stopped listening to outside information and began operating from an emotional rather than a rational state. This created barriers to communication and understanding between the groups and, in some cases, led to protests.
The center's goal is to develop effective solutions with teams of experts in network analysis, machine learning and natural language processing to recognize how the information is being spread; sociologists, psychologists and philosophers to analyze the individual and group response; and public policy researchers to address issues of governance.
"There is clear evidence that, for all the positive potential of the internet, it has made our democracy vulnerable to misinformation and manipulation," said Sam Gill, Knight Foundation vice president for communities and impact. "Solutions will come from deeper understanding. As one of the world's leading centers of technology research, CMU is perfectly positioned to jump into the breach."
IDeaS will build upon the renowned body of multidisciplinary research from CMU's Center for Computational Analysis of Social and Organizational Systems (CASOS), which brings together network analysis, computer science and organization science; and CyLab, the university's security and privacy institute.
"This new center is at the heart of what Carnegie Mellon does best as the pioneer of computer science, but also as an institution with a core value of ensuring the effects of technology are positive," said Tom Mitchell, interim dean of the School of Computer Science. "We're uniquely positioned through our deep expertise and strong collaborative culture to create real-world solutions at this critical juncture for society."