University of Notre Dame | October 1, 2018
It was 2012. Tim Weninger was a Ph.D. student surfing Facebook and Reddit. As a computer scientist, his procrastination took a turn when he noticed some alarming trends on the sites. The Islamic State group, commonly known as ISIS, was on the rise in Syria and was using social media to effectively convert and recruit young terrorists from the West. How, Weninger wondered, did they have such sway?
One year later, Weninger was an assistant professor at Notre Dame and continued to study Islamic State activity on social networks. He identified a cadre of individuals within the organization who regularly pushed pro-IS material online by “liking” and retweeting posts, and buried anti-IS reports by downvoting them. By working as a unified group, they significantly altered what the public saw and believed, effectively drawing tens of thousands of recruits to their terrorist ideology.
Perhaps more ominous was a new discovery: Foreign actors could apply the same tactics to influence world events and elections. Weninger wrote a letter to the U.S. Air Force detailing his assessments, including a prediction that Russia would try to influence world events such as the 2016 U.S. election through propaganda and known vulnerabilities in social media systems. The Air Force responded quickly, directing him to continue studying the issue.
Today, Weninger’s resume details projects with the Air Force Office of Scientific Research, the Army Research Office, the Defense Advanced Research Projects Agency (DARPA) and the Pacific Northwest National Laboratory. These projects touch on questions related to how humans consume information online, how people interact in social environments and how digital information shapes beliefs and actions online and offline.
Weninger was optimistic when he began — certainly people could identify and avoid deliberately false or misleading information. But what he uncovered was surprising.