Nadler, an associate professor of media and communication studies, co-authored Weaponizing the Digital Influence Machine: The Politics and Perils of Online Ad Tech, published in October by Data & Society, a research institute in New York City focused on the social and cultural issues arising from data-centric and automated technologies.
“Online advertising allows parties, candidates, political groups and others the ability to test highly targeted messages that could be extreme or manipulative,” Nadler says. “If I’m a political operative, I could choose a group of people, target them with a variety of images and slogans, and see what generates engagement.”
Digital ads tend to be a more effective tool for these groups than TV ads, which hit a broader audience. These “weaponized” digital influence campaigns seek to amplify existing resentments and anxieties, create distrust and influence decisions.
“I can ensure a digital ad is seen by 1,000 people and I can test it,” Nadler says. “Are they clicking and sharing? How are they engaging? And, I can use it for far more devious things. What are some of the divisions within my opponents’ coalition, and how can I thrash them apart? I could also try to heighten a sense of fear around a vision of an immigrant caravan, for example, and specify who I want to see that ad based on who I think would be most susceptible to such a fear appeal.”
If online ad systems allow these tactics to offer even marginal advantages in close elections, this can have a significant impact on politics. Nadler says that while the last presidential election brought digital ad manipulation to the forefront, it is unknown just how widespread it could have been prior to 2016.
“That was the breaking point,” he says. “Russian operatives were caught using these tactics and pressure was put on Facebook, Google and other tech companies to investigate and then make their findings public. Could it have happened earlier? Certainly. There’s good evidence that they weren’t the only ones using these tactics.”
“As long as these opportunities exist, there will be political influencers who try them,” Nadler says. “But there are also some groups who have ethical norms. In general, the candidates themselves aren’t going to want to be linked to these manipulative tactics.”
Nadler suggests several tactics for curtailing digital ad manipulation, such as requiring explicit, non-coercive user consent for viewing any political ads that are part of a split-testing experiment; refusing to work with dark money groups that fund such ads; and creating community review boards to set ethical guidelines for political advertising.
Students from Nadler’s “Fake News and Propaganda” course read an early version of his report and helped develop some of its ideas. Those insights contributed to the final published report.
“It was really helpful because the entire class got to read an early draft of it,” he said. “They were already collecting research on this new and evolving topic, so it was valuable to hear their own analysis and read some of the sources they found on their own.”
Nadler’s expertise is in conservative news, populism, media policy and digital media culture. He is the author of Making the News Popular: Mobilizing U.S. News Audiences (University of Illinois Press, 2016), named one of the best books on journalism in 2016. He is also a fellow of the Tow Center for Digital Journalism at Columbia University, where his research seeks to understand how communities in metro Philadelphia think about trust in media. —By Ed Moorhouse