In This Presidential Campaign, Disinformation May Be the Winner

As polarization grows and tech platforms hesitate to take down misleading messages, false information is gaining traction, Fletcher School experts say

The outcome of the 2024 U.S. election is still unknown, but one clear winner of this campaign season is disinformation, according to professors at The Fletcher School. This false information, which is spread deliberately to deceive voters and advance political goals, is both dangerous and difficult to regulate, the experts said.

False claims have been flying in political debates and on social media, with real-world consequences, said Bhaskar Chakravorti, Fletcher’s dean of global business, as he introduced a panel discussion on the topic on October 15.

Disinformation “is being fed to citizens and potentially affecting the functioning of democratic institutions,” he said. For example, he pointed to how false claims about hurricane disaster response were hampering FEMA’s recovery efforts in North Carolina. He also suggested that false claims about the outcome of the last presidential election are sowing doubt among voters about whether this fall’s results should be trusted.

Here are three key takeaways from the discussion.

Lies can be legal.

Major social media platforms don’t want to remove posts that might be disinformation for two reasons: They don’t want to be in the position of deciding what is true and what is false, and they know that free speech has strong legal protections in the United States, said Josephine Wolff, associate professor of cybersecurity policy at Fletcher and associate professor of computer science at the School of Engineering.

“There’s willingness to provide labels” on nonpolitical disinformation, she said. But “there’s not a huge amount of appetite for removing stuff.”

This is a change from 2020, when platforms seemed to be trying to do more, and it’s partly due to recent U.S. legal cases regarding free speech and social media, said Wolff, who is also director of the Hitachi Center for Technology and International Affairs.

If you propose that the government should somehow control information on social media, “you immediately run into these massive First Amendment problems,” she said. “Lies are not great in lots of contexts. They are legal.”

In particular, speech that is considered political receives high levels of protection under the First Amendment, so the platforms don’t want to touch it, Wolff said. But more and more topics—such as information about COVID-19—have become politicized. Therefore “everything very quickly becomes political speech” that the platforms let stand, she said.

If the government can’t criminalize disinformation, Wolff said, the remaining options for social media platforms may be to label the disinformation and to demote it algorithmically so that fewer people see it. But “actually removing it, I think, becomes politically tricky for the platforms and legally very tricky for a lot of governments.”

People can’t always tell truth from fiction.

While we might hope that informed citizens in a democracy can identify disinformation on their social media feeds and elsewhere, evidence suggests that people in fact can’t distinguish between truth and fiction when it comes to this sort of manipulation, said Thomas Cao, assistant professor of technology policy at Fletcher.

There is no evidence that all citizens “have the capacity to distinguish between what is true and what is misinformation,” he said. This is “because a lot of the misinformation we see on social media is specifically designed to cater to … a particular demand from a certain segment of voters.”

Unlike official propaganda from authoritarian regimes in other countries, which might be more obviously false or clearly designed to promote a specific political perspective, disinformation in the United States is often more potent because it “can be concealed as a reasonable viewpoint spreading some kind of information [where] the political agenda is not particularly obvious,” Cao said.

Politically driven speech in the United States also can be regarded as “just freedom of speech of parties spreading their own agenda and nothing particularly concerning,” he said. But “if you mix truth and falsehood together when spreading this kind of information, I think that presents a unique danger.”

Controlling information strengthens political power.

While there are differing opinions about regulating content on social media platforms, political leaders clearly understand that the control of information is linked to power, said Monica Duffy Toft, professor of international politics, director of the Center for Strategic Studies, and academic dean at Fletcher.

In other countries, such as Poland before its 2023 election, government leaders have increased their power over the judiciary and the media and attacked the opposition as a way of consolidating control, Toft said. In such circumstances, “no longer are fellow elites or citizens seen as loyal opposition challenging the state, but they become enemies of the state,” she added. She pointed to how former President Donald Trump recently used the phrase “enemy from within” to describe Rep. Adam Schiff, a Democrat who oversaw the congressional investigation that led to Trump’s first impeachment. Schiff is now running for the Senate.

“We have seen this before,” Toft said. “There is a playbook … [for how you] get control. You recognize the importance of the media information space, you use it and you frame and you lie and you do misinformation and you say it enough and people start to believe it. … And that systematic effort we’re starting to see is really playing out in the United States with this election.”

Toft said Americans tend to have too much faith in technology and to value freedom over regulation. “But I think in this case we need to really be guarded…. Technology in the wrong hands, when it’s not used by sage, wise people with good intentions, can really go awry. And I think that’s where we’re at.”
