
Project Highlights

Combatting Misinformation: The Role of Local Leaders

Words by Matt Stempeck • Feb 23 2023

As part of Matt Stempeck’s research into the ongoing efforts across the Code for All Network to confront disinformation, we are sharing the most exciting takeaways from those exchanges in the Civic Tech & Disinformation Project Series.

These interviews were conducted in mid-2022. Please note that some details may have changed since then.

Get to know Nichole Sessego’s work combatting misinformation

Nichole Sessego has spent the last few years focused on misinformation and election integrity work at some of the major social media platforms. We worked together on the Digital Organizing team at Hillary for America in 2016. 

Nichole is also an Integrity Fellow at the newly formed Integrity Institute. The Integrity Institute is a unique organization for this exploration because it’s made up of former platform integrity workers (the tech companies’ name for the teams that work on critical issues like content moderation and user safety) who have left the companies to address these issues from a civil society perspective.

Given the massive volume of existing work in the disinfo space, this Code for All research project focuses on what civic tech, specifically, can contribute.

When I ask Nichole which approaches to managing misinformation she’s found particularly effective, she tells me, “It’s sort of what works in politics and a lot of other spaces, which is getting people that communities trust to go out there, ideally proactively, but really at any point, with the truth and just offer authoritative information. People are always going to just trust people that they know, [so we need to] figure out who those local leaders are.” ‘Local’ here can be geographically local, or ‘local’ as in relevant to the communities of affinity that dominate the internet.

Local leaders influence communities

“If you have someone you trust who’s willing to go in and tell their story, their truth, that’s the most effective thing,” Nichole adds. “But it’s impossible to scale. It’s like relational organizing [where people leverage their genuine relationships to have more meaningful conversations about politics].” There are some ways to begin to scale up this practice, such as when a community views a single influential person as an expert on a subject. But in the case of COVID misinformation, Nichole says, people vilified even their own family doctors. So we need to systematically consider who’s trusted in a given community. She gives the positive, and powerful, example of the Mormon Church coming out in favor of vaccination.

Nichole views our increasing difficulty in identifying trusted authorities as a commentary on where we are as a society these days, including the deep consequences of breakdowns in civil relations between people. She sees it as a particularly acute issue in the United States, where, coupled with disinvestment in education, the public discourse has become a real mess.

Limited replicability

We know that trusted speakers conveying information is a reliable way to reach a given community. But Nichole’s not sure anyone’s figured out how to scale the approach. “It’s super effective, but really, really, really hard to scale. I think creative solutions for that could be really great and exciting, and something that people should be (and are!) looking into.”

Limits of fact-checking

“People were viewing third-party fact-checking as a silver bullet,” she says, “but people distrust those groups as well, and they make mistakes.” Research is mixed on how effective fact-checking is, Nichole says, partially because it can prompt people to dig in their heels on their (mis)beliefs.

“I think sharing authoritative information and resources is really great,” Nichole says. “That’s going to help a lot of people online. It will not help some of the people who are already really far down a certain path.” We need to keep in mind that tactics like fact-checking aren’t going to get through to everyone in a given community.

Fact-checkers often systematically address the misinformation that’s trending, based on data that the social platforms share, but the virality of a piece of misinformation doesn’t necessarily equate with its potential harm. The most harmful misinformation may be specific to a smaller demographic or community.

Data access

Nichole’s favorite project for managing misinformation “was” CrowdTangle, which allows marketers, researchers, and others to follow public Facebook conversations at scale. The platform is easy to use and helps journalists discover major stories, while Ph.D. researchers use its API to conduct high-volume social research. Nichole says that Facebook’s support for the product facilitated its wider use in 2020, a major election year in the US. The tool allowed researchers to search the text in images, for example, bringing misinformation spread via image memes into scope.
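For a sense of what that API-driven research looked like in practice, here is a minimal Python sketch of a CrowdTangle keyword search. The endpoint and parameter names follow CrowdTangle’s public API documentation, but access required an approved token, and response field names should be verified against a live response; the token here is a placeholder.

```python
# Minimal sketch: searching public posts via the CrowdTangle API.
# Assumes an approved API token; endpoint and parameter names follow
# CrowdTangle's public docs, but verify response fields before relying
# on them.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder; issued via the CrowdTangle dashboard

def search_posts(term: str, count: int = 100) -> list[dict]:
    """Return public posts matching `term`."""
    resp = requests.get(
        "https://api.crowdtangle.com/posts/search",
        params={"token": API_TOKEN, "searchTerm": term, "count": count},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("result", {}).get("posts", [])

if __name__ == "__main__":
    for post in search_posts("voting machines", count=10):
        # 'date' and 'postUrl' are fields documented in the API reference
        print(post.get("date"), post.get("postUrl"))
```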

But Facebook has since undercut CrowdTangle’s utility for misinformation research and journalism: it disbanded the CrowdTangle team, paused all development, and re-assigned employees to different divisions, removing the product’s founder from running it. That founder, Brandon Silverman, subsequently testified to the US Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law about the wide range of important civil society uses for CrowdTangle. Over time, those use cases may no longer be possible because of Facebook’s moves.

Nichole doesn’t view Facebook’s move to freeze CrowdTangle’s development as malicious, but she’s hopeful that the Platform Accountability and Transparency Act (PATA) will pass; the bill would require large social platforms to protect data access for university-affiliated researchers.

Nichole’s now at the Integrity Institute, which pushes social platforms to be more transparent. With her inside knowledge of what works and what doesn’t, she’s looking to mitigate the worst effects of disinformation.

She’d like to see more exploration of disinfo-fighting treatments that add context back to viral disinfo. Rather than directly refuting a piece of media someone has just viewed, perhaps we could give them additional contextual information about it: When is the image really from? What comes up in a reverse image search on it? Providing users with additional context like this may not trigger the strong defensive reaction that simply labeling things as ‘FALSE’ can.
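As an illustration of what such a context-adding treatment might look like under the hood, here is a hypothetical Python sketch that matches a newly viewed image against an index of previously verified images using perceptual hashing (via the open-source imagehash library). The index contents, hash value, and distance threshold are all illustrative assumptions, not anyone’s production system.

```python
# Illustrative sketch (not any platform's actual system): match a viral
# image against an index of previously verified images via perceptual
# hashing, so its original context can be surfaced to the viewer.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical index mapping a perceptual hash to known context.
VERIFIED_INDEX = {
    imagehash.hex_to_hash("d1c4f0e0b8c89070"): "Wire photo, 2015 earthquake coverage",
}

MAX_DISTANCE = 8  # Hamming-distance threshold; would need tuning on real data

def lookup_context(path: str) -> str | None:
    """Return stored context for a near-duplicate image, if any."""
    candidate = imagehash.phash(Image.open(path))
    for known_hash, context in VERIFIED_INDEX.items():
        if candidate - known_hash <= MAX_DISTANCE:  # '-' gives Hamming distance
            return context
    return None

if __name__ == "__main__":
    context = lookup_context("viral_meme.jpg")
    print(context or "No match; fall back to a reverse image search.")
```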

Fact-checking isn’t cheap, but Meta funds the International Fact-Checking Network. The company has relationships with various NGOs and open communication with human rights teams. But there isn’t a great mechanism to enable civil society organizations to provide the platforms with the critical context that lets the companies understand what they’re looking at with flagged content. The platforms have their own internal systems and signals to prioritize which content to respond to.

In addition to improved systems for civil society expert organizations, Nichole sees the following needs:

  • De-monetizing disinformation
  • De-platforming disinformation promoters (or related strategies)
  • Updating citizens’ media literacy
  • Platform interventions
  • Hardening civil society and civic institutions

Nichole indicates that there’s growing commercial interest in pushing back on digital disinformation. This makes sense, as an untrustworthy marketplace is bad for business. Big companies have the resources to invest to protect their brands, for example, which could drive action from unexpected quarters.

Interested in learning more?

Matt Stempeck, the National Democratic Institute, and Code for All worked on a research project to understand how civic tech can help confront disinformation. The project’s goal is to learn (and share out!) lessons from organizations that focus on this area.

Check out the Disinformation Research Project to learn more and help us spread the word!



Matt Stempeck

Disinformation Research Consultant

Matt Stempeck is a freelance technologist based in Lisbon and Berlin, where he researches civic tech and builds engagement products for clients like the Boston Globe. He serves on mySociety's The Impacts of Civic Tech Steering Group, and recently authored People Powered's Guide to Digital Participation Platforms. Matt is a Technologist in Residence at Cornell University, where he's helped launch Cornell Tech's Public Interest Tech program.
