

Disinformation and Civic Tech Research

Earlier this year, the National Democratic Institute and Code for All commissioned Matt Stempeck to explore what our partner organizations around the world have learned in their work confronting digital disinformation. Here you can explore top findings around approaches that have worked, as well as the greatest hits from our global network!

Matt recently hosted a webinar session with NDI and Code for All. Watch the recording and check out the presentation!

Disinformation and Civic Tech Playbook

The Disinformation and Civic Tech Playbook is a tool for people interested in understanding how civic tech can help confront disinformation. This guide will help you successfully advocate for and implement disinfo-fighting tools, programs, and campaigns from partners around the world.

Test your assumptions

While disinformation is nothing new, it spreads much faster now, often riding a wave of emotion through personal social networks. Disinformation comes from a variety of sources and has evolved into a complex phenomenon with real-world impacts.

So, before you dive into the Playbook, we want to test your assumptions in a 1-minute survey. Are you ready?


Index

The playbook includes:

  • Directly applicable program ideas your organization can adopt
  • Results of a needs assessment for civil society organizations fighting disinfo
  • Access to an open collection of 450 disinfo-fighting tools, programs, and campaigns
  • Field-wide recommendations for how we move forward from here

The misinformation management workflow

To fight misinformation effectively at a societal scale, three stages of work must be completed in sequence:

  1. Monitor or research the media environment (traditional, social, and/or messaging apps) for misinformation
  2. Verify and/or debunk
  3. Reach people with the truth and counter-message falsehoods

These stages ascend from least impactful to most impactful activity.

Researching misinformation in the media environment has no effect whatsoever on its own. Verifying and debunking falsehoods have limited utility unless stage three is also achieved: successfully reaching communities with true information in a way that gets through to them, and effectively counter-messaging the misinformation that spreads so easily.

Unfortunately, the distribution of misinformation management projects to date seems to be the exact inverse of these stages. There has been an enormous amount of work to passively monitor and research media environments for misinformation. A large amount of energy and resources is also dedicated to verifying and debunking misinformation through traditional fact-checking approaches. Whether because it's the hardest stage or simply the last in the sequence, relatively few misinformation management projects have made it to the final stage: genuinely getting through to people and experimenting with effective counter-messaging and counter-engagement (see The Sentinel Project interview for further discussion).

Fact-checking

Funders may want to consider de-prioritizing pure fact-checking projects. The approach has gained incredible popularity around the globe, representing over 25% of our database of disinfo-fighting projects, including outlets in places like Syria that don’t usually show up on civic tech maps.

But research is mixed on whether these initiatives have the intended effect in the context of broader disinformation warfare. The journalistic model of publishing judgments to a passive audience is also outdated. Some organizations even pay to amplify their debunks of popular disinfo on social platforms, creating the galling possibility that they are doing more to spread disinformation narratives than to limit them.

There’s more hope to be had in groups like Code for Africa, which has hired actual digital forensics experts to uncover non-obvious patterns. Fact-checking is part of this work, but far from the only focus.

A sound fact-checking program in 2022 must:

  • Demonstrate awareness of the broader disinfo environment, where even fact-checkers can be used as Trojan horses to amplify a narrative.

  • Demonstrate a credible strategy for reaching communities via push tactics (radio partners, syndication through media alliances, deals with social platforms and messaging apps) rather than counting on pull approaches (the public visiting their website, in most cases, or tagging them on social media and messaging apps).

  • Study how their debunks are actually received by the public, and experiment with alternative approaches and formats for improving results.

This list is just a start, but unfortunately, many fact-checking publications fail even this basic rubric.

Some organizations that started as fact-checkers have continued to innovate on the model and expand their offerings. These organizations, especially those that have already survived over a decade, are home to considerable institutional knowledge as well as experimentation and should be supported. But for every one of these organizations, there are probably five publications following the same old fact-checking routine.

Media and social media analysis

Media and social media analysis is a booming area of practice, including significant commercial activity. While the practice predates the current wave of disinformation work, the disinfo response has driven its significant expansion (including tooling, discrete projects, organizations working in this area, and so on). The migration of users from public social networks to more closed, encrypted messaging apps has also driven work to better understand disinformation on these platforms.

Media literacy

Media literacy programs are a common approach in disinfo work, possibly second only to fact-checking publications, but rigorous research on their effectiveness is harder to find. The programs tend to target student and youth populations or act as public service announcements. One critique of this approach is that it has proliferated because it's fairly unobjectionable, rather than because it's particularly effective. At least one organization shared that the larger their media literacy public service campaign grew, the less impactful its message construction and delivery seemed to be, as major partners diluted the content.

Media partnerships

Media partnerships are a successful vehicle for impact. Newsroom partners solve the marketing and distribution challenge that has long hindered classic civic tech projects from gaining significant traction.

This also includes media-only partnership networks. The unholy combination of disinformation and elections is apparently what it took for rival media companies to collaborate, and now we’ve seen several high-profile and well-populated newsroom disinfo partnerships in regions around the world.

Needs Assessment

Funding

Perhaps not surprisingly, funding was the most-stated need. But within this answer there was considerable nuance. Respondents emphasized the inconsistency of funding more than its complete absence. Organizations wish funders would view misinformation management as the chronic annual concern that it is, rather than a trendy issue that can be 'solved' by a given solution.

Organizations fear funders will move on from this trend as new issues pop up each year. There’s significant research to suggest that media attention to certain areas of the world, or current events, influences funding levels for various places and causes. There’s potential to build a coalition of funders committed to serving the communities most in need, rather than the causes and places most in fashion. Perhaps connecting the donor community to the effective altruism movement could seed this idea.

Organizations also stress about aligning the work they believe will be most impactful with a funder's current interests, or the theme of a given grant round. In terms of what works, founders of successful organizations say that raising their budget each year is a constant challenge, but that diversification is key to long-term sustainability. In general, most groups accept funds from foreign governments, tech companies, and other sources as long as they can maintain their editorial independence. There was little evidence, anywhere, of market support for this work.

Some organizations have figured out how to self-organize with complementary groups and pitch funders as a package deal. In these cases, the groups work together, each contributing their strengths, often with a lead grantee who organizes the others and can sub-grant funds. Respondents from multiple continents shared that funders appreciate when civil society groups team up like this, as it delivers more impact per funding dollar.

Knowledge Exchange

In the needs assessment, peer exchange of knowledge came second only to funding. Interview subjects repeatedly brought this up as a resource they would both participate in and benefit from. NDI or another funder could convene the disinfo practitioner community to candidly share what they’re working on, how they do it, and what’s working and what’s not. 

This would ideally be an in-person event, with participation limited to practitioners so that the focus stays on a candid exchange of knowledge rather than on representing their organizations to the broader community. The one possible exception is application-minded academics, since there's strong interest in closer research partnerships to understand the impacts of this work.

Knowledge exchange could also take the form of microgrants that give organizations the time to capture their lessons learned, specifically the details of which approaches are effective or ineffective against specific types of misinformation. To ensure discoverability, these lessons should be hosted in a central, consistent location that others can easily find.

Technical knowledge also came up as a need: simply knowing how to do certain things, or whether certain approaches work. At the same time, most respondents said they don't consult academic literature on this work.

Improved research partnership models could both build technical knowledge and drive rigorous impact evaluation. There's a boom in disinfo work as well as in academic investigation of it, so it certainly feels like a waste that the two aren't more closely linked. Respondents identified a gap in the practicality of the academic literature, with long publishing cycles and elite language making published findings difficult to interpret and apply. They also shared that they haven't been impressed by the findings they have seen.

There is hope, though: some organizations have commissioned academic researchers to work in-house to evaluate impact, and were able to apply the findings in well under twelve months (see Chequeado post).

A conference or other format for bringing together the most practical, application-forward academics with the most evaluation-hungry practitioners could bear fruit. I’ve suggested MySociety’s TICTeC event (which is considering narrowing to specific themes) and MisinfoCon (when it’s hosted at a university) as potential partners or models.

Globally distributed research partnerships

Another knowledge theme was interest in how the same issue or platform plays out in different contexts globally. For example, how does misinfo move on WhatsApp or Telegram? What's the best way to tackle disinfo in the lead-up to a major national election? What role can the advertising industry, with its massive campaign budgets, play in shaping public narratives? Supporting a research project that pulls together a geographically wide range of partners might be worthwhile. Groups like Global Voices have conducted this type of research, leveraging their global contributor network to chronicle, for example, how Facebook's Free Basics program materialized in different markets.

Access to data

Access to closed, proprietary, ever-shifting social media network data is a challenge for studying this space. It has been known for well over a decade that Twitter is a poorly representative sample of the public sphere, yet to this day its data remains far easier for researchers and others to access, limiting the applicability of their findings. Facebook's decision to acquire and then de-resource CrowdTangle will only exacerbate this issue. Legislative efforts to enforce the sharing of societally important data with qualified researchers are critical to this field.

Respondents reflected that one area where they could gain significant efficiencies is the collective pooling of disinfo data. There are challenges here, though, as seen in Meta's aggressive moves to shut down the NYU Ad Observatory. A central organizing party could bring this resource into being, although it would want to iteratively validate that competing organizations would both contribute to and use the resource.
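
To make the pooling idea concrete, here is a rough sketch of what a shared, claim-level record might look like. Everything here is my assumption for illustration: the field names, values, and the DisinfoClaim name are hypothetical, not an agreed standard.

```python
from __future__ import annotations

from dataclasses import dataclass
from datetime import date

# A hypothetical claim-level record for a pooled disinfo dataset.
# All field names are illustrative; a real shared schema would need
# to be negotiated by the contributing organizations.
@dataclass
class DisinfoClaim:
    claim_text: str                # the falsehood as observed
    first_seen: date               # earliest observation date
    platforms: list[str]           # e.g. ["facebook", "whatsapp"]
    languages: list[str]           # ISO 639-1 codes, e.g. ["es"]
    verdict: str                   # e.g. "false", "misleading", "unverified"
    source_org: str                # which member organization contributed it
    debunk_url: str | None = None  # link to a published debunk, if any

# Example record a member organization might contribute.
example = DisinfoClaim(
    claim_text="Fabricated claim that ballots can be cast by text message",
    first_seen=date(2022, 5, 1),
    platforms=["whatsapp"],
    languages=["es"],
    verdict="false",
    source_org="example-member-org",
)
```

A shared format along these lines would let organizations deduplicate claims that cross borders and platforms, which is where the pooled efficiencies would come from.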

Database analysis

Across the 446 projects in our database, some patterns are clear:

  • Disinformation tools and platforms are the most common type of project in the database, but only 60% are still active.

  • Fact-checking publications are also a very common type of project and cover a remarkable geographic spread.

  • The USA dominates the geographic distribution of projects, and while our data is likely biased toward US projects, this finding likely stands.

At least 88 of the projects in the database are no longer active. Evaluating these defunct projects, we can identify some traits that appear to correlate with failure (see the analysis sketch after this list):

  • Browser extensions and plug-ins of any kind.

  • Projects that depend on crowdsourced data collection to drive their core activity.

  • For-profit disinformation fighting services.
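
For readers who want to reproduce this kind of breakdown, below is a minimal sketch of how the headline figures could be computed from a flat export of the database. The file name projects.csv and the columns (type, country, status) are assumptions for illustration, not the database's actual schema.

```python
import pandas as pd

# Hypothetical flat export of the 446-project database.
projects = pd.read_csv("projects.csv")  # assumed columns: name, type, country, status

# Most common project types (e.g. tools/platforms, fact-checking publications).
print(projects["type"].value_counts().head())

# Share of projects still active, overall and by type
# (e.g. the ~60% active rate for tools and platforms).
active = projects["status"].eq("active")
print(f"Still active overall: {active.mean():.0%}")
print(projects.assign(active=active).groupby("type")["active"].mean().sort_values())

# Geographic concentration of projects (e.g. US dominance).
print(projects["country"].value_counts().head())
```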

Platforms

I consistently got the impression that for civil society groups working on disinformation, working with TikTok today is like working with Facebook in 2007: it has all the scale and momentum but insufficient tech policy experience, or at the very least offers limited access.

Many of the groups interviewed receive funding from big tech platforms, often through International Fact-Checking Network membership. While this doesn’t prevent the groups from critiquing the tech platforms, it likely does have a dampening effect on the volume of their critique. 

For their part, the civil society organizations are grateful to have a dialogue with the tech giants but unconvinced that their input and feedback amounts to much internally. In particular, they are well aware that they're usually conferring with professional civil society relationship managers and policy people, not the actual product or integrity teams, and there is a strong appetite for direct conversations with those teams. I had repeated opportunities to promote NDI's Design 4 Democracy Initiative, and it was consistently well received as a good idea.

Influencers play a meaningful role in misinformation management, with both positive and negative impacts. The term ‘disinfluencer’ has been coined to describe people who leverage the influencer economy, including its professional management utilities, to scale up their work. They aren't always motivated by money, although that's a common driver.

At the same time, civil society groups and researchers have identified that locally-relevant community leaders are one of the most effective ways to promote sound information and limit the spread of harmful misinformation. Political parties, like the Democrats in the US, have an established practice of coordinating sympathetic influencers during key moments like debates and election days. The lack of transparency surrounding the influencer economy only adds to uncertainty with regard to understanding their impact on disinformation.

The specter of AI

Despite the marginal benefits of semi-automating the fact-checking workflow with AI utilities, civil society groups have made it clear that debunking a falsehood takes far more energy and time than creating it in the first place. AI seems far more likely to exacerbate this dynamic than improve it. As completely invented audio, video, and imagery become available to anyone to deploy for free, it’s likely only a matter of time before forgeries flood our communication channels. In this environment, battling over each individual piece of media is unlikely to be an effective strategy. Engagement-based approaches like investing in civic and media literacy, actively deploying community leaders, and supporting strong journalism will be more important than ever.

Acknowledgments

This work was made possible by the time, insights, experiences, participation, contributions, and expertise of many people and organizations.
