
Project Highlights & The Network

Battling Disinformation with Code for Africa

Words by Matt Stempeck • Feb 28 2023

As part of Matt Stempeck’s research into the ongoing work across the Code for All network to confront disinformation, we are sharing the most exciting takeaways from those exchanges in the Civic Tech & Disinformation Project Series.

These interviews were conducted in mid-2022. Please note that the information may change as the organization deems appropriate.

Get to know Code for Africa

Meet the incredible staff members who shared invaluable insights on how to combat disinformation, along with their experiences and best practices, which you can learn from and put into action.

Allan Cheboi is Senior Investigations Manager at Code for Africa. He was one of the first recruits to build Code for Africa’s Investigative Lab (iLab) team, a digital investigative unit supporting the African Network of Centres for Investigative Reporting (ANCIR).

Stacy Ewah is the Program Manager of New Initiatives at Code for Africa. She’s passionate about civic tech. Previously, Stacy managed the Civic Innovation Lab Abuja, which focused on govtech, social innovation startups, and the UN Sustainable Development Goals.

“Code for Africa is the continent’s largest network of digital democracy laboratories, with over 90 full-time data scientists, forensic researchers, technologists, and digital storytellers working in support of investigative media and watchdog [civil society organization] partners in 21 African countries. [Code for Africa] builds digital solutions that provide actionable information to citizens to encourage informed decisions, while also amplifying voices to strengthen civic engagement for improved public governance and evidence-driven accountability.”

Let’s delve into the challenges they’ve faced and the strategies they’ve employed to stay ahead of the curve. Get inspired and gain practical insights on how you can make a difference in the fight against disinformation.

Effective partnerships

A critical challenge for civic tech projects, and one of their greatest downfalls, is finding enough users. It’s a difficult task for any new product. But in civic tech, we rarely have the resources and marketing budgets to attract users the way for-profit products might. Advertising is rarely employed. Too often, well-intentioned techies focus on building a product or service meant to address a problem in society and don’t sufficiently prioritize the distribution and uptake side of the equation. The result is far, far less impact for these projects.

Some of the most successful civic tech efforts, as measured by the reach of their work, address this challenge through partner ecosystems. They establish strategic partnerships that immediately address the challenges of distribution and reach. 

Code for Africa has identified and surmounted the distribution challenge by partnering closely with existing media organizations across the continent as well as social media platforms. Many, if not all, of Code for Africa’s disinformation projects were formed, launched, and run in concert with partner organizations.

“One policy we go in with is the mentality that one organization is never enough to solve a problem. There is a proven model for coordinating disinformation work at the moment: Don’t do it yourself.”

Allan Cheboi

Newsroom partnerships

The institutions supporting the fight against misinformation in Africa.

In the case of disinformation, an effective partnership strategy might very well include journalists. Newsrooms across Africa already reach local audiences, almost by definition of what they do. Rather than try to replicate that relationship with a relatively small staff, Code for Africa works to train and instill disinformation research methodologies, including digital forensics, into the work habits of practicing journalists. 

As a pan-African organization, Code for Africa seeks to serve over 1.2 billion people. That means they will not always understand local nuances. By partnering with local newsrooms, the organization empowers journalists who already understand their local information environments to effectively report on disinformation. The iLab team does the digital research and identifies the problem, and newsrooms are interested in being part of the solution – sometimes even proactively reaching out to Code for Africa.

“We fund and run the new African Fact-Checking Alliance of some 240 newsrooms in 20 African countries, plus the African Digital Democracy Observatory of academic and think-tank research institutes, which all use a stack of civic tech tools we make available to partners.”

Justin Arenstein

Training models

Allan has found that longer-term mentorship opportunities, in the form of fellowships, create longer-lasting change than individual training. Code for Africa increasingly gives trainees the opportunity to apply the methodology, rather than listen to presentations about it. Their fellowship model resembles a sort of short-term modern apprenticeship. 

“When you show a tool to a newsroom, people forget about it in under two days,” says Allan. 

When people become research fellows with Code for Africa, they continuously apply the skills, knowledge, and resources they’re learning to develop investigative habits. They get to see the impact of the investigative research they’ve done, which fuels further interest in doing more. Over the course of a fellowship, they might embark on several different investigations. The practice becomes embedded in their minds, and they take that with them back to their newsroom and onward in their journalism careers after the fellowship ends. In this way, Code for Africa is able to leverage a relatively small staff (large by civic tech standards, but small relative to the population of Africa) to improve the information consumed by many, many more people than it could ever reach by going to news audiences directly on its own.

Package deals

One way Code for Africa sustains its healthy partner ecosystem is by funding itself through sub-granting. This may sound like a wild idea, given that almost every group we’ve spoken to has identified funding as their top need. But it is possible. Code for Africa develops projects together with partners and then approaches funders together, as a package deal.

“We work with our funders to give grants to newsrooms and researchers, two communities of practice that we’re developing. We like to identify ourselves as ecosystem builders,” says Stacy. “And to be able to do that, we understand the necessity for us to spearhead projects that would affect a lot of people.” In doing so, Code for Africa guides funders in aligning related groups together to limit duplication of efforts. But how do you align incentives for civil society groups that often must compete for limited resources from funders?

It can happen, according to Allan: “If civil society tells funders, ‘We can definitely work as a consortium; we don’t necessarily need to work in independent silos.’ For us to drive impact, we showed them that there is a need for different organizations to work together.”

This collaborative mindset needs to originate from civil society organizations. “We come together and we work together,” Allan says. “Even the proposals we are writing for funding, we need to be writing them together and say, ‘This is my strength. This is this other person’s strength.’ And then we tell the funder: if you fund all these organizations at once, then it becomes easier. Everyone is attributed at the end of the day, and they feel happy that there’s impact.”

Code for Africa took this approach most recently in a consortium of partners that aligned their efforts around the 2022 elections in Kenya.

Platform partners

Beyond newsroom partnerships, Code for Africa also partners with (and is funded by) social media platforms like Facebook parent company Meta. Code for Africa’s fact-checking team is the largest in Africa, spanning 14 countries. They produce some 2,000 fact-checks per year. By partnering with Meta, those couple thousand fact-checks contribute to some 5 million posts being labeled as harmful or misleading, on Facebook alone.

Facebook’s AI can also use fact-checks as a starting point to identify mutant versions of the same debunked images or text elsewhere on the platform, automating large amounts of the work.
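Near-duplicate matching of this kind is commonly built on perceptual hashing, which gives visually similar images similar fingerprints even after cropping or re-compression. The sketch below is purely illustrative and is not Meta’s actual pipeline; it assumes the third-party Pillow and imagehash libraries and hypothetical local file paths for already-debunked images.

```python
# Illustrative sketch of near-duplicate image matching via perceptual hashing.
# NOTE: this is NOT Meta's system; it only demonstrates the general technique.
# Assumes `pip install Pillow imagehash` and hypothetical local file paths.
from PIL import Image
import imagehash

# Hashes of images already debunked by fact-checkers (hypothetical paths).
debunked_hashes = {
    path: imagehash.phash(Image.open(path))
    for path in ["debunked/claim_001.jpg", "debunked/claim_002.jpg"]
}

def find_mutations(candidate_path, max_distance=8):
    """Return debunked images whose perceptual hash is close to the candidate.

    A small Hamming distance between hashes suggests the candidate is a
    cropped, resized, or lightly edited 'mutant' of a known debunked image.
    """
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    return [
        (path, candidate_hash - known_hash)  # subtraction gives Hamming distance
        for path, known_hash in debunked_hashes.items()
        if candidate_hash - known_hash <= max_distance
    ]
```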

Upgrading fact-checking

Code for Africa has revamped the fact-checking process from start to finish to get more done with limited resources. They’re building fact-check databases that many fact-checking organizations can reference before committing resources to debunk a claim that’s already been checked. They’re using AI and ML to detect related claims and enhance the debunking process. By feeding their system quality data, they can train it to learn and identify claims on its own.
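The article doesn’t specify which models Code for Africa uses, but one minimal way to check an incoming claim against a database of already-checked claims is simple text similarity. The sketch below assumes a TF-IDF plus cosine-similarity approach with placeholder claims; it is an illustration of the idea, not the organization’s actual system.

```python
# Minimal sketch: does an incoming claim resemble one already fact-checked?
# TF-IDF cosine similarity is an assumption for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical database of claims that have already been debunked.
checked_claims = [
    "Candidate X was born outside the country and cannot run for office.",
    "Drinking hot water cures the virus within 24 hours.",
]

def most_similar_checked_claim(new_claim, threshold=0.6):
    """Return the closest already-checked claim if similarity exceeds the threshold."""
    corpus = checked_claims + [new_claim]
    vectors = TfidfVectorizer().fit_transform(corpus)
    similarities = cosine_similarity(vectors[-1], vectors[:-1]).flatten()
    best = similarities.argmax()
    if similarities[best] >= threshold:
        return checked_claims[best], float(similarities[best])
    return None  # Claim looks new: worth routing to a human fact-checker.

print(most_similar_checked_claim("Hot water can cure the virus in one day"))
```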

Finding funders

Humanitarian organizations have emerged as an unexpected but entirely welcome funding source. They increasingly understand that disinformation is affecting their own missions. For example, disinformation around the Kenyan elections could lead to violence and disturbance of the peace, falling squarely under the United Nations Development Programme’s purview.

The International Committee of the Red Cross and Red Crescent has shown interest in training on disinformation as well, as disinfo leads to conflicts and can cause the very death and suffering they work tirelessly to prevent. In their line of work, disinformation is an early funnel condition.

Maintain editorial independence

Code for Africa seeks funding from anyone who wants to solve the problem of disinformation using data. That said, they include independence clauses in every contract with any given funder. This enables them to do the research they need to do, even if it doesn’t reflect kindly on a funder’s platform. Code for Africa avoids funders with obvious agendas, like certain governments.

“People should find funding with organizations where the scope of work aligns with what they do, regardless of where it is,” says Stacy. “We work with journalists, so we cannot have our funders suggesting they’re the editor or limiting editorial freedoms in any case, or in any capacity. It’s very important, as the people we are doing this for are the public.”

Disinfo forensics

Code for Africa has developed a robust research methodology that consistently gets to the bottom of coordinated disinfo campaigns. The iLab team itself doesn’t conduct fact-checking, but rather network analysis. They take an ever-growing corpus of fact checks and use them like dots on a map to ascertain broader patterns.

Using tools like CrowdTangle, Meltwater, PrimerAI, and CivicSignal, they can connect those dots to find the key narratives, drivers, and actors, and solve the problem holistically. Mapping the information ecosystem in this way helps Code for Africa’s researchers understand who may be compromised or influenced, and who controls the infrastructure.
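One simple way to picture this kind of network analysis, sketched below as an assumption rather than iLab’s documented methodology, is a co-share graph: accounts that push the same URL within a short time window get linked, and tight clusters in that graph become candidates for coordinated amplification that a human analyst then investigates.

```python
# Illustrative sketch of coordinated-amplification detection via a co-share graph.
# This is an assumption for illustration, not iLab's documented methodology.
from collections import defaultdict
from itertools import combinations
import networkx as nx

# Hypothetical posts: (account, shared_url, unix_timestamp)
posts = [
    ("acct_A", "http://example.com/story1", 1000),
    ("acct_B", "http://example.com/story1", 1030),
    ("acct_C", "http://example.com/story1", 1045),
    ("acct_D", "http://example.com/story2", 9000),
]

WINDOW = 120  # seconds: accounts sharing the same URL this close together get linked

shares_by_url = defaultdict(list)
for account, url, ts in posts:
    shares_by_url[url].append((account, ts))

graph = nx.Graph()
for url, shares in shares_by_url.items():
    for (a1, t1), (a2, t2) in combinations(shares, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW:
            # Edge weight counts how often two accounts co-amplify within the window.
            weight = graph.get_edge_data(a1, a2, {}).get("weight", 0)
            graph.add_edge(a1, a2, weight=weight + 1)

# Densely connected clusters are leads for further human investigation.
for cluster in nx.connected_components(graph):
    if len(cluster) >= 3:
        print("possible coordinated cluster:", sorted(cluster))
```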

As the work expands into leveraging machine learning and AI, it calls for additional skill sets. For example, Allan’s background as a forensic researcher complemented the team’s existing data science skills, and he was able to look for anomalies in the data. 

By establishing and refining a consistent methodology, partners can trust and verify how Code for Africa reaches its conclusions. Their research has been used by global players like Digital Public Square (DPS), Global Disinformation Index (GDI), Institute for Strategic Dialogue (ISD), and the Reuters Institute at Oxford University, as well as by partners such as Equal Access International (EAI), Climate Action Against Disinformation (CAAD), Deutsche Welle Academy (DWA), the United Nations Development Programme (UNDP), the United States Agency for International Development (USAID), the University of Washington (UW), the US State Department, and the World Resources Institute (WRI).

Package outputs for target audiences

Allan’s team takes a ‘work once, communicate often’ approach to the work. The packaging of the analysis is often as important as the findings themselves. The core research methodology, unpacking key narratives, tactics, and actors behind disinfo trends, speaks to policymakers, humanitarian organizations, and funders, as it allows them to take action based on the narratives unearthed. 

Tech platforms like Meta, meanwhile, prioritize individual cases over narratives. They want to know about specific platform-based examples of disinformation. The same iLab digital forensics research linking narratives, tactics, and cases allows Meta to act on specific cases.

In the case of government partners, they may be most interested in cases: who to punish for violating laws around disinformation. 

So Code for Africa tailors their information based on the audience. It’s the same amount of upfront work doing the research, but it will go so much further with target audiences if civil society organizations take the time to package it for the recipients. 

The most powerful form of ‘packaging information’ may well be storytelling. Code for Africa’s data analysis must be converted into narratives that matter to people.

“One thing many people forget is that you’re not speaking to data scientists. Most of the reports I read out there focus more on, ‘How did I achieve these results?’, which is not the important element of a report at the end of the day. You’re reporting something because you want to make someone take a particular action.

And that is where storytelling comes in very, very handy. And it is deliberate. I have on my team people we call Insights Managers, or Insights Editors. They come from an editorial, journalistic background. They look at the report that someone like me, Allan, has written from a very cyber, tech background. They read it and they are like, ‘Haha! Okay, the normal citizen will not even understand it. A policymaker at a government institution somewhere or at a humanitarian organization will not understand what you’ve written here.’

That is the skill set we need to convert this very complex data into something that many people will actually resonate with. Storytelling is a very big part of Code for Africa specifically, and it enables us to convert that complex analysis into something that people and decision-makers can actually consume.”

Connect to issues people care about 

Civic tech organizations’ work confronting digital disinformation must be clearly linked to specific issues that people truly care about. While many civic organizations lose their audience with an overemphasis on abstract ideals of civic engagement, Code for Africa intentionally covers immediately pressing issues like climate change, land ownership, and religious extremism. 

You’re focusing on things that matter to the society because at the end of the day, you’re not doing this work because you just want to do it, because you just want to create a particular resource or output. No, no, no. You’re doing it because the public needs it. What is the problem for the people? And then how do we address that problem using the skills that we have internally at Code for Africa?

Allan Cheboi

The nature of disinfo campaigns

As they work to serve all 54 countries on the continent, Code for Africa has noticed that disinformation is becoming a regional and even global industry. So they’ve had to take a continent-wide approach, as well. The investigation they’ve done in West Africa, for example, demonstrated coordinated networks of accounts pushing foreign influence operations across the region. They saw several accounts in Cote d’Ivoire, Mali, Central African Republic, and Nigeria, all pushing the same message.

Disinfo traverses political boundaries, languages, and dialects

According to Code for Africa founder, Justin Arenstein, their IO forensic investigation team uses open source intelligence (OSINT) tools for its network analysis to detect coordinated inauthentic behavior or other coordinated amplification and to monitor for hate speech and other toxic content. The team works in three international languages (Arabic, English, and French), plus a range of African languages across 20 African countries.

Their analysis is used by media and civil society watchdogs for investigative data journalism, as well as by a range of UN agencies, embassies, and development organizations for early warning in conflict countries. 

Addressing regional disinfo campaigns in this way requires having feet on the ground, or local employees, across the region. Right now, Code for Africa has feet on the ground in about 23 African countries, almost half of Africa. This staff speaks the local language(s) and dialects, enabling better monitoring in those languages.

Code for Africa relies on its distributed team to achieve linguistic diversity across at least six different languages. In-house linguistic diversity also helps the team identify lexicons and keywords for their technical disinformation research.

We employ locals in the country; we always have in-country staff. They’re able to understand the languages locally, are able to interpret the languages, and are also able to produce content in those languages for easy dissemination. That is one of our major strategies.

Stacy Ewah

Those local employees are also deeply interested in what’s happening locally, and give a critical perspective to the rest of the team based elsewhere. Together, the team can collaborate across the continent to piece together the key narratives they’re tracking that would be difficult or impossible to detect from a single location.

These lexicons, or collections of watchwords, can be used to understand and neutralize coordinated online operations. In one of Stacy’s favorite projects, Harassment Manager, “we’re looking out for trigger words indicating harassment, especially of female journalists online,” she says. “We’re helping create the tool and methodology around that. I’m quite pro-feminist and I find the harassment of female journalists online quite infuriating, for lack of a better word.”
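Applying such a lexicon can be mechanically simple. The snippet below is a minimal, hypothetical sketch of flagging posts that contain watchwords from a per-language lexicon; the terms, language codes, and function are placeholders, and this is not the actual Harassment Manager implementation.

```python
# Minimal, hypothetical sketch of lexicon-based monitoring across languages.
# Not the actual Harassment Manager implementation; terms and languages are placeholders.
import re

# Per-language lexicons of watchwords curated by in-country staff (placeholder terms).
LEXICONS = {
    "en": {"witch", "go back home"},
    "sw": {"mchawi"},  # Swahili placeholder term
}

def flag_post(text, language):
    """Return the lexicon terms found in a post, using word-boundary matching."""
    hits = []
    for term in LEXICONS.get(language, set()):
        if re.search(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            hits.append(term)
    return hits

print(flag_post("They called her a witch again", "en"))  # ['witch']
```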

Biggest challenges in confronting disinfo

  1. Funding is always important. It used to be more difficult to pitch these projects, according to Allan, but more recently, funders come looking for it. Code for Africa strategically shows funders the evidence of the disinfo that’s being propagated first, which prompts them to ask what can be done. The team has more resources and a better methodology than they did four years ago when they started out, and that’s in large part thanks to their early funders, whose belief in the organization was absolutely critical to laying the foundation for the work they’re doing now.
  2. Data access for research is a major concern that everyone fighting digital disinfo struggles with. How do we get TikTok to provide access to their data? It’s one of the major platforms in Kenya, for example, and we have no visibility into how the platform’s being used. How do we tell YouTube to give researchers access to data?
  3. The big tech platforms continuously roll out new features, and these are obviously used to affect the public discourse. But the platforms may not adequately monitor the new modes of communication they roll out for disinfo abuse or even provide civil society organizations sufficient access for them to be able to help. The global scale at which these platforms operate demands closer attention.
  4. Convenings of the relevant actors are necessary. “We need to bring funders, tech platforms, and humanitarian organizations like the UN together and tell them what the disinfo problems and solutions are,” Allan says. And when the tech platforms are invited to the table, the relevant teams must attend for a meaningful conversation to follow. The companies usually send their public policy and public relations people, but it’s the product and integrity and user safety teams that civil society needs to be able to speak to.

    [Author’s note: This is the goal of NDI’s Design 4 Democracy Coalition].

    Stacy adds that the ability to establish more partnerships and communities of practice would drive their work’s impact.
  5. The digital advertising giants need to step up to a greater degree because advertisers’ budgets fund so much disinfo. Many disinfo actors create blogs because they want to earn money from Google AdSense, for example.

Free idea: Write non-technical case study documentation for each disinfo tool you launch

ADDO launched a tool, TrollTracker, and the accompanying blog post is a great overview providing context into why someone would want to use it. The team felt that too often, tools are made available, but it’s not clear how or why one would actually use them.

Just making a tool open source isn’t enough; you need to explain what it can do. Once you’ve done that, a simple web search will help people find your documentation post, which will drive re-usage. People can also reach out via the post and suggest new applications for the tool.

Interested in learning more?

Matt Stempeck, the National Democratic Institute, and Code for All worked on a research project to understand how civic tech can help confront disinformation. The project’s goal is to learn from (and share out!) lessons learned from organizations that focus on this area. Check out the Disinformation Research Project to learn more and help us spread the word!



Matt Stempeck

Disinformation Research Consultant

Matt Stempeck is a freelance technologist based in Lisbon and Berlin, where he researches civic tech and builds engagement products for clients like the Boston Globe. He serves on mySociety's The Impacts of Civic Tech Steering Group, and recently authored People Powered's Guide to Digital Participation Platforms. Matt is a Technologist in Residence at Cornell University, where he's helped launch Cornell Tech's Public Interest Tech program.
