Using what we know: How to ensure tech projects meet the brief

Tom Walker

This post was originally published on the Making All Voices Count blog.

Over the last four years, Making All Voices Count (MAVC) has published about 70 research reports, practice papers and journal articles, investigating tech-enabled projects that aim to amplify citizens’ voices and encourage government to respond to them.

They cover a huge range of organisation types and technologies, from broad assessments of 38 organisations’ tech selection processes to in-depth accounts of how one Ghanaian organisation implemented an interactive voice response survey.

This research holds huge potential for practitioners working in this space. So, as the programme draws to a close, my colleagues at The Engine Room and I have been reading through these research pieces and looking for common threads.

Making tech effective – building on what we already know

One thread stands out: the way that organisations choose and use technology, and how this affects their projects. The overall picture painted by the research is clear: many organisations are struggling to use technology in a way that they feel makes their projects more effective.

This may come as no surprise. (Researchers have been saying similar things for some time now.) The factors behind it are not entirely new, either: earlier research has repeatedly cited the need for a stronger understanding of what users need and deeper collaboration with a wider range of the groups that projects want to influence.

But, taken together, Making All Voices Count’s research outputs add considerable weight to the evidence behind these messages – and should make them even harder to ignore. What’s more, they point to practical steps that projects are already taking to address these challenges.

Thoughts on when tech works, from the horse’s mouth

Many of the research pieces concentrate on asking organisations how they felt their project had gone, and what role tech played.

This is significant partly because it’s relatively rare. In a sector that’s often accused of trying to run before it can walk, projects don’t often get the chance to step back and reflect – especially in a way that researchers can compare across multiple projects. Making All Voices Count actively encouraged this reflection throughout the programme, and will carry on doing so at its final learning event this week.

But this research method was also interesting for another reason: it may have encouraged organisations to be candid.

Researchers often asked organisations to describe their project’s progress from beginning to end, or think back to a specific moment that was important for their project. Many participants were given anonymity, or the space to explain their project’s context in detail, and as a result, had more freedom to describe how they had really used tech.

Many responded by being, as one report put it, “unexpectedly open and often self-critical” about their own work. As such, the research frequently gives valuable insights into the unvarnished reality of designing and implementing technology in transparency and accountability initiatives.

Many organisations were disillusioned with the contribution technology had made to their projects

As we read, we repeatedly encountered one overriding sentiment: disappointment. All too often, organisations told researchers that they were dissatisfied with what tech had been able to do for them.

For example, of the seven MAVC-funded projects that used information and communication technologies (ICTs) to promote accountability in health systems, four experienced “disconnects between their expectations of what the technology could do, what it actually did, and the implications for accountability.” Meanwhile, in Kenya and South Africa, more than 75% of the 38 transparency and accountability initiatives interviewed said that they weren’t happy with the technology tools they’d chosen. Many had already moved on to a new tool by the time the researchers spoke to them.

In several cases, researchers linked this to organisations’ limited knowledge of the people they hoped would use the tools. Only 15 of the 38 Kenyan and South African initiatives did any user research before choosing a technology tool. More than half of the total then said that their tool had not been used in the way they had hoped.

Elsewhere, it was attributed to more technical issues. In Kenya, 59% of the 24 projects interviewed by researchers said that a lack of technical knowledge was a barrier for their projects: a finding echoed by Hrynick and Waldman’s mHealth research.

But we now have more evidence on what’s helping organisations make better decisions

When user research did take place, in-depth practice papers showed the profound difference that it made to projects. For example, the South African organisation Yowzit carried out user research and discovered that its users would only find its citizen participation platform convenient if they spoke English, were digitally literate and already used similar ratings platforms. This helped them to target their work with a revised, tighter focus.

Researchers found that although organisations had to invest time up-front in doing thorough user research, it could actually save them effort in the long run. In South Africa, for example, the Foundation for Professional Development (FPD)’s user research showed them that they needed to build an app centred on clients’ needs, rather than the more ambitious programme they had planned. Others found effective ways of combining offline and online content, as with Map Kibera’s community engagement work.

The research re-emphasised that projects that use tech depend on strong relationships as well as well-designed technology products. For example, as Tiago Peixoto and Jonathan Fox found, projects using technology to increase government responsiveness are more likely to succeed when governments already want to get (and act on) feedback from citizens. Meanwhile, the multi-country assessment of mHealth initiatives suggested that projects were more likely to report success when they had strong, long-standing connections with the people or government actors they were working with.

Changing course during a project: the toughest challenge of all

Finally, the research indicated that organisations need to be able to do more than this. They also need to research all three aspects required for a successful tech project: the accountability problem they are trying to solve; the people they want to reach; and the tech options available. Then, they need to put what they learn into practice by adapting to what they find out during the process. In Kenya and South Africa, those of the 38 organisations that trialled their tools with their users were by far the most likely to be happy with their eventual choice.

This challenge proved to be beyond many of the organisations that took part in the research. Looking back, 79% of the 24 Kenyan organisations interviewed said that “adapting to context mattered a lot for the project,” but were unable to adapt fully themselves. For projects to be genuinely adaptive, they need to be supported with flexible funding and tools for transparent communication with funders. The FPD and Yowzit practice papers provide examples of this approach in action, and we’re looking forward to discussing what this means.

What’s next?

The research that’s come out of this four-year programme is a valuable contribution to our understanding of how tech projects working in transparency and accountability can be most effective. The chance to step back and critically assess your own work doesn’t come around that often, and we’re grateful to Making All Voices Count for providing many such opportunities.

If the research findings are taken as practical lessons that influence future work by practitioners and civil society more broadly, future projects will have a better chance of success. The next challenges are for practitioners to understand which findings could influence the way they design their next project, and for funders and intermediaries to keep supporting the kind of networks that allow these findings to spread widely.

