Science-based evidence – the lifeblood of the humanitarian ecosystem
Consider how the Covid-19 pandemic, now nearing an 18-month cycle, has affected all aspects of our lives and work. It has also made clear how important data, statistics and scientific evidence have become, not only as part of our lived and personal experiences, but in how we make decisions based on what we see and learn from the data presented to us.
Now, more than ever, governments around the world are grappling with tough decisions about distributing resources and providing help. Where should protective equipment be sent? Who should receive emergency assistance? How should these resources be deployed, and for how long? How do we ensure sufficient coverage? How do we distribute vaccines? What risks are involved, and how do we ensure cost-effective, efficient and timely delivery?
As people fall ill, lose their jobs, businesses close, demands for social assistance increase and tax revenues decline, responding quickly and effectively becomes literally a question of life and death. The decisions made by governments, aid agencies, emergency personnel and social purpose organizations about who should get aid and in what form depend on the kind of information they have. For example, deciding whether to provide cash, vouchers or in-kind assistance involves a trade-off between letting people determine for themselves how to use assistance to meet their needs and the desire to encourage certain types of spending, such as on food. Two factors, however, should always enter these critical decisions about social and humanitarian assistance: data and evidence.
It is our conviction that the next generation of humanitarian and social support and protection ecosystems will be built on data and evidence. Yet while our ability to collect immense amounts of data, conduct impressive analysis and share it all over the world has radically improved, the social sector and its stakeholders have not yet come to understand the importance of using data to make both funding and service delivery decisions.
Data in silos
Every day, immense quantities of field data are collected and stored by NGOs, philanthropists, development finance institutions, grantmakers and social investors all around the world. Their data is used for needs assessments, monitoring and evaluation, and informing the design of future interventions. While the means of gathering data vary widely (paper questionnaires, spreadsheets and online surveys), the end destination is often the same: data silos and on-site servers.
This need not be the final resting place for data: hard-earned data can be shared, extending its lifetime and utility to support future projects or research groups that would otherwise never have had the opportunity or means to collect it themselves. Sadly, 85% of research data is never reused, and international development programs are no exception.
Missing data
With just 10 years left to achieve the Sustainable Development Goals, more than 50 indicators remain undefined, and the data needed to track progress against these goals is missing. According to experts, it is the basics of data collection that are holding up the “revolution” for many nations that need better data for decision-making, including agreement on data standards, building government structures that use data effectively, and funding.
Doing Development Better
Relevant and high-quality evaluation is an important tool to track the results, effectiveness and impact of development programs. Research and evaluation can help explain why programs are succeeding or failing and can provide recommendations for how best to adapt to improve performance. Along with monitoring, evaluation contributes evidence to improve strategic planning, project design and resource decisions, and evaluations are part of a greater body of knowledge and learning.
Evaluation is not a silver bullet, but without it, managers may not have sufficient evidence to understand potential reasons why a program is exceeding, meeting or falling short of performance expectations. Without independent and transparent evaluation, stakeholders may lose confidence in a program’s ability to achieve results. And without relevant evaluations, project designers and strategic planners may lack the necessary information to inform what works best for future interventions.
Evidence is a contested field, with differing opinions on what should be most valued or deemed most relevant to decision makers. In our opinion, research that is underpinned by scientific notions of proof, validity and reliability, and that minimizes bias, has the advantage of rigour, relevance and independence.
A post-Covid world
If there was one phrase used during the pandemic to make sense of it all, from travel bans to liquor and tobacco bans to repeated lockdowns and school closures, it was ‘science-based evidence’. For the humanitarian sector to be more effective in a post-Covid world, we will have to adopt the same science-based approach in our own work. This will require:
- Building the capacity of internal staff: To ensure high-quality data that can inform better practices, we will have to equip our teams not only with the skills to collect data and apply different methodologies for evaluations and impact assessments, but, more importantly, with the competencies to analyze and interpret disparate pieces of data.
- Expanding our tools and partnerships for evaluation and learning: To know whether interventions are achieving higher-level outcomes, we will have to encourage project/program and portfolio evaluations at every opportunity.
- Collaborating and entering partnerships: We will have to explore partnerships and collaborations to consider evidence gaps, map spending and development patterns, and conduct systematic reviews in key sectors such as health, education, community development and employment. Systematic reviews can help us understand the available research on a specific question, portfolio or development challenge, providing the best possible evidence of what is known.
- Assessing and improving the quality of our research, data and evidence efforts: We will need a robust and transparent approach to selecting and generating high-quality evidence for ourselves and our stakeholders. This might include defining what constitutes good enough data, applying standards to our work, mapping existing and considering new sources of evidence, and synthesizing useful evidence. Most importantly, we must not only use data and evidence but share our insights.
- Focusing on impact: Making a commitment to learn from our activities, including successes and failures, so that we can increase our effectiveness in achieving our humanitarian objectives.
Next Generation Consultants helps social and impact investors get the most from their impact data. We develop impact strategies and impact management and measurement frameworks, and we verify and analyze impact data. Because we understand how important impact data is, we work with our clients to set up processes that incorporate the lessons learnt from impact assessments. For evidence of our work, please visit our website, www.investmentimpactindex.org, or contact us at info@investmentimpactindex.org.