Harvard ALI Social Impact Review

Elevating Qualitative Data in Impact Performance Reporting

An impact investor reports that they created 1,000 new jobs. But what does this number really mean?

The impact investing industry has made great strides when it comes to quantification and standardization of impact data. Now it’s time for the qualitative side to catch up.

As part of a recent research project to understand best practices for impact performance reporting, we interviewed 57 diverse practitioners, with a focus on investment decision-makers. We asked our interviewees a number of questions about the state of impact reporting, including what impact investors are getting wrong.

Each response echoed the last: numbers alone do not provide the information we need to understand a fund’s impact results. Frustrated with the industry’s fixation on quantitative key performance indicators (KPIs), these practitioners were hungry for more nuanced reporting that focuses on strategy, context, and learning.

“I often find myself reading lots of numbers in impact reports that really don’t tell you much. Thus, I ask myself, so what?” responded Gianluca Gaggiotti, Research Manager at EVPA. “Without contextualizing, without elaborating on those quantitative measurements, I cannot infer anything about the quality of the impact that has been achieved.”

“I think that in the world of impact measurement, there is way too much focus on precision,” said Dr. Gillian Marcelle, CEO and Founder of Resilience Capital Ventures. “And so people imagine that they can come up with a quantitative metric that precisely measures dimensions of social impact. I mean, that's not particularly useful. But if you are not familiar with or engaged deeply with the epistemics involved in influencing societal change, then I suppose that seems like a valuable contribution.”

It’s no secret that qualitative data is critical to understanding impact, which is inherently multidimensional, context-dependent, and dynamic. But as an industry, we’ve undervalued qualitative information, dismissing it as feel-good stories rather than as a critical ingredient for making sense of KPIs and driving learning and improvement. To strike a better balance between quantitative and qualitative information, we need to elevate qualitative data to the level it deserves.

The limitations of a metric-centric approach

What happens when we get the balance wrong?

Let’s take “number of jobs created”: a metric cited in countless fund-level impact reports. This metric is understandably popular among investors: it applies across a wide variety of sectors, it can (relatively easily) be aggregated and compared, and it is a helpful shorthand for both scale and economic impact. As a result, a great deal of work has gone into making this metric precise, calculable, and comparable (see: IRIS+ and the ESG Convergence Project). According to fund managers we interviewed, a great deal of work also continues to go into sourcing the data and generating these numbers for reports.

Here’s where relying too heavily on this metric goes awry. First, context matters: 1,000 manufacturing jobs in Detroit are not the same as 1,000 agricultural jobs in Guatemala, but a metrics-first approach assumes that these two results are equivalent. Second, the number of jobs says nothing about whether the jobs are stable, safe, or making a meaningful difference in the job holder’s livelihood. Third, job creation may not actually be relevant to the company’s impact thesis: the company’s impact strategy may be several steps removed from creating jobs, or may not even prioritize job creation at all.

Without context, a discernible link to the company or fund’s impact thesis, or a window into the job holder’s actual experience, this metric becomes largely meaningless. And yet, it continues to be used prolifically because of its advantages of consistency and comparability. This raises the question: what has comparability achieved? What decisions has it enabled? In other words, is all the effort poured into reporting on this metric making us any better at understanding our impact?

The right kind of qualitative data

Integrating qualitative data in a thoughtful, systematic way has the potential to fill this gap. Our interviews suggested three key areas where qualitative data could add the most value to an impact performance report (a simple sketch follows the list):

  • Context for metrics: Qualitative information, such as details on geography, sector, population, or issue area, can help a report reader better interpret a metric. This information helps a reader distinguish between, say, a solar company that reaches 1,000 customers in suburban Massachusetts, helping them transition to a cleaner and more efficient energy source, and one that reaches 1,000 customers in rural Mali, electrifying communities without previous access.

  • Stories that illustrate – or challenge – the impact thesis: Qualitative data can clarify the specific role that an investor’s capital is playing, as well as how the provision of that capital and other value creation activities have led to outcomes for the company’s stakeholders. For example, a reader could learn how an investment helped a solar company expand its production and distribution capacity, which allowed it to expand into a new community, which enabled rural businesses to electrify their operations. Critically, this could also illustrate cases where the impact thesis does not play out as planned. Ideally, these stories come directly from stakeholders, offering a feedback loop and accountability mechanism to allow companies and funds to respond and change course.

  • Rationales, risks, and reflections: Finally, qualitative information can offer insight into a fund’s decision-making processes, including the trade-offs weighed when making the investment decision. For example, a fund may share its rationale for deciding to invest in a given company, citing factors such as the company’s unique approach to addressing a problem and the mission-orientation of the leadership team alongside quantitative projections or scores. It may also disclose risks to the investment’s ability to deliver on its expected impact goals. Finally, it can share reflections, lessons, and evolving thinking around its impact thesis and contribution, challenging itself and others to continue improving their impact.
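
To make these three elements concrete, here is a rough, purely illustrative sketch in Python of how a reported metric might be packaged together with qualitative context, stakeholder voices, and the fund’s own rationale and risk notes. The structure and field names below are hypothetical and are not drawn from IRIS+ or any other standard.

```python
# Illustrative only: a hypothetical container pairing a quantitative KPI
# with the qualitative elements described above. Field names are invented.
from dataclasses import dataclass, field


@dataclass
class ReportedMetric:
    name: str                                   # e.g. "customers reached"
    value: float
    unit: str
    context: dict = field(default_factory=dict)               # geography, sector, population, issue area
    stakeholder_stories: list = field(default_factory=list)   # voices that illustrate or challenge the thesis
    rationale: str = ""                         # why the fund made the investment
    risks: list = field(default_factory=list)   # risks to delivering the expected impact


solar_mali = ReportedMetric(
    name="customers reached",
    value=1000,
    unit="households",
    context={"geography": "rural Mali", "issue": "first-time electrification"},
    stakeholder_stories=["The new connection lets me keep my shop open after dark."],
    rationale="Capital funded expanded distribution into communities without grid access.",
    risks=["After-sales service capacity may not keep pace with expansion."],
)
```

Packaged this way, the figure of 1,000 households arrives with the context, the story, and the rationale a reader needs, rather than standing alone.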

Tensions and gaps

Our interviews indicated that many market actors want more complex and nuanced reporting. But this desire is at odds with another need: impact reports that are approachable and digestible for readers.

To date, the impact investing field has equated “digestible” with “quantified,” imitating the conciseness of a financial statement. “I think impact investing operates largely on the template of commercial ‘non-impact’ investing,” argued Michael Brown, Head of Research at the Wharton Social Impact Initiative. “And I think that because of that template, there's a lot of value placed in standardization.” Many report preparers fear that qualitative information will mire their readers in details that distract from the impact “bottom line.” As a result, qualitative information can be perceived as overkill or dismissed as fluff. This perception, of course, perpetuates itself: the more the industry perceives qualitative data as “soft,” the fewer resources it will invest in gathering, communicating, and interpreting it.

This stands in stark contrast to the social science approaches that dominate in the public and non-profit impact sectors (most notably program evaluation, impact measurement’s older cousin). These disciplines advocate for mixed methods, integrating quantitative and qualitative data to create a more complete understanding. But while impact investing draws upon a handful of these methods, it has largely assimilated to match the style, processes, and preferences of the financial sector.

In doing so, impact reporting has sacrificed the hard-won lesson that context matters. By trying to boil everything down to numbers, and by focusing our decision-making on metrics and scores alone, we lose the nuance that would ultimately make us better impact investors.

Resetting the balance

How do we reset the balance? Is it possible to embrace qualitative data, but package it in ways that are digestible and usable? And if we integrate more nuance and context, do we have to give up on comparability?

A few initiatives have begun to help investors strike the balance. One example is the Impact Management Project’s five dimensions of impact: this model emphasizes that impact is complex and multidimensional, while codifying these dimensions in a standard way. Another is Lean Data, the methodology developed at Acumen and now championed by the team at 60 Decibels. This method collects open-ended qualitative data that, over time and at scale, can evolve into closed-ended KPIs and feed into sector benchmarks. There are precedents on the finance side, too: the IFRS Conceptual Framework for Financial Reporting describes the role and characteristics of qualitative information that is useful for investors’ decision-making, and the Task Force on Climate-related Financial Disclosures (TCFD) principles are “informed by qualitative and quantitative characteristics of financial information.”

But more groundwork is needed. For qualitative information to be fully embraced, the field needs to:

  1. Define the types of qualitative information that should appear in impact performance reports: We’ve taken a first crack at this in Raising the Bar: Aligning on the Key Elements of Impact Reporting, offering five categories of information that constitute a complete impact performance report. These categories can serve as a way to organize and codify qualitative information, making this data more manageable and predictable for report consumers to process. In the report, we call on standard setters to champion reporting guidelines that include robust qualitative data, and we urge institutional investors to request fund-level reports that meet this bar.

  2. Build capacity for impact investing professionals to level up their qualitative data use: Qualitative data collection and analysis is a skill set unto itself, with established methodologies and practices in regular use across many disciplines. The field can level up its qualitative chops by developing training programs, case studies, and other learning forums that teach impact investing professionals to conduct interviews and focus groups, code and organize open-ended data, identify patterns and themes, and present findings in a digestible way (a simple illustration of this kind of coding follows the list).

  3. Call on practitioners as powerful translators: Many professionals who bridge the worlds of social science and finance can act as resources and advisors. Impact investors should look for opportunities to collaborate with evaluators and other applied social scientists (e.g. behavioral economists) who are motivated to use and adapt their tools to support the goals of the impact finance sector. Such collaborations have the potential to accelerate learning and uptake of qualitative data analysis practices for the field more broadly.
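
As one small illustration of the second point, the sketch below codes a handful of made-up open-ended responses against a simple theme codebook and tallies the results. The keyword matching is only a stand-in for the judgment a trained analyst (or a purpose-built qualitative analysis tool) would apply; the responses, themes, and keywords are invented for illustration.

```python
# Illustrative only: coding open-ended responses against a hypothetical
# codebook of themes, then rolling the results up into a digestible summary.
from collections import Counter

responses = [
    "The new solar connection lets my shop stay open after dark.",
    "Installation was quick, but repairs take weeks.",
    "My children can study in the evening now.",
]

# Hypothetical codebook: theme -> keywords that suggest it
codebook = {
    "livelihood": ["shop", "income", "business"],
    "education": ["study", "school", "children"],
    "service quality": ["repair", "installation", "support"],
}

theme_counts = Counter()
for response in responses:
    text = response.lower()
    for theme, keywords in codebook.items():
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1  # count each theme at most once per response

# Summarize how often each theme appears across responses
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(responses)} responses")
```

In real practice the coding step is done by people reading each response; the point is simply that even light structure can turn open-ended feedback into something a report reader can scan alongside the numbers.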


About the Authors:

Sarah Gelfand is the Managing Director at BlueMark, an independent impact verification company. She was a founding Director at the Global Impact Investing Network (GIIN), where she led development of the industry’s leading system for measuring and managing impact (IRIS+).

Laura Budzyna is the founder and principal consultant of Beyond Measure, an evaluation consultancy specializing in adaptive impact measurement and management. Previously, she led the monitoring, evaluation and learning strategy at MIT D-Lab.