The Impact Edit: The Three Proof Points Every Impact Report Needs in 2026
Something interesting is happening in impact reporting right now, and it feels worth paying attention to. People aren’t necessarily asking more questions, but the questions they are asking have become a lot more precise. They’re less about broad commitments and more about what’s actually happening, what’s improving, and what evidence sits behind the claims being made. Investors want to understand material risks in a way that goes beyond intention. Tenants want clearer answers about what’s improving in the buildings they occupy. Communities want outcomes they can see and recognize. And internally, people trying to make responsible decisions with partial information are looking for data they don’t have to translate or decode.
The reports that are landing well in this moment aren’t the longest ones, the flashiest ones, or even the most technically polished ones. They’re the ones built on proof that holds up when someone looks a little closer, asks one more question, or wants to understand the connection between an input and an outcome.
Across the reports I’m working on and reviewing right now, three proof types keep showing up in the ones that are genuinely useful.
1. Proof of Relevance
Does this information help someone understand what actually matters?
A lot of reporting misses the mark not because the information is wrong, but because it’s disconnected from the decisions people are trying to make. It answers questions that nobody asked while skimming past the questions that actually shape risk, wellbeing, cost, or community stability.
Tracking volunteer hours when the underlying tension is displacement, counting “green events” when the building’s actual exposure is poor air quality, listing diversity goals when the real issue is where the money was spent - these aren’t bad metrics, they’re just mismatched to what matters.
Relevance looks different. It looks like noticing that tenants keep talking about ventilation, retrofitting the HVAC system, watching PM2.5 drop by thirty-seven percent, and updating operations so those improvements stick. It looks like seeing a groundwater risk zone expand from three percent to eleven percent of the property, adjusting the envelope design, and shifting $1.2 million to address it. It looks like mapping suppliers, realizing only four of fourteen critical vendors have verifiable labor data, and changing your screening process because that gap actually matters.
Relevance is simply the alignment between information and decisions. If a data point doesn’t influence a decision - or at least illuminate one - it probably doesn’t belong in the report.
2. Proof of Verifiability
Can someone check this if they want to?
This isn’t about defensiveness or preparing for scrutiny. It’s about clarity. A claim without a source lands like marketing. A claim with a visible evidence trail feels like something you can trust, even if the number itself is imperfect.
Verifiability shows up in small ways. You ran fourteen community sessions, thirty-two people attended on average, the input was coded into six themes, and those themes shaped the site plan. The dataset is available if someone wants to look. Eighty percent of your Tier 1 suppliers completed independent labor verification, and the documentation exists if needed. Baseline CO2 was 1,650 ppm, post-upgrade levels are around 820 ppm, and the monitors are visible in the lobby so people can see it for themselves.
This isn’t about over-explaining or creating more work. It’s simply making the logic visible. If someone wants to follow the trail, they should be able to.
3. Proof of Impact
What actually changed because of what you did?
This is where activity most often gets mistaken for impact, usually because it's easier to describe what was done than to measure what shifted.
Impact isn't the existence of a training program. It's sixty-four workers completing it, thirty-nine securing long-term employment within six months, and wages increasing by twenty-two percent compared to baseline.
Impact isn’t installing healthy materials. It’s switching to low-VOC adhesives, seeing VOC spikes drop by forty-one percent, and watching indoor complaints decrease.
Impact isn’t “supporting local businesses.” It’s allocating fifteen percent of construction spend within a ten-mile radius, three firms expanding to meet the demand, and twenty-seven new local jobs being created.
Impact requires a starting point, a result, and some clarity about who benefited and by how much. It doesn’t need to be dramatic. It just needs to be honest.
Why This Matters Now
It feels like we’re at a transition point, not because the reporting landscape is suddenly transforming, but because the conversation around impact is maturing.
People aren’t asking for more claims. They’re asking for clarity. They aren’t asking for flawless stories. They’re asking for real ones. And they don’t need more metrics. They need the right metrics.
Relevance. Verifiability. Impact. None of these are complicated, but they do require choosing what goes into a report with more intention than we’ve used in the past.
The organizations navigating this moment well are the ones treating impact reporting as a way to make their thinking visible. And that shift, even in small doses, feels like progress.
The Conversation Continues...
This post is part of our ongoing exploration into how the future of impact reporting depends on moving beyond broad claims to precision, verifiability, and evidence-based outcomes. As problem-solvers, we believe the best insights emerge when diverse perspectives meet. Have you encountered similar challenges or discovered different approaches? Share your story.
Connect with us as we continue to prototype, test, and learn:
Subscribe to our newsletter
Join us on LinkedIn
Explore our resources
We acknowledge that social sustainability is always a work in progress. These insights represent our current understanding, shaped by our partners, communities, and continuous learning.