The Grant Report Most Funders Are Quietly Tired of Reading
- Michaela Rawsthorn
- Apr 29
- 4 min read

Most nonprofit grant reports follow the same pattern. They open with a thank-you. They list the activities completed during the grant period. They cite output numbers—people served, sessions delivered, materials distributed. They include a short success story. They close with a sentence about the importance of continued partnership.
There is nothing wrong with any of those elements. The problem is that almost every grant report a program officer reads in a given month uses the same structure, in roughly the same language, with the same shape of evidence. After a while, the reports start to blur together—and the organizations behind them blur with them.
A grant report is a chance to say something a funder will remember about your work. Most reports waste it. Here is what to do differently.
1. Lead with what changed, not what you did
The default structure of a grant report — activities, outputs, story, thank-you — puts your most forgettable information first. By the time a program officer reaches the part of the report that explains what actually shifted because of the work, they have already skimmed past three paragraphs of activity descriptions.
Reverse the order. Open the report with a single, specific statement of what is different now from what was true at the start of the grant period. Not a metric. A claim. "Twelve months ago, our intake process was losing roughly a third of referred clients between the first call and the first appointment. As of this month, we are losing fewer than ten percent — and we now know which step of the process was the bottleneck." Then go into the activities, the outputs, and the story.
A funder who reads only the first paragraph of your report should still come away knowing the most important thing your organization wants them to know. Most reports are structured so that the first paragraph is the one that matters least.
2. Name a problem you ran into
The instinct in a grant report is to present the work as having gone smoothly. This is a mistake — not because funders punish honesty, but because the absence of any difficulty in the report makes the report less believable, not more.
Program officers know that real programs run into real problems. A staff member left mid-grant. A partner organization changed direction. A population you expected to reach turned out to be harder to engage than the proposal anticipated. Naming one of these honestly — and explaining what you learned or adjusted — does three things at once. It signals organizational maturity. It demonstrates that your evaluation is genuine and not performative. And it gives the funder a reason to trust the rest of the report.
A useful test: If your report could be mistaken for a marketing brochure, it is doing the wrong job. A grant report should read like a memo from a colleague who is being straight with you, not a pitch from a vendor.
3. Quote the work, not the press release
Most reports include a participant's story. Most participant stories are written in the same elevated, tidy language used in fundraising appeals — quotes that sound like they were edited to sound moving rather than recorded as someone actually spoke.
A program officer reading reports all month can tell the difference between a quote that has been polished into impact-speak and a quote that sounds like a real person. The polished version is forgettable. The real one stays.
When you include a story or a quote, leave in the specifics. The name of the program. A detail that could only belong to this person. A line that sounds like something a person would actually say to their case manager, not something a development officer would write on a brochure. Those specifics are what make a story usable to the funder later, when the program officer is making the case for renewal to their own board.
4. Connect outputs to outcomes—explicitly
Many reports list outputs and assume the funder will draw the line to outcomes on their own. They will not. Or, more precisely, they might — but they will draw the line less generously than you would.
If you served 1,200 people, do not stop at the number. Tell the funder what that number meant. Did those 1,200 people complete the program at a higher rate than last year? Did the population you reached change in ways that suggest your outreach is working? Did the cost per outcome shift? Did the people you served report something different at the end than they did at the start?
The work of connecting outputs to outcomes is the work the funder is paying you to do. A report that lists outputs without making that connection is asking the funder to do your interpretive work for you—and most program officers do not have the time, even when they have the inclination.
5. End with what you would do with another year
Most reports end with a thank-you and a vague gesture toward continued partnership. The thank-you is appropriate. The vague gesture is a missed opportunity.
A grant report is, among other things, a soft renewal conversation. The funder is reading it partly to decide whether to fund this work again or to fund the next version of it. Closing with specificity — what you learned that you did not know a year ago, what question that learning has surfaced, and what you would do with another year of support — gives the funder a reason to keep reading the next proposal.
This is not a pitch. It is a continuation of the conversation the report started. The best version reads as if you and the program officer have been thinking about the same problem together for the last twelve months—because, ideally, you have been.
The bottom line
A good grant report is not a longer or prettier version of a bad one. It is a different document entirely. It leads with change, not activity. It admits where the work was hard. It quotes the work honestly. It connects what was counted to what it meant. And it ends with the next question, not the last thank-you.
Funders are not asking for more reports. They are asking for fewer reports that are worth reading. The organizations that figure out the difference are the ones that get remembered when the next funding cycle opens.