In the Talking About Charities 2013 survey, Canadians continued to give charities low ratings on how well they report how donations are used, their fundraising costs, and the impact of their programs. Today’s donors want more information. Transparency and accountability matter more than ever.
Donor accountability goes hand-in-hand with financial transparency. To objectively gauge donor accountability, Charity Intelligence grades a charity’s social results reporting. This social results reporting grade has the highest weighting in our charity star rating, accounting for 40%.
The goal of this work is to help promote and encourage better reporting by charities on their social results. With better reporting, Charity Intelligence believes donors will be better informed.
Charity Intelligence reviews a charity’s publicly-posted information in its annual report, website and current newsletters and scores each charity on 26 questions related to its mission, strategy, activities, outputs, outcomes, and learning.
Working with charities on this project since 2012, we’ve listened to their feedback. Today there are 3 different scoring guides:
- General scoring guide for operating charities
- Scoring guide for hospital foundations and other fundraising organizations granting (mostly) to one operating charity, e.g., a hospital foundation fundraising to grant to one hospital
- Scoring guide for United Ways and community and other foundations
Additionally, for those writing a charity's annual report, this is a great resource: Queen’s University Voluntary Sector Reporting’s Best Practices Guide in Charity Annual Reporting.
After 3 years, a 12.4% improvement in charity reporting
Charity Intelligence is seeing improvements in charity reporting. In 2016, we measured a 12.4% improvement in grades over 2013.
In 2013, Charity Intelligence graded 453 charities on social results, covering a diverse range of sectors: from hospital foundations to food banks, from fundraising charities to homeless shelters. To see our 2013 results, click here.
References: Talking About Charities 2013 survey, Muttart Foundation.
Results reporting is a new area of charity analysis.
For those interested, here is Kate Ruff's discussion on charity analysis, with a brief history of financial accounting. Her work was also published in Stanford Social Innovation Review on why we need skilled analysts to improve social capital markets, "The Next Frontier in Social Impact Measurement Isn't Measurement at All"
To read Charity Intelligence's first report on social results, including how Canadian charities compare with British charities, please click below:
Social Results Reporting
Kate Ruff, Greg Thomson, and Zoe Young, March 2013
This report was created with the goal of improving the information available to donors, allowing them to make informed giving decisions. We hope that by drawing more attention to social results reporting, overall disclosure across the Canadian charitable sector will improve.
Charity Intelligence's 2013 Findings on Social Results Reporting
In 2013 Charity Intelligence (Ci) continued its work to help promote and encourage better reporting by charities on their social results. In 2012, Ci conducted research into charity reporting to understand better the information gaps reported by Canadian donors in how charities use donations, the impact of charities’ work, and information about the programs and services charities deliver. Following up on this research, during 2013, Ci scored 453 charities on social results, covering a diverse range of sectors, from hospital foundations and other fundraising charities to food banks and homeless shelters. Ci will soon be including results reporting of over 400 charities on our online charity profiles and charity ratings.
- Overall average score of 31% of information available, consistent with 2012 findings
- Charity size matters: the largest charities, with revenues over $20m, outperformed the average, while charities with revenues under $1m scored lower.
- Significant differences across sectors with leaders being universities and animal welfare and environmental charities. Lower-scoring sectors include sports & recreation, religion, intermediaries, and arts & culture.
- No correlation between results disclosure and administrative cost ratios. Curiously, our hunch that charities with higher administrative cost ratios would perform better did not hold.
What did Ci measure? Charity Intelligence scored charities on answers to 26 questions covering strategy, activities, outputs, outcomes, and learning. We modeled the scorecard on charity sector resources developed by the Canadian Institute of Chartered Accountants, New Philanthropy Capital in the UK, Global Reporting Initiative, and Queen’s University Centre for Governance. Charity Intelligence used a matrix of questions focusing on how well a charity reported:
1. The problem it addresses (Problem/Need)
2. The programs and services it provides to fix the need (Activities)
3. Quantifying its programs and services (Outputs)
4. The results it achieves (Outcomes)
5. The reliability and clarity of reporting (Quality)
6. The learning and changes made (Learning)
Ci scored charities in these six areas against seven indicators: timeliness, balance, consistency, clarity, reliability, forward-looking reporting, and accuracy. The final score is a measure of the charity's social results reporting. This scoring does not assess the strategy, the quality of activities, the level of outputs, or the impressiveness of outcomes. All it does is assess whether enough information has been disclosed, such that any reader would have the opportunity to make those assessments.
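As a rough illustration, a disclosure scorecard like the one described above amounts to summing points awarded per question against the points available. The six areas below match Ci's matrix, but the per-question weights, answers, and point scale are hypothetical, not Ci's actual 26-question, 260-point tool:

```python
# A minimal sketch of tallying a disclosure scorecard. The area names follow
# the six areas described above; the questions, answers, and 10-point scale
# per question are invented for illustration.

def score_charity(answers, max_points_per_question=10):
    """answers: dict mapping area -> list of points awarded per question."""
    earned = sum(sum(points) for points in answers.values())
    possible = sum(len(points) for points in answers.values()) * max_points_per_question
    return earned, possible, earned / possible

answers = {
    "problem":    [8, 6],        # how well the need is described
    "activities": [7, 5, 4],     # programs and services
    "outputs":    [3, 2],        # quantified delivery
    "outcomes":   [1, 0, 2],     # results achieved
    "quality":    [5],           # reliability and clarity
    "learning":   [0],           # changes made from results
}
earned, possible, pct = score_charity(answers)
print(f"{earned}/{possible} = {pct:.0%}")  # prints "43/120 = 36%"
```

The point of such a tally is that it measures disclosure, not performance: a charity reporting poor outcomes candidly can outscore one reporting nothing.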
The findings from our 2013 social results reporting work show there are significant information gaps, in line with what we saw in 2012 and in our ongoing charity analysis. While most charities are aware of the importance of measuring and reporting impact, this awareness has yet to translate into practice. The average score of the charities scored in 2013 was 81 out of 260 (31%), compared with 82 out of 240 (34%) in 2012. This minor change reflects the samples of charities scored, as fewer “mega-charities” were included in the 2013 sample.
It is important to emphasize that we have created a scoring system that is more like a long jump than a high jump: we have created a long pit and do not expect most charities to jump its full length.
In our reporting, Charity Intelligence will report individual charity scores as bell-curved letter grades (for example, B+ or A-) rather than percentage scores, as we do not want the average score of 31% to be viewed as a "failing grade".
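To make the bell-curving concrete, here is one way raw percentage scores could be converted to letter grades relative to the distribution. The z-score cutoffs and sample scores below are assumptions for illustration, not Ci's published grade boundaries:

```python
# Illustrative bell-curve grading: each charity is graded by how many standard
# deviations its score sits from the group mean, so an average score maps to a
# middle grade rather than a "failing" percentage. Cutoffs are hypothetical.
import statistics

def letter_grades(scores):
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    cutoffs = [(1.5, "A"), (0.5, "B"), (-0.5, "C"), (-1.5, "D")]
    grades = []
    for s in scores:
        z = (s - mean) / sd
        grades.append(next((g for cut, g in cutoffs if z >= cut), "F"))
    return grades

# Hypothetical raw disclosure scores for six charities
scores = [0.12, 0.25, 0.31, 0.31, 0.45, 0.70]
print(letter_grades(scores))
```

Note how a 31% raw score lands at a middle grade here because it sits near the group average, which is exactly the effect described above.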
In examining results reporting by sector, we found significant differences.
While there are limited sample sizes in some sectors, we have presented in our report the average scores by sector. Results point to the highest disclosure among universities, followed by animal welfare and environmental charities, while intermediaries and charities in the sports & recreation, religion, and arts & culture sectors scored significantly lower than average. We believe that these results speak to the nature of the different sectors rather than charities in these sectors.
No relationship between social results reporting and administrative costs
One comment that we have heard numerous times is that tracking and reporting this type of information is time-consuming and costly. If so, it could reasonably be assumed that charities with better reporting would have higher administrative cost ratios. However, we did not find any evidence to support this relationship: the overall correlation between results reporting score and administrative cost ratio was 0.02, i.e., no relationship between the two. With this, we urge charities concerned about the cost of tracking social results to think instead about the cost of not tracking social results – could they be achieving a greater impact today if they had better tracking?
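The correlation figure above is a standard Pearson correlation between two per-charity series. A minimal sketch of that check, with invented data (Ci's actual dataset covered 453 charities):

```python
# Pearson correlation between two series, computed from the textbook formula:
# covariance of the deviations divided by the product of their magnitudes.
import statistics

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-charity values for illustration only
reporting_score = [0.10, 0.55, 0.31, 0.42, 0.25]  # results-reporting score
admin_ratio     = [0.12, 0.08, 0.15, 0.10, 0.11]  # administrative cost ratio
print(f"r = {pearson(reporting_score, admin_ratio):.2f}")
```

A value of r near 0, like the 0.02 Ci measured, means better reporters do not systematically spend more on administration.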
Feedback from charities
During 2013, Ci scored 453 Canadian charities across all sectors. After each charity was scored, the results were sent to the charity for feedback and to ensure that nothing material was missed in the scoring. The overall response rate from charities was 35%, with 60% of responses being a simple acknowledgement. Other feedback included:
- Scoring tool is unclear / do not understand some aspect(s)
- Charity believed that Ci missed key information when scoring
- Why should we put this information together for Charity Intelligence?
- Why does Charity Intelligence only look at charities’ websites?
- Our charity is unique/different and cannot be scored like others
- We know we have to improve our reporting and this is very useful to help us get there
- This will help us with what information we should put on our website
- This will help us with what information we should put in our annual report
- We are looking forward to seeing examples of best practices
Ci response to feedback from charities
The most significant feedback from charities concerned the clarity of the scoring tool. We recognize that the format and wording of the questions can be improved, and we are working to clarify them.
We have no desire for charities to put information together especially for Charity Intelligence. We believe that key activity, output, and outcome data should be a cornerstone of any charity’s operations to ensure that they are doing the best for their clients or attaining their mission in the most effective way possible. This data should be captured for the ongoing operation of the charity. And given that donors are asking about how charities spend their donations and what impact the charities are having, once the data is collected, we believe that it should be presented for any donor to see. This is why we believe that it should be included in an annual report or an impact report and then posted on the charity’s website. For donors who care about impact, charities should provide data to demonstrate that impact.
To the question of uniqueness of charities, we fully recognize the diversity of the sector. However, we believe that the scoring tool can be used generally for almost any charity in any sector. We are only asking that charities report on those key metrics that they use in the general management of their charity – whatever those metrics may be. How does management know that it is doing a better job achieving its mission today than 3 years ago? And how will they know if they are doing a better job 3 years from now? Each charity is different and may answer these questions differently, but we believe the same questions can be asked. That being said, we do recognize that some questions do not apply in some sectors. Given this, we have created slightly different versions of the scoring tool for a couple of sectors (hospital foundations as well as United Ways and community and other foundations). We are working on tweaking some questions in other sectors as well. We urge charities to continue to share their thoughts on this.
Next Steps: Ci is continuing to improve our methodologies and we welcome feedback on our results reporting scoring tool.