Foundation Operations: Evaluation -> Program Evaluation -> Learning from Evaluation Reports


Learning From an Evaluation Report: How to Reflect, Process, and Communicate Results

One of the most important questions to settle before undertaking an evaluation is how the foundation will use the results. As part of the initial planning process, foundation staff and evaluators need to determine how the report can best be understood, processed, and presented internally and then to other audiences. Some foundations fund evaluations for their own use only: the evaluations assess their grantmaking, and the results inform internal planning. Other foundations enter the evaluation process fully intending to share everything, the good and the bad.

The Atlantic Philanthropies links its evaluation efforts with “strategic learning.” The foundation’s Strategic Learning and Evaluation team develops evaluation techniques, encourages learning from foundation and grantee experiences, and facilitates dissemination of that learning. Its efforts are aimed at:

  • enhancing grantees’ capacity to improve their work by helping them assess their progress and learn from their experience;
  • improving the foundation’s overall philanthropic work by helping the board and staff plan strategically, assess their progress, and learn how to best target resources; and
  • demonstrating useful and credible lessons that help other funders, policymakers and practitioners work effectively on behalf of disadvantaged and vulnerable people during Atlantic’s lifetime and beyond.

In the James Irvine Foundation’s “Evaluation Policies and Guidelines,” the following principle is stressed: “In the interest of achieving broader impact, the Foundation will assess the value of evaluation results within the larger field, and, where appropriate, package that information for specific target audiences so that the innovations and lessons in our grant-funded programs may be understood, accepted and adopted in other settings.”

Work with others in your foundation to consider various options for sharing results. Having this conversation with the evaluator is also important and can help guide the type of final report that will be most useful to your foundation. If an evaluator needs to subcontract with a writer to produce the kind of report your audiences will actually read, consider that option. Evaluators are experts in evaluation, not necessarily experts in packaging reports for multiple audiences.

The following are some lessons learned about communicating evaluation and research findings:

  • write an executive summary that can stand alone as a separate product;
  • use headings and subheadings to organize the information cohesively;
  • include graphics and other visual presentations;
  • use bullets rather than dense prose, which can obscure key messages;
  • present major findings and recommendations up front;
  • place methods and instruments in an appendix;
  • explain statistical significance and confidence levels in plain language, using examples to demonstrate their meaning;
  • convey findings in both oral (for example, a roundtable discussion or panel presentation) and written forms;
  • use the oral format as an opportunity to encourage reflection on program improvement, planning, and other strategy development; and
  • consider asking peers to relate the evaluation or research findings (e.g., through film, a panel, or co-authorship of a publication), which can sometimes be more effective.

At the end of the evaluation, you will have other valuable information as well: lessons learned from the process of conducting the evaluation itself. It may be helpful to keep notes on the process so that when you fund your next evaluation, you will remember what you might want to do differently.



Resources
The California Wellness Foundation, Evaluations and Lessons Learned from Our Grantmaking (Woodland Hills, CA: 2003). This Web-based series features foundation staff, grantees, and contractors sharing lessons learned and information gleaned from grantmaking programs and strategies. The foundation publishes new installments three or four times a year.

The California Wellness Foundation, Reflections: On Evaluating Our Grants (Woodland Hills, CA: 2004). This issue of Reflections discusses the evaluation experiences of The California Wellness Foundation, organizational milestones for evaluation, specific grants that exemplify aspects of evaluation grantmaking, and some general conclusions about foundation evaluation.

Centers for Disease Control and Prevention, Practical Evaluation of Public Health Programs Workbook (Atlanta, GA: 2007). This course-based workbook defines program evaluation in practical terms, explains its importance in the current public health environment, and identifies barriers to evaluation.

David, Tom, “Evaluation and Foundations: Can We Have an Honest Conversation?” February 2006. In this paper, Tom David offers a candid discussion of evaluation in the field of health philanthropy.

FSG Social Impact Advisors, From Insight to Action: New Directions in Foundation Evaluation (Boston, MA: 2007). This report identifies new ways foundations are using evaluation including more performance-centered approaches that provide foundations and their grantees with current information and actionable insights.

Harvard Family Research Project, “The Evaluation Exchange – Evaluation Methodology” 11 (2), Summer 2005. This issue of “The Evaluation Exchange” periodical focuses on evaluation methodology, covering topics in contemporary evaluation thinking, techniques, and tools. John A. Healy, director of strategic learning and evaluation at The Atlantic Philanthropies, shares ways to position learning as an organizational priority.

The James Irvine Foundation, “Evaluation Policies and Guidelines.” These policies describe the purpose and role of evaluation at Irvine and the roles and responsibilities of program staff for evaluation activity.

The Philanthropic Initiative, Inc., Making a Difference: Evaluating Your Philanthropy (Boston, MA: 2003). From The Venturesome Donor series, this publication presents simple approaches to evaluation that can be tailored to the size, scale, and complexity of the initiative under review and aligned with the donor’s philanthropic goals, strategy, and learning style.
