September 1st, 2015 | RESEARCH
There are a number of places evaluators can share their reports with each other, such as the American Evaluation Association’s eLibrary, the website informalscience.org, and organizations’ own websites. Even though opportunities to share reports online are increasing, the evaluation field lacks guidance on what to include in evaluation reports meant for an evaluator audience. If the evaluation field wants to learn from evaluation reports posted to online repositories, how can evaluators help ensure the reports they share are useful to this audience? This paper explores that question through an analysis of 520 evaluation reports uploaded to informalscience.org. The researchers developed an extensive coding framework aligned with features of evaluation reports and evaluators’ needs, then used it to identify how often elements were included in, or missing from, the reports. The analysis resulted in a set of guiding questions for evaluators preparing reports to share with other evaluators.
Document
Reporting_for_Evaluator_Audience.pdf
Team Members
Amy Grack Nelson, Author, Science Museum of Minnesota
Zdanna King, Author, Science Museum of Minnesota
Funders
Funding Source: NSF
Award Number: 1010924
Tags
Audience: Evaluators
Discipline: Education and learning science | General STEM
Resource Type: Reference Materials | Report
Environment Type: Exhibitions | Informal | Formal Connections | Media and Technology | Professional Development | Conferences | Networks | Public Programs