March 23rd, 2020 | RESEARCH
Embedded assessment (EA) is particularly well suited for evaluating citizen science volunteers’ proficiency in science inquiry skills; however, it remains uncommon in informal education. Using design-based research, we are examining processes to streamline EA development by building on existing data validation procedures within five citizen science projects. Here, we focus on the critical first step: supporting citizen science project leaders in identifying appropriate skills that are important, relevant, accessible, and potentially hiding in plain sight in their existing data. Our research reveals that project leaders can bring broad but uncertain conceptualizations of the volunteer skills relevant to their citizen science efforts. These leaders need time and support to refine expansive notions of skill into concrete, clearly defined specifics that can be assessed from their existing data. Our research shows that identifying appropriate skills for EA is a complex, multi-step procedure that benefits from supporting oral and written tools. Understanding the processes for developing embedded assessments is valuable for education research in diverse venues.
Document
Stylinski-et-al.-2020-NARST-2020.pdf
Team Members
Cathlyn Merrit Davis, Author, University of Maryland Center for Environmental Science
Veronica Del Bianco, Author, University of Maryland Center for Environmental Science
Tina Phillips, Author, Cornell Lab of Ornithology
Karen Peterman, Author, Karen Peterman Consulting Co.
Rachel Becker-Klein, Author, University of Nebraska Omaha
Jenna Linhart, Author, Two Roads Consulting
Citation
Publication: NARST 2020
Tags
Audience: Evaluators | General Public | Learning Researchers | Museum | ISE Professionals
Discipline: General STEM
Resource Type: Conference Proceedings | Reference Materials
Environment Type: Citizen Science Programs | Public Programs