
Evaluation, avaliação, การประเมินผล: A Standardized Language for International Evaluation

The World Biotech Tour was a three-year initiative supported by the Biogen Foundation and managed by the Association of Science-Technology Centers (ASTC). The project was designed to promote a greater understanding of biotechnology through science center and museum programming and provided insights on the resources required and challenges encountered when implementing a year-long, multi-country program in informal science environments. 

Pictured: The "Take a Cellfie" tabletop activity hosted by the National Science Museum Thailand.

The World Biotech Tour spanned 12 different countries and involved students, teachers, researchers, science communication professionals, and the public. But how does one evaluate such a complex initiative, particularly as a U.S.-based institution managing events in regions where English is not always the first language?

ASTC’s approach was to create a common skeletal framework for organizations to build upon. By setting milestones rather than prescribing a detailed execution plan, ASTC gave each organization flexibility in how it met the requirements while still achieving the project’s main objectives. This customizable plan let organizations showcase the biotechnology topics most relevant to their communities and use the informal learning techniques that best fit their audiences. Because the project shifted as it traveled from country to country, the creativity and commitment the science centers brought to interpreting the requirements produced several unexpected, yet positive, outcomes.

The program was evaluated with a two-pronged approach. Each participating science center submitted an internal review of its accomplishments and the challenges it faced implementing the project. In addition, a local third-party evaluator, commissioned by ASTC, conducted an external evaluation of the science center in its region. Collecting evaluations from these two perspectives provided valuable information about the program’s impact both internally, from the museum side, and externally, from the public-facing side. Twenty-four reports from 12 different countries: what could go wrong? Learn whether evaluation methods really differ all that much around the world, and find recommendations for evaluating similar complex programs, in the final summative report.

In August 2018, World Biotech Tour Project Manager Carlin Hsueh interviewed two of the project’s external evaluators: Aunkrisa Sangchumnong, a researcher at Suan Dusit University in Bangkok, Thailand, and the evaluation team of Percebe, a research firm in São Paulo, Brazil, with experience in the areas of science education, museology, public studies, communication and education in museums and similar spaces. Below are translated excerpts from the discussion.

Question 1

The World Biotech Tour was a multi-faceted project that included a variety of audiences that science centers engage with. What evaluation model did you decide to use when evaluating the project in your location?

Percebe evaluation team

São Paulo, Brazil

Percebe: The program’s objectives, its management and implementation processes, and its results were all evaluated with the different audiences. For the evaluation of the processes, we relied primarily on qualitative research methodologies. For the evaluation of the results, we combined qualitative and quantitative methodologies according to the moment and the target audience being evaluated.

Sangchumnong: We decided to use Tyler’s Objective Model in order to emphasize consistency among project objectives, learning experiences, and outcomes. The methodology combined qualitative and quantitative instruments such as individual interviews, group interviews, questionnaires, and observations. Data collection was organized around the activities handled by the National Science Museum Thailand, which comprised five main activities: an Ambassador Camp, a Press Release event, the World Biotech Tour Festival, the National Science and Technology Fair 2016, and Community Outreach.

Question 2

There was a growing desire within the program for evaluators to connect with one another and discuss the approaches they were taking in their evaluations. Do you feel that sharing and co-development are essential for the global evaluation field to grow and improve?

Percebe: No doubt this kind of strategy is very important, but it is difficult to achieve when so many teams from different parts of the world are involved. To improve the collaboration process, the evaluators from each country should connect and discuss the best ways to conduct the evaluation before the project commences. These different groups could be responsible for different parts of the evaluation. In the WBT, the organizers shared reports from past WBT years, and that contact with the materials was our first inspiration in designing our research tools. Unfortunately, beyond that, sharing with the other evaluation teams was limited and not deep enough, and we did not have the opportunity to co-develop any tools. Language is also a big challenge, but it was a good experience.

Aunkrisa Sangchumnong

Suan Dusit University Bangkok, Thailand

Sangchumnong: I strongly agree with this statement. I looked forward to the opportunity to develop assessment tools with the other countries working in the same year, but this was very difficult because of the limited time available. In developing our tools in Thailand, we could only learn from the results of past assessors and compare them with the Thai context before developing our own instruments. However, each tool had to be consistent with the activities that would be held by the National Science Museum Thailand, which differed from other institutions’ activities. As a result, the evaluators had to have open discussions with the organizers. It would be better if evaluators had the opportunity to exchange and develop tools together, because that would lead to a higher standard of evaluation for all. However, the co-development of tools can only happen if the activities across the different countries are consistent.

In general, this project allowed many international evaluators to share their work and experiences through online channels, which really impressed me: I had the opportunity to study how other countries evaluated their programs and to compare their approaches with my own work. Those online exchanges also gave us time to share problems and solutions, which was very useful because everyone had the opportunity to learn from each other.

Question 3

Do you feel this same sharing and co-development should happen with the institutions you are evaluating, such as the science centers?

Percebe: Sharing ideas with the museum team was one of the most important aspects of the evaluation. It improved the evaluation process and helped us collect more reliable data. The opportunity to engage with the ambassadors in the youth program in person (and virtually) contributed to measuring the sociocultural aspects of the project.

Sangchumnong: In my opinion, if the main focus of the work is understanding and improving the overall quality of the program around the world, then it is essential for all parties to talk together, because this is not a competition but an opportunity for the public and for youth to realize their importance in biotechnology. One issue with global programs in particular is that some countries may struggle with time zone differences, which can hamper the sharing process.

Question 4

The final summative report suggests using the multisite evaluation (MSE) approach of the negotiated centralized evaluation model to streamline the evaluation process. The model includes three stages: (1) creating local evaluations, (2) creating the central evaluation team, and (3) negotiating and collaborating on the participatory MSE. Reflecting on your evaluation approach to the World Biotech Tour, what are your suggestions for future evaluations of multisite programs? 

Percebe: Hiring local evaluators seems to be a good strategy. We shared only our methodology and strategies with the other evaluators, so in this format we had few opportunities to learn from each other. If there is an intention to move in this direction, we suggest exchanging specific questions through email groups or similar platforms.

Sangchumnong: In my opinion, each country has different ways of working to make this project successful. Sometimes, when I listened to how an evaluator from a different country was approaching the evaluation, I noticed how different it was from mine. This confused me and made me wonder whether our way of organizing was the best way to meet the conditions set forth by the organizers. Some countries, such as Thailand, expanded their work beyond the organizer’s initial framework to a larger scale, while others decided to focus on a smaller scale. The organizers allowed flexibility in how we approached the evaluation of this project while still giving clear criteria for what they wanted evaluated. Our only suggestion for the future of this work is to continue the program and encourage more teenagers to learn about biotechnology for a better future world.


Visit the World Biotech Tour website to learn more about the institutions that participated in the program, where they were located, the project requirements, summaries of how they implemented the requirements in their regions, and the final summative report that reviews the multiple evaluation methods evaluators used around the world.

Posted by Carlin Hsueh