
Red Cross/Red Crescent Climate Centre Internship Program :: Center for Science and Technology Policy Research


 

notes from the field

These field notes are personal views and do not necessarily reflect the views of the Red Cross/Red Crescent Climate Centre.

Leslie Dodson

Zambia

Blog Post 3
July 14, 2014

 

Evaluating Serious Games for Climate Change Adaptation: Creating Evaluation Frameworks for Serious Games

Serious games and experiential learning “immerse players in experience, simulate entire worlds, and most importantly, engage and inspire broader audiences” (Mitgutsch & Alvarado, 2012). Yet although there is growing enthusiasm for using serious games in humanitarian action, there are few monitoring and evaluation (M&E) tools for assessing their efficacy. Techniques have not yet been developed to explore, let alone understand, whether and how serious games effectively raise awareness, influence behavior change, and shape programs and policies related to climate change adaptation. So, for the past month I’ve been working with the Red Cross/Red Crescent Climate Centre to create a preliminary framework for evaluating some of the Climate Centre’s facilitated games for climate adaptation. We’ve looked at a wide range of evaluation criteria from the fields of humanitarian assistance, educational games, digital games and games for social change. And it helps that we’ve played a lot of games.

Evaluating Game Design and Game Dynamics
We were guided by the IFRC’s Framework for Evaluation, which states that assessments and evaluations should be as systematic and objective as possible and should provide information that is “credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors” (IFRC, 2011). By evaluating facilitated games that support climate change adaptation and disaster risk management, we can better understand which elements of the games are vital to the learning process. Evaluations also help identify strengths and weaknesses in the games so the Climate Centre can build on what works and improve games for greater impact. For example, the Climate Centre wants to understand whether their games:

  • adequately reflect genuine concerns about the impacts of climate change and the landscape of risk associated with (a) resource scarcity, (b) food and other insecurities, and (c) the complex vulnerabilities of the poor (Bours et al., 2014)?
  • contribute to the development of values, skills and knowledge needed for effective climate change adaptation or disaster risk reduction, and if so, how?
  • effectively put climate change adaptation and disaster risk reduction problems in context and help players better understand the associated risks and options?
  • support stakeholders in “inhabiting the complexity of climate risk” and in making decisions that reflect that complexity (Climate Centre)?
  • assist players in exploring plausible futures and testing a range of future scenarios?

The preliminary evaluation framework focuses on game design and game dynamics, distinguishing between the mechanics of a game and the experience of playing it (Mitgutsch & Alvarado, 2012). Game design encompasses the learning content, user interface and game rules (Winn, 2009). Game dynamics covers the gameplay experience and players’ interactions with the game.
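To make this two-part distinction concrete, the sketch below shows one way such a rubric could be represented in code. It is a minimal illustration only: the class names, criteria wording and 1-to-5 scoring scale are hypothetical and are not the Climate Centre’s actual M&E instrument.

# Hypothetical sketch of a two-dimensional rubric for a facilitated serious game.
# Criteria and scores are invented for illustration; this is not the Climate
# Centre's actual evaluation instrument.
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class Criterion:
    """A single evaluation question scored on a 1 (poor) to 5 (strong) scale."""
    question: str
    score: int


@dataclass
class GameEvaluation:
    game_name: str
    # Game design: learning content, user interface and game rules (Winn, 2009)
    design: list[Criterion] = field(default_factory=list)
    # Game dynamics: gameplay experience and players' interactions with the game
    dynamics: list[Criterion] = field(default_factory=list)

    def summary(self) -> dict[str, float]:
        """Average the criterion scores within each dimension."""
        return {
            "design": mean(c.score for c in self.design),
            "dynamics": mean(c.score for c in self.dynamics),
        }


# Example scoring of a fictional flood-preparedness game after a workshop
evaluation = GameEvaluation(
    game_name="Example flood-preparedness game",
    design=[
        Criterion("Rules capture a genuine climate-risk trade-off", 4),
        Criterion("Learning content matches the workshop's adaptation objectives", 5),
    ],
    dynamics=[
        Criterion("Players connected in-game decisions to their own context", 4),
        Criterion("Debrief prompted discussion of real-world options", 3),
    ],
)
print(evaluation.summary())  # {'design': 4.5, 'dynamics': 3.5}

Keeping the two dimensions separate in this way mirrors the framework’s logic: a game can score well on design (sound content and rules) while still falling short on dynamics (how players actually experience and use it), and vice versa.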
           
Difficult to Measure
The benefits of participatory serious games may be profound, but they are difficult to measure. Short-term outcomes, observed during or shortly after play, often register as increased interaction between stakeholders, elevated levels of trust, strengthened relationships and increased comfort with or confidence in complex information. Even though these qualities and achievements are hard to quantify, they may “influence a person to change behaviors and actions” (Mendler de Suarez et al., 2012). Over the longer term, the key question is whether “game-derived learning” carries over to real-world situations and has an impact on adaptation decision-making (Mendler de Suarez et al., 2012). Once players leave the workshops, we want to know whether they independently replicate what they learned from the games and apply it to their own contexts (Suarez & Bachofen, 2013).

Longer-term impact, though, is tricky to study: it encompasses behavior changes and shifts in decision-making that are certain to be compounded by other stressors, experiences, lessons and leadership. Furthermore, impact can accrue to individuals, groups, institutions or institutional decision-making, so any longitudinal evaluation has to take into account that lessons from serious games will be only one of many factors that contribute to the development of adaptive capacity and better risk management actions. The Climate Centre’s approach, therefore, is to develop criteria that assess “discrete impacts within a particular stakeholder community where a game has been used systematically” as part of what is, in reality, a wider range of efforts (Mendler de Suarez et al., 2012).

Refining Evaluation Tools
As an engaging vehicle for learning and social change, serious games “represent a potentially powerful and effective tool” for achieving development and humanitarian objectives (Climate Centre). The Red Cross/Red Crescent Climate Centre continues to develop new games and to incubate their inclusion in humanitarian training and decision-making, helping practitioners tackle a changing climate that will be complex, volatile and uncertain. The Climate Centre is also developing evaluation techniques that lend credibility to the integration of games in disaster risk reduction, climate change adaptation, preparedness and resilience, but the game evaluation process is still in its infancy (Raphael et al., 2010). The next steps are to refine the preliminary M&E tools, develop a database and solicit feedback from game designers, facilitators, participants and other stakeholders.

 

Bours, D., McGinn, C., & Pringle, P. (2014). International and Donor Agency Portfolio Evaluations: Trends in Monitoring and Evaluation of Climate Change Adaptation Programmes. SEA Change CoP, Phnom Penh and UKCIP, Oxford.

International Federation of Red Cross and Red Crescent Societies (IFRC). (2011). IFRC Framework for Evaluation. Planning and Evaluation Department, IFRC Secretariat.

Mendler de Suarez, J., Suarez, P., & Bachofen, C. (Eds.). (2012). Games for a New Climate: Experiencing the Complexity of Future Risks. Task Force Report, November 2012. The Frederick S. Pardee Center for the Study of the Longer-Range Future, Boston University.

Mitgutsch, K., & Alvarado, N. (2012). Purposeful by Design? A Serious Game Design Assessment Framework. In Proceedings of the International Conference on the Foundations of Digital Games (FDG ’12), pp. 121-128.

Raphael, C., Bachen, C., Lynn, K-M., McKee, K., & Baldwin-Philippi, J. (2010). Games for civic learning: A conceptual framework and agenda for research and design. Games and Culture, 5, 199-235.

Red Cross/Red Crescent Climate Centre. www.climatecentre.org.

Suarez, P., & Bachofen, C. (2013). Using Games to Experience Climate Risk: Empowering Africa’s Decision-makers. Final Report: CDKN Action Lab Innovation Grant.

Winn, B.M. (2009). The Design, Play, and Experience Framework. In Ferdig, R.E. (Ed.), Handbook of Research on Effective Electronic Gaming in Education. Kent State University.