
Back to Basics: Explainable AI for Adaptive Serious Games

Game Design, Campus Berlin

Abstract

The spread of AI technologies has given rise to concerns regarding the fairness, appropriateness, and neutrality of machine-made decisions. Explainable AI aims to counter this by enforcing simple or transparent solutions. Adaptive serious games have long been a playground for various approaches to machine-regulated adjustment aimed at increased learner success or player satisfaction. Results have been commendable, but cannot readily be complemented with explainability. Analysing 18 models of adaptivity in game-based learning and related domains, we propose a simple and explainable adaptivity model for serious games. It is designed as a rule-based, short-term decision-making algorithm, proposes game progress as a reliable indicator of learning progress, and adapts to both under- and over-performing learners. We present the implementation of the model in two distinct serious games, and the result of an evaluation in a controlled trial (n=80), demonstrating its suitability for adaptive serious games. In conclusion, we underline the importance of designing serious games both as teaching and playing experiences, and of an iterative design process to assure the quality of the final product.
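The core idea of a rule-based, short-term adaptivity model that treats game progress as a learning indicator can be illustrated with a minimal sketch. This is an assumed implementation for illustration only, not the authors' actual algorithm; all function names, thresholds, and level bounds are hypothetical.

```python
def adapt_difficulty(expected_progress: float,
                     actual_progress: float,
                     difficulty: int,
                     min_level: int = 1,
                     max_level: int = 5,
                     tolerance: float = 0.15) -> int:
    """Rule-based, short-term adaptivity step (hypothetical sketch).

    Compares recent game progress (a proxy for learning progress)
    against the expected pace and nudges difficulty one level up or
    down, covering both over- and under-performing learners.
    """
    deviation = actual_progress - expected_progress
    if deviation > tolerance:
        # Over-performing learner: raise difficulty, clamped to the top level.
        return min(difficulty + 1, max_level)
    if deviation < -tolerance:
        # Under-performing learner: lower difficulty, clamped to the bottom level.
        return max(difficulty - 1, min_level)
    # On pace: leave the difficulty unchanged.
    return difficulty
```

Because every adjustment is a single threshold rule on observable progress, each decision can be explained to the learner in one sentence ("you progressed faster than expected, so the game got harder"), which is the transparency property the abstract emphasises.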

 

DOI: 10.1007/978-3-030-88272-3_6

In book: Serious Games, Joint International Conference, JCSG 2021, Virtual Event, January 12–13, 2022, Proceedings (pp. 67–81)
