Digital (or video) games have been a source of popular entertainment since the 1970s. Games on contemporary consoles such as the Xbox, Wii and PlayStation often sell by the million, while the most popular can generate revenues that far outstrip those of other media, with development and marketing budgets sometimes measured in the hundreds of millions of dollars. Recent digital devices, such as smartphones and tablets, and social media platforms such as Facebook, have provided further gaming opportunities, making names such as Angry Birds and Candy Crush Saga as ubiquitous as popular television programmes, pop bands and best-selling books.
These digital games are usually designed for profit and/or fun. However, parallel to their development and increasing complexity has been a growing interest in their use for non-entertainment purposes: to inform, educate, reinforce or develop skills, or to provide a forum for experimentation and discovery. This interest comes not only from the education sector but also from business and economics, the arts and sport, and the biomedical and other sectors. Terms and phrases such as “serious games”, “pervasive games”, “game-based learning” and “gamification” are used across professional and mainstream media, often hazily and ambiguously, implying that digital games and gaming systems have more ideologically worthy uses beyond ‘merely’ fun and entertainment.
This interest is driven, at least partially, not just by the impressive demographics of digital game players, but also by the enthusiasm and focus that players exhibit. The frequent, over-simplified and flawed assumption is that adding a layer of “more fun” elements (namely, digital gaming elements) to a “less fun” task such as learning will make the latter more enjoyable, thereby increasing the focus of the person carrying out the task (e.g. a student) and producing better results. This is sometimes known as the “chocolate-covered broccoli” approach. While some responsible work is being done to carefully evaluate how games can be used successfully in learning situations, the combination of an aggressive game development industry constantly seeking new markets (e.g. education), popular media pieces on gaming, and institutional demands for better grades provides a seductive route for teachers keen for their students to learn “more” or “more quickly”.
However, the history of game-based learning is littered with failed, and often expensive, attempts to fuse gaming and learning. Students sometimes focus on the entertainment; they are disappointed that the game is less fun than the games they usually play; the learning is compromised; the evaluation of what is learnt proves problematic; or the teacher or facilitator is inexperienced in gaming and unable to provide direction. These and many other problems make developing game-based learning systems both difficult and an inexact, risky “science”.
In addition, digital games, virtual worlds, simulations and online environments are sometimes not the best solution for a specific learning experience. Recognising this is healthy: no technology is inherently suitable for, or superior to all other technologies for, all learning needs. Blindly or evangelistically applying one technology, such as digital games, to all learning situations without careful consideration of learners’ needs may add to the considerable pedagogic history of expensive, mediocre or ineffective learning experiences. It is acceptable, good in fact, not to use digital gaming techniques or technologies where they are not the best solution.
This infokit serves to guide people within the academic sector towards making sensible decisions about the use of digital games in learning. These decisions should include: