What makes games such effective tools for math learning (when they are)?

BrainQuake
Jun 27, 2020
The Gears puzzle, one of three math learning puzzles in the new BrainQuake app, began life as the standalone mobile and Web-based game Wuzzit Trouble (shown here). It provides an interactive interface for solving simultaneous linear equations in up to four unknowns. Expressing it that way makes it sound heavy and (for many people) dull. But as countless school-aged children the world over have discovered, it's actually a lot of fun. It also leads to improved results in written math tests. Image by BrainQuake.
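To make the "simultaneous linear equations" description concrete, here is a rough sketch, in Python, of the kind of integer problem a Gears-style puzzle encodes. This is my own simplified illustration, not BrainQuake's actual mechanics or code: assume each cog turns the big wheel by a fixed whole-number step (forwards or backwards), and the player must land the wheel on a target position in as few turns as possible.

```python
from itertools import product

def solve_gears(step_sizes, target, max_turns=5):
    """Brute-force search for integer turn counts x_i satisfying
    sum(x_i * step_i) == target, minimising the total number of turns.
    Each x_i may be negative (turning a cog backwards)."""
    best = None
    rng = range(-max_turns, max_turns + 1)
    for turns in product(rng, repeat=len(step_sizes)):
        if sum(t * s for t, s in zip(turns, step_sizes)) == target:
            cost = sum(abs(t) for t in turns)
            if best is None or cost < best[1]:
                best = (turns, cost)
    return best  # (turn counts, total turns), or None if unsolvable

# Hypothetical puzzle: cogs that move the wheel by 3 and 5 teeth, target 11.
print(solve_gears([3, 5], 11))
```

Even this toy version shows why the puzzle rewards number sense: the player is implicitly searching for integer solutions to a linear equation, and finding the cheapest one takes genuine arithmetic reasoning, not rote procedure.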

Shortly after we released Wuzzit Trouble in late 2013, two mathematics education researchers at Stanford University conducted a classroom study to see how effective it was as a supplementary tool to support mathematics learning, in particular in developing number sense and overall creative problem-solving ability.

The two researchers learned about the new game from their research leader, Prof Jo Boaler, a Stanford colleague of mine who acts as a volunteer adviser to BrainQuake (unpaid and with no stake in the company).

The study was carried out completely independently of me and of BrainQuake, so I had no information about how the research was going until after the project ended, when Boaler informed me of the results. It therefore came as a real shock to learn that the study had shown significant learning gains (in the target areas of number sense and creative problem solving) by the students, compared to a control group taught by the same teacher, after a remarkably short period of time.

Specifically, the researchers found that use of Wuzzit Trouble for as little as 120 minutes of play, spread over four weeks in ten-minute bursts at the end of math class, produced significant improvements in student number sense and problem solving capacity, as measured by a written pre- and post-test.

Conducted in 2014–15, the Stanford University classroom study of Wuzzit Trouble demonstrated significant learning gains in number sense and problem solving capacity after a very short period of time, raising questions as to what features of the game led to such rapid improvement. Image by Stanford University.

To be clear, what shocked me wasn’t that the game led to improvements in performance. It had, after all, been designed based on many years of research into mathematics learning. Rather, what I had not expected was that those improvements came after a ridiculously small amount of time.

Indeed, I initially suspected the result must be a statistical outlier, a freak result that could not have any meaningful educational implications. So, I contacted another university mathematics learning research team I knew, based in Finland, who were also looking at the use of video games in math education, and asked if they would be interested in running a similar study.

They said they would, with one modification. They wanted to use a fractions-learning game they were developing (Semideus) as a pre- and post-test for Wuzzit Trouble, in addition to the written pre- and post-tests from the Stanford study. I agreed to that, and the study went ahead.

A second study of Wuzzit Trouble, conducted by a university research team in Finland, used a fractions-learning game called Semideus as an additional pre- and post-test. Image courtesy of Prof Kristian Kiili.

The results from the Finnish study were almost identical to those from the Stanford study. I was convinced. There was a “there” there.

Both studies were subsequently published in a peer-reviewed scholarly journal, and the papers can be accessed on the BrainQuake website, so I won't go into details here, other than to make one remark about the written pre- and post-test used in both studies. (The entire test is presented as an appendix to the Stanford paper.)

The test had five questions. Four of them focused on basic number skills related to the Wuzzit Trouble puzzle; the fifth was different. It was a rich performance task about positive integers that was unfamiliar to the students. It required creative thinking to find a solution. As discussed at length in the Stanford paper, it was this question that produced the bulk of the dramatic performance gain of the intervention group, shown in the graph and chart above. The game clearly resulted in improved number sense and problem-solving ability.

At that point, the question foremost in my mind, as a designer of the game, was this: exactly what features of the Wuzzit Trouble game led to such a significant improvement after such a short period of time?

There were many possibilities, among them:

  • they increase engagement with mathematical ideas
  • numbers, arithmetic, and algebraic reasoning all have meaning in the game; they aren't just abstract symbols on paper
  • good math games develop a positive attitude towards, and a willingness to engage in, mathematical practice and problem solving
  • good games encourage a willingness to "play with the problem" before (or instead of) trying to find and apply a known solution technique
  • good games encourage a willingness to take risks and learn through failure
  • good games help develop a growth mindset
  • good games require and develop fluid intelligence (Gf), a complex human ability that allows us to adapt our thinking to a new cognitive problem or situation

There are more possibilities, but those are the ones I felt were particularly significant as possible factors.

Mathematical puzzle games are known to encourage, support, reward, and develop all of those outcomes, so it is possible that all have significance.

The general point I want to make is that using (well-designed) games in mathematics education can do far more than simply provide an alternative, intrinsically appealing way to package familiar classroom math. (Indeed, using games that way frequently results in products that are neither appealing nor educationally effective.) Rather, games can provide a completely separate path to math learning that leverages one or more (often more, I believe) features of games, such as those listed above. That's a lot of educational firepower.

I actually knew all of that long before the Wuzzit Trouble study. Indeed, I used to present that list in talks I gave about the potential of game-based learning in math education. But until we had built Wuzzit Trouble and Stanford conducted their study, it was all theorizing. Part of me did not fully believe it would actually work. As a result, I was as surprised as anyone else when the Stanford results came out and showed that it did. (Maybe I should not have been, but I was. You get used to failure in math; it's a crucial part of the process.)

To this day, I still don’t know for sure what features of the Wuzzit Trouble “Gears” puzzle make it such an effective learning tool. The same is true for our more recent BrainQuake app, where Gears is just one of three puzzles. From an educational perspective, however, it doesn’t really matter; the important thing is that the approach does work. But the scientist in me continues to drive me to find answers. Not least, because knowing why it works can inform the design of future math learning puzzles. As and when I, or my colleagues at BrainQuake and elsewhere, find answers, readers of this blog will be among the first to know.

– Keith

FOOTNOTE: Subsequent to my reaching out to Prof Kristian Kiili of Tampere University in Finland to run a second independent study of Wuzzit Trouble, I started to collaborate with him and his team on a number of university research projects. I am listed as a co-author on their Wuzzit Trouble study.
