A brain-computer interface (BCI) infers our actions (e.g. a movement),
intentions (e.g. preparation for a movement) and psychological states
(e.g. emotion, attention) by interpreting our brain signals, and uses
these inferences to control a computer. Although BCIs have long been
used exclusively to support disabled people (e.g. through
brain-controlled wheelchairs and spellers), with the emergence of low-cost,
portable hardware they have started to be considered for a variety
of human-computer interaction (HCI) applications for non-disabled
people as well. Among these, games have been receiving the interest of
researchers and practitioners from both the BCI and HCI communities.
In BCI research, games have long been used solely to demonstrate
the performance of signal processing and analysis methods. Therefore,
they have been evaluated only for their performance (e.g. recognition
accuracy, information transfer rate). However, games are not meant
to satisfy our practical needs; they satisfy our hedonic needs. They
challenge us, let us live out our fantasies, evoke our memories, and
so on. We look for these experiences while playing games. Thus, what
matters is not the performance of the controller used but the user
experience (UX) of the game.
The UX of a game is a consequence of the player's internal state, the game
characteristics and the context. Evaluating such a complex phenomenon
is non-trivial. Often, UX is measured in terms of other, measurable
concepts such as flow, immersion, presence, social behaviour and so
on. Methods to evaluate UX also vary and include questionnaires, interviews
and observation analysis. Evaluating the UX of BCI games is even
harder because UX evaluations may be biased by the low recognition
performance of the BCI. But this should not keep us from investigating
the UX of BCI games and identifying good and bad practices independent
of performance, because ignoring UX while trying to improve
performance might lead to games that are perfectly functional but not
enjoyed or played by anyone.
In this work, we investigated how BCI control can influence
the UX of a computer game. We considered a futuristic scenario in
which BCI functioned as reliably as other modalities. To simulate
this scenario, we proposed and followed an approach called equalised
comparative evaluation (ECE), in which we equalised the perceived
performance of BCI and several other modalities. We did not simply
introduce artificial errors into the other modalities, as this could reduce player
effectance and, thus, enjoyment. Instead, we manipulated the challenge of the tasks the players performed. We then evaluated and compared
the UX while playing with BCI and with the other modalities.
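To make the idea concrete, the sketch below shows one possible way to equalise perceived performance: iteratively tuning a difficulty parameter of a non-BCI modality until its measured task success rate roughly matches that of the BCI. The function names, parameters and calibration strategy are illustrative assumptions, not the procedure used in our studies.

    # Illustrative ECE-style calibration sketch (hypothetical names and
    # parameters; not the thesis implementation). The idea: make a fast,
    # reliable modality (e.g. mouse pointing) as challenging as the BCI
    # by tuning a difficulty parameter until measured success rates match.

    def measure_success_rate(run_trial, n_trials=30):
        """Run n_trials selection tasks and return the fraction that succeed."""
        successes = sum(1 for _ in range(n_trials) if run_trial())
        return successes / n_trials

    def equalise_challenge(run_bci_trial, run_other_trial, set_difficulty,
                           difficulty=0.0, step=0.1, tolerance=0.05,
                           max_iterations=20):
        """Adjust the other modality's difficulty until its success rate is
        within `tolerance` of the BCI's success rate."""
        target = measure_success_rate(run_bci_trial)
        for _ in range(max_iterations):
            set_difficulty(difficulty)
            rate = measure_success_rate(run_other_trial)
            if abs(rate - target) <= tolerance:
                break
            # Raise difficulty if the other modality is still too easy,
            # lower it if it has become too hard.
            difficulty += step if rate > target else -step
        return difficulty

In such a scheme, the difficulty parameter might be, for example, a pointer latency or a required dwell time; the calibration loop simply brings the two modalities' measured success rates close together before the UX comparison begins.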
Our work consisted of three studies, in each of which we evaluated
different UX-related concepts and used different data collection methods.
In all the studies, participants played an experimental multimodal game
that we had developed, called Mind the Sheep! (MTS!). They controlled
3 dogs using different modalities in order to herd 10 sheep across a
meadow. The goal was to pen all the sheep as quickly as possible.
In Study 1, we showed the effectiveness of our ECE approach. Pairs
of participants played a collaborative, multi-player version of MTS! once
using a BCI that relied on the steady-state visually evoked potential
(SSVEP), once by simple mouse pointing and clicking (non-ECE approach)
and once using a visuomotor control mechanism that was as
challenging to control as BCI (ECE approach). We relied on observation
analysis, interviewing and questionnaires to evaluate UX in terms of
social interaction. We found that challenging control dampened collaborative
social interaction but improved emotional social interaction.
In Study 2, participants played single-player MTS! once using BCI
and once using the visuomotor control mechanism we used in Study
1. They indicated their UX in terms of affect and immersion using
questionnaires. We found that the BCI selection method was more
immersive and that the participants were more indulgent towards BCI
control. One question that arose from our findings was whether the
positive UX of BCI control was due to a novelty effect. This was what
we investigated in Study 3.
In Study 3, we compared the UX of BCI control to that of automatic
speech recogniser (ASR) control, under the assumption that both ASR
and BCI were novel game input modalities. Participants played single-player
MTS! once with BCI, once with ASR and once with the option
of switching between the two. Using questionnaires, they rated their
expectations, engagement and workload levels as well as perceived
game/controller quality. We also conducted interviews and analysed
game logs. The participants rated BCI control higher in hedonic quality
but lower in pragmatic quality than ASR control. The challenge and
novelty of BCI influenced their modality switching behaviour.
The contributions of our work are manifold. The ECE approach
we proposed allows the UX of BCI games (or BCI applications in
general) to be evaluated independently of their performance and the unique
capabilities of BCI to be investigated. The UX evaluation results demonstrate the ways
in which the challenge, cognitive involvement and novelty offered by BCI can
influence the UX of a game. Our discussion of the preferable (and
non-preferable) measures and methods for evaluating the UX of BCI
games provides guidelines to other researchers. Furthermore, the BCI
and physiological computing (PC) frameworks we proposed allow
developers to situate their applications among other BCI or PC systems.