International Journal of Computer Games Technology


Research Article | Open Access


Konrad Biercewicz, Mariusz Borawski, Jarosław Duda, "Method for Selecting an Engagement Index for a Specific Type of Game Using Cognitive Neuroscience", International Journal of Computer Games Technology, vol. 2020, Article ID 2450651, 19 pages, 2020.

Method for Selecting an Engagement Index for a Specific Type of Game Using Cognitive Neuroscience

Academic Editor: Michael J. Katchabaw
Received: 08 Oct 2019
Revised: 22 Jul 2020
Accepted: 31 Jul 2020
Published: 18 Aug 2020


The popularity of video games means that methods are needed to assess their content in terms of player satisfaction right from the production stage. The indicators used in EEG studies can serve this purpose. This publication presents a method developed to determine whether a person likes an arcade game. To this end, six different indicators for measuring consumer involvement in a video game using EEG were compared. The study was conducted using several different games created in Unity, based on observation of the respondents. EEG recordings were used to select the most suitable of the indices studied.

1. Introduction

Computer games can evoke many emotions in a person who likes a given type of gameplay, such as excitement, competition, fun, or relaxation. Game developers are most interested in ensuring that the player becomes engaged in the game to such an extent that he or she keeps performing the current activity for as long as possible. In other words, player involvement is one of the dimensions of the gaming experience and can be associated with many concepts [1, 2] such as flow [3, 4], game flow [5], presence [6, 7], immersion [8–10], pleasure [11], motivation [12–14], enjoyment [15], arousal [16], and fun [17]. Therefore, in order to evoke such a state in the recipient, it is necessary to maintain the player’s involvement at a certain level, e.g., by introducing unexpected twists and turns of the action, which will encourage the player to explore further areas of the game. To assess whether a participant is not discouraged by the game, it is necessary to study, among other things, the participant’s involvement in the game. In addition, the growing community of video game players is creating a demand among game developers for a better approach to indicating when and at what point the player’s interest changes. However, before studying which elements of the game should be improved, one should first examine whether a person likes a particular type of game.

On the face of it, evaluating a player’s involvement in a video game may seem quite an easy task. This is not quite so, as evidenced by the methods provided by manufacturers in the source code, which, for example, count how many blows the player took and how many times he or she played a particular level. It is difficult to check in real time, for example, what the player is looking at and what emotions accompany him or her. An example of such a module is Unity Analytics, available if the game is created in the Unity engine. This tool helps to understand why people are playing the game or, on the contrary, why they are giving up on it. By understanding who is playing the game and how, you can make improvements to the game.

Using Unity Analytics, you can monitor your game in the following areas [18]:
(i) Onboarding. Do players use mechanisms such as tutorials or starter levels?
(ii) Progression. Do players pass the game levels?
(iii) Economics. Does the game economy work as expected?
(iv) Design Validation. Does the game work properly?
(v) Application Validation. Are all application areas used as expected? Are there any items that the players ignore or do not notice?
(vi) Earnings. Are earning strategies optimal?

The use of the Analytics Dashboard helps analyze player behavior. The tool allows you to create paths that show how players pass through a linear step sequence. For example, you can create a tutorial sequence that shows the percentage of users that have passed the tutorial steps. Pathways are useful for identifying places in applications where we lose a player.

The sample path shown in Figure 1 shows the player’s progress in a hypothetical game. Each step on the path is the completion of the game level. There is always the possibility of a decrease from level to level; too much decrease after a given level may indicate a problem at this level. The path will not give the reason—it may be due to a problem with the game, error, boredom, or level being too difficult—but it will indicate the area to be explored.

Turning to the scientific literature, several studies have used questionnaires, which are not an entirely good form of investigating a player’s experience [19]. The problems arise from the formulation and context of these forms [20].

The most common methods used to test a player’s involvement in digital games are the following [21]:
(i) Questionnaire [6]. By asking appropriate questions in a survey, we can determine the degree of involvement in the individual elements of the game.
(ii) Attention-Based Engagement (ABE) Measured Using Eye Tracking [22–26]. The time during which the respondent looks at the elements on the monitor screen is measured. In relation to the time when the person was not looking at the monitor, it is possible to deduce how much attention the person tested gave to the game.
(iii) Electrodermal Activity (EDA). This, also known as galvanic skin response (GSR), allows determining the emotions of the tested person based on the measurement of skin conductivity [16, 27, 28].
(iv) Examination of Facial Expression. Facial expressions are examined by observation [29].
(v) Mouse Clicks and Mouse Movement [30]. Measurements of the number and location of clicks and mouse movements allow determining the level of player involvement during the game.

The above methods are very limited. In a self-reported survey, the researcher relies on the observations of the respondent. The test person may have difficulty in remembering his or her feelings during the whole game. It is difficult to determine the exact time frame within which the growth of interest in the game begins and ends on the basis of this study. Attention-Based Engagement (ABE) depends not only on the player’s engagement but also on the type of game and the situation in the game. During a fight, the player’s focus on the game will be very high, because he/she has to react quickly to the opponent’s actions, while in a game where the player is just wandering around the city, their focus can be lower.

Galvanic skin response (GSR) primarily allows emotions to be identified. However, a player’s involvement is not always emotional. Certain elements of the game may not generate emotions until some success or failure is achieved.

Mouse clicks and mouse movement are strongly dependent on the scenario of the game itself. They can be useful if you are able to refer to other players. You can tell from them which player is more involved and which one is less involved. For example, during a fight, the mouse movements will depend on the weapon chosen by the player and the way the opponent fights. They may, therefore, be incomparable between opponents.

It is necessary to look for methods of engagement research that allow determining the level of engagement at any moment of the game while not depending on other factors. Methods of cognitive neuroscience are one example. They are becoming more and more useful because they allow us to learn the current state of the brain. This task is facilitated by indices calculated on the basis of the recorded signals. In the literature on the subject, numerous indices of engagement can be found, which will be presented later in this article. They allow us to determine the level of human involvement in a given activity at a given moment.

New developments in Brain-Computer Interfaces (BCI) using wireless electroencephalographic (EEG) systems provide recordings and access to neuronal activity, enabling the computer to retrieve and analyze information from brain waves. It has been demonstrated that EEG has the ability to determine the involvement of the user [31–34]. The frequency bands are determined from the EEG signal using the spectral method. More details can be found in Section 2.5.

Using the EEG device, we can determine the preferences of the player, as well as which moment of the game is not very interesting, and we can improve it to make the player fully active in the game. New EEG devices are increasingly being used outside of medicine and are finding more and more new applications.

Using the EEG to measure task engagement is not a new concept. Pope et al. [35] built a system to control the level of automation of tasks based on whether the operator’s involvement had increased or decreased. Freeman et al. [36] extended this system by evaluating the performance of each task with the use of absolute values of engagement. Berka [37] invented a more accurate and effective method for people to interact with technology, with the ability to develop more productive work environments that increase motivation and productivity. The results suggest that engagement measured using the EEG reflects information gathering, visual processing, and attention allocation. Smith and Gevins [38] used a flight simulator to study the reactions of the human brain to low-, medium-, and high-difficulty exercises. Their studies showed increased frontal lobe activity together with decreased activity of parietal lobe alpha waves during demanding tasks. In turn, Yamada [39] measured the activity of theta waves along with eye blinking and discovered that children playing video games had higher theta wave activity during more frequent blinking. These results suggest that interesting tasks cause higher theta wave activity and affect the rate of eye blinking. Kamzanova et al. [31] compared the sensitivity of a series of EEG engagement indices by examining time-pressured individuals performing tasks of varying degrees of stress to determine which index was most effective. McMahan et al. [32] investigated in the game Super Meat Boy whether there is a connection between engagement and arousal during death events and general gameplay events. The results of their research suggest that by combining engagement data with arousal data, we can establish thresholds indicating when a player has left the flow state. Ewing et al. [33] investigated the sensitivity of EEG power in the (frontal) theta and (parietal) alpha bands to changing levels of game demand. In addition, they conducted a study that assessed the adaptive performance of Tetris in terms of system behavior and user experience. Vourvopoulos et al.’s [34] research focuses on the impact of gaming experience on modulating brain activity, as an attempt to systematically identify elements that contribute to high BCI control and that can be used in the design of a neurogame.

The studies cited above [32–34] examine player engagement but focus on topics related to dependencies or BCI. There is no prior research showing what approach to take in order to determine whether a person likes a particular type of gameplay. In such a case, the relationship between engagement and arousal may prove to be more accurate, because we will be able to compare the results between people who like and dislike the given type of game.

This article presents the results of research aimed at developing a method of studying the involvement of a player during an arcade game. On this basis, we can determine which index should be used to establish whether a participant likes this type of gameplay.

2. Materials and Methods

The test procedure is shown in Figure 2. The first step of the test procedure is to formulate the problem. It is presented in Introduction. The second stage of the research procedure is the preparation of a questionnaire. The survey is aimed at learning about the preferences of players—their expectations as to the content of the game, their favorite type of gameplay, and the activities performed in the game. The results of the survey will be the basis for determining the player’s profile.

Several games were needed to find out what kind of game the player was involved in. The games were created in the Unity engine in the C# language. They were made in such a way that in-game events can be recorded in order to synchronize them with the registered EEG signals. EEG signals were recorded in a group of 31 people. First, a pilot study was carried out on several persons to verify the correctness of registration. After the pilot study and the correction of all errors, the proper study was carried out.

The recorded signals were used to calculate the EEG indices. The computed involvement was then compared with the respondents’ answers. Based on this comparison, it was determined which index should be used for a given type of game.

2.1. Questionnaire

EEG data were collected from 31 healthy subjects, with an average age of 23. The subjects were informed about the course of the study. They then signed consent to participate and were seated in a comfortable chair with access to a keyboard and mouse. The next step was to put on a cap, connect the electrodes to the skin of the participant’s head, and connect them to a device that recorded data from the participant’s brain. After performing the above activities, the study was started. Before each game, information was given about what the game was going to be about, what goal to achieve, and how to move around in it. Immediately after the end of the session, each participant was interviewed about their experience with computer games and which type of game they liked most. The players were also asked to rank the games in terms of their involvement.

The gameplay of each participant was saved using the programmed registration in the game. Each screenshot generated a timestamp for the EEG data to determine the position of the beginning and end of each section. The screenshots were saved for later reference during the data analysis phase. In addition to EEG, an eye tracker (“The Eye Tribe”) was used in the study to track what was particularly important to the respondent.
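The synchronization of logged game events with the EEG recording can be sketched as follows. This is a minimal illustration, not the authors’ code: the function names and inputs are invented, and only the 500 Hz sampling rate is taken from the article (Section 2.3).

```python
# Hypothetical sketch: mapping in-game event timestamps to EEG sample
# indices, assuming the 500 Hz sampling rate used in this study.
FS = 500  # EEG sampling frequency in Hz

def event_to_sample(event_time_s, recording_start_s):
    """Convert an event timestamp (seconds) to an EEG sample index."""
    return int(round((event_time_s - recording_start_s) * FS))

def game_segment(eeg, start_s, end_s, recording_start_s=0.0):
    """Cut the EEG samples belonging to one game out of the recording.

    eeg is a per-channel sequence of samples; start_s/end_s are the
    logged start and end timestamps of the game.
    """
    i0 = event_to_sample(start_s, recording_start_s)
    i1 = event_to_sample(end_s, recording_start_s)
    return eeg[i0:i1]

segment = game_segment(list(range(500 * 120)), start_s=10.0, end_s=70.0)
# a 60-second game yields 60 * 500 = 30000 samples
```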

2.2. Description of the Games

The games were downloaded from the Unity Asset Store and adapted to the needs of this research in the Unity engine. Before the start of the game, there was a short instruction on how to move and the purpose of the game.

The following games were created:
(i) 2D Shooter [40]. The goal was to achieve the best possible result by killing monsters (Figure 3).
(ii) Puzzle [41]. The game consisted of arranging a large picture from small fragments with characteristic shapes (Figure 4).
(iii) Hexlogic [42]. The task was to indicate the excess number of figures (Figure 5).
(iv) 3D Shooter [43]. Just like in the 2D shooter, the goal was to achieve the best possible result by killing creatures (Figure 6).
(v) Tower Defense [44]. The task was to stop successive waves of enemies by building defense towers (Figure 7).
(vi) Flying Mushroom [45]. The task was to fly the mushroom to the indicated location (Figure 8).
(vii) Racing Game. The game consisted of driving as many meters as possible while avoiding obstacles (Figure 9).
(viii) Ball Control [46]. The task was to avoid obstacles and achieve the highest possible result (Figure 10).

Each game lasted one minute. During each game, the start and end times of the game were recorded and then saved to an Excel file.

2.3. Electrodes Used in the Examination

The cap with 21 electrodes placed in AF3, AF4, F3, F4, F7, F8, FC5, FC6, P7, P8, T7, T8, O1, O2, P3, C3, Pz, Fz, Cz, FPz, and P4 was used (see Figure 11). The channels have been distributed according to the 10-10 system, the international EEG electrode distribution system [47]. The electrodes required a dampened socket to improve conductivity. The sampling frequency was 500 Hz.

2.4. Game Survey

The participants answered a series of questions evaluating previous experience with video games and other personal characteristics. The participants were asked to state their favorite type of game: arcade (27 people) or logical (4 people). The participants were also asked if they would describe themselves as “recreational players”; the answers were positive.

The last questions concerned the game itself and, more specifically, which elements of the game should be improved according to them, as well as in which situations they believe that their involvement grew and decreased.

2.5. Analysis of Data

All data were analyzed using MATLAB R2019a and Statistica. Events such as eye blinks, head movements, or body movements may introduce undesired artifacts into the EEG recording. Most EEG analyses require the removal of such events in order to identify medical problems. However, this is not a problem when analyzing gameplay: such events are common in everyday play [48].

The EEG measured on the scalp corresponds to a recording at frequencies of 0.5 to 30 Hz. Four basic bands are recognized in this range [49]:
(i) Delta (0.5-4 Hz). Delta brain waves are generated in the deepest meditation and sleep. Delta waves suspend external consciousness and are a source of empathy. In this state, healing and regeneration are stimulated, which is why deep restorative sleep is so important for the healing process.
(ii) Theta (4-8 Hz). Theta waves occur most frequently in sleep but are also dominant in deep meditation. Theta waves are noticed during learning or remembering.
(iii) Alpha (8-12 Hz). Alpha activity is best seen in the posterior regions of the brain and is typical for relaxation. It occurs when closing the eyes.
(iv) Beta (12-30 Hz). Beta activity can be divided into low-activity waves (12-15 Hz), medium-activity waves (15-20 Hz), and high-activity waves (18-30 Hz). The middle range of beta activity is associated with increased energy, anxiety, performance, and concentration. It is most visible in the frontal regions.

The EEG spectral signal was analyzed using a Fast Fourier Transform (FFT) and an overlapping 3-second time frame with a 1-second jump at the relevant alpha, beta, and theta frequencies listed in Table 1.
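The windowed spectral analysis described above can be sketched as follows. This is a minimal NumPy illustration, not the authors’ MATLAB code; the band edges follow the ranges given in the text, the 500 Hz sampling rate comes from Section 2.3, and all names are illustrative.

```python
import numpy as np

# Band power from a 3-second FFT window advanced in 1-second steps.
FS = 500                      # sampling rate (Hz)
WIN, HOP = 3 * FS, 1 * FS     # 3 s window, 1 s hop
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_powers(signal):
    """Return a list of {band: power} dicts, one per window."""
    freqs = np.fft.rfftfreq(WIN, d=1.0 / FS)
    out = []
    for start in range(0, len(signal) - WIN + 1, HOP):
        spectrum = np.abs(np.fft.rfft(signal[start:start + WIN])) ** 2
        out.append({
            band: float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for band, (lo, hi) in BANDS.items()
        })
    return out

t = np.arange(10 * FS) / FS               # 10 s of synthetic data
alpha_wave = np.sin(2 * np.pi * 10 * t)   # a 10 Hz tone falls in the alpha band
powers = band_powers(alpha_wave)
```

A 10 s signal yields 8 overlapping windows, and for the synthetic 10 Hz tone the alpha-band power dominates in each of them.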

Bandwidth | Frequency (Hz)
Theta | 4-8
Alpha | 8-12
Beta | 12-30


Measurement of the level of engagement is one of the elements determining the player’s experience while playing a computer game. In particular, it can be used to determine the player’s preference for a given type of game.

For this purpose, the engagement indices used to calculate the engagement level are those presented in Table 2.
(i) Index 1 [32]. It was calculated for each participant using the following electrodes: AF3, AF4, F3, F4, F7, F8, FC5, FC6, P7, P8, T7, T8, O1, and O2.
(ii) Index 2 [32]. It was calculated using the average registration value from the theta band of electrodes placed on the frontal lobe (F3, F4, FC5, and FC6) divided by the average registration value from the alpha band of electrodes placed on the parietal lobe (P7, P8).
(iii) Index 3 [32]. Theta was calculated using the average registration value from the electrodes placed on the frontal lobe: AF3, AF4, F3, F4, F7, F8, FC5, and FC6.
(iv) Indices 4 and 5 [31]. They were calculated using the average registration value of the following electrodes: F3, F4, F7, F8, Cz, P3, Pz, and P4.
(v) Index 6 [50]. It was calculated using the average registration value from the following electrodes: P3, C3, Pz, Fz, Cz, and FPz.

Index number | Formula | Counting method

Index 1 | | Average registration value of all electrodes on the head
Index 2 | Theta/alpha | Average registration value from the theta band of frontal lobe electrodes and the alpha band of parietal lobe electrodes
Index 3 | Theta | Average registration value from the theta band of electrodes placed on the frontal lobe
Index 4 | | Average registration value from electrodes: F3, F4, F7, F8, Cz, P3, Pz, and P4
Index 5 | | Average registration value from electrodes: F3, F4, F7, F8, Cz, P3, Pz, and P4
Index 6 | | Average registration value from electrodes: P3, C3, Pz, Fz, Cz, and FPz
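Index 2 (frontal theta divided by parietal alpha, as stated in the text and in the conclusion) can be sketched as follows; the input band-power values and the function name are hypothetical.

```python
import numpy as np

# Sketch of index 2: mean frontal theta power divided by mean parietal
# alpha power (theta/alpha), using the electrode sets named in the text.
FRONTAL = ("F3", "F4", "FC5", "FC6")   # theta electrodes
PARIETAL = ("P7", "P8")                # alpha electrodes

def index2(theta_power, alpha_power):
    """theta_power/alpha_power map electrode name -> band power."""
    frontal_theta = np.mean([theta_power[e] for e in FRONTAL])
    parietal_alpha = np.mean([alpha_power[e] for e in PARIETAL])
    return frontal_theta / parietal_alpha

theta = {"F3": 4.0, "F4": 6.0, "FC5": 5.0, "FC6": 5.0}
alpha = {"P7": 2.0, "P8": 3.0}
# frontal theta mean = 5.0, parietal alpha mean = 2.5 -> index = 2.0
```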

The variance is then analyzed to see whether there is a statistically significant difference between the calculated engagement indices. For this purpose, the ANOVA statistical test was used. Before comparing population means, the Shapiro-Wilk statistical test was used to check whether the examined features have a distribution similar to normal. The study of distribution normality can de facto be disregarded because, with sufficiently large samples, a breach of the normality assumption should not cause serious problems; this means that parametric procedures can be used even if the data are not normally distributed [51]. The number of samples in the tests performed is greater than 100, so the data distribution can be ignored [51]. If any differences were detected, the post hoc Tukey HSD test was used. Post hoc tests are carried out as a further step in the analysis of variance, since the analysis of variance itself only tells whether differences between the compared means exist or not; we do not know between which groups these differences occur. A significant F statistic only indicates the validity (or otherwise) of rejecting the null hypothesis. If we reject it, we have to find out whether all the means are different or just some.
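The statistical pipeline described above (Shapiro-Wilk normality check followed by one-way ANOVA) can be sketched in Python with SciPy; this is an illustration on synthetic data, not the Statistica workflow used in the study.

```python
import numpy as np
from scipy import stats

# Three synthetic "engagement indices" with different means, n > 100 each,
# mirroring the sample sizes discussed in the text.
rng = np.random.default_rng(0)
indices = [rng.normal(loc=m, scale=1.0, size=120) for m in (1.0, 1.2, 2.0)]

# Shapiro-Wilk normality check per index (informative here, since with
# large samples parametric procedures are used regardless).
for sample in indices:
    w_stat, p_norm = stats.shapiro(sample)

# One-way ANOVA across the indices; a significant F statistic would be
# followed by a post hoc Tukey HSD test to locate the differing pairs.
f_stat, p_value = stats.f_oneway(*indices)
if p_value < 0.05:
    pass  # reject H0: at least one index mean differs
```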

After selecting which groups differ from each other, the average over the whole game was calculated. This allowed, for each index, the selection of the three games for which the average engagement was highest. After comparing the first positions with the opinion of the respondent, the index that corresponded most closely to that opinion was selected. Where two different indices indicated the same game, the results of the post hoc analysis were considered. Where they did not appear in the table, the second or third position was taken into account for comparison with the opinions of the respondents.
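The selection step described above (averaging each index over a whole game and taking the three highest-scoring games) can be sketched as follows; the game names and mean values are illustrative, not measured data.

```python
# Sketch: pick the three games with the highest mean engagement for one
# index, to be compared against the respondent's own ranking.
def top_three(mean_engagement):
    """mean_engagement maps game name -> mean index value for that game."""
    ranked = sorted(mean_engagement.items(), key=lambda kv: kv[1], reverse=True)
    return [game for game, _ in ranked[:3]]

means = {"3D shooter": 1.8, "Puzzle": 1.2, "Racing game": 1.6,
         "Hexlogic": 0.9, "Flying mushroom": 1.7}
# the three highest means belong to 3D shooter, Flying mushroom, Racing game
```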

3. Results

The favorite games mentioned by the respondents are listed in Table 3. For each person examined, the index with the greatest engagement was assigned, along with the game at which this occurred (Table 4). More detailed data can be found in Table 5.

Examining number | 1st place | 2nd place | 3rd place

1 | Flying mushroom | 3D shooter | Ball control
2 | Puzzle | Tower defense | Other
3 | Racing game | Other
4 | Tower defense | 3D shooter | 2D shooter
5 | 3D shooter | Ball control | 2D shooter
6 | Puzzle | Hexlogic | Ball control
7 | Flying mushroom | Other
8 | 2D shooter | Ball control | 3D shooter
9 | Racing game | Flying mushroom | Ball control
10 | I was not interested in anything
11 | Ball control | Tower defense | 2D shooter
12 | Flying mushroom | Other
13 | Flying mushroom | Other
14 | Flying mushroom | Other
15 | Racing game | Puzzle | Hexlogic
16 | 3D shooter | Flying mushroom | Other
17 | Flying mushroom | 3D shooter | Racing game
18 | Flying mushroom | Tower defense | Racing game
19 | 3D shooter | Puzzle | Other
20 | 3D shooter | Puzzle | Other
21 | 3D shooter | 2D shooter | Puzzle
22 | 3D shooter | 2D shooter | Puzzle
23 | Racing game | Tower defense | Puzzle
24 | Racing game | 3D shooter | 2D shooter
25 | 3D shooter | Flying mushroom | Racing game
26 | 3D shooter | Racing game | Puzzle
27 | 3D shooter | Hexlogic | Flying mushroom
28 | Tower defense | 3D shooter | Puzzle
29 | Flying mushroom | 2D shooter | Racing game
30 | 3D shooter | 2D shooter | Other
31 | Flying mushroom | Tower defense | Puzzle

Examining number | Index 1 | Index 2 | Index 3 | Index 4 | Index 5 | Index 6


prof1_0: 2D shooter; prof1_1: puzzle; prof1_2: hexlogic; prof2: 3D shooter; prof3: tower defense; prof4: flying mushroom; prof5: racing game; prof6: ball control.

Examining number | Index 1 | Index 2 | Index 3


Table 6 presents the averages of indices between which there are no significant differences. The post hoc Tukey HSD test had to be applied for each participant because the null hypothesis of the ANOVA test (Table 7; for other subjects, the results were very similar) had to be rejected. The mean square of the error is the variance we expect, and the mean square (MS) is the variance in our dataset. We can see that the variance is much greater than the one we would expect, so the p value is very low. In this case, as for the other respondents, the p value indicates that the alternative hypothesis H1 should be accepted; i.e., there are differences between the engagement indices. When two indices appear together in Table 6, choosing between them is practically meaningless because there are no significant differences between their averages.


(Table 6: for each person examined and each game, the averages of indices between which no significant differences were found.)

Before the ANOVA test, normality of the distribution was tested (Figure 12). One example is shown because the results were similar for the other games and subjects. In all cases, the p values obtained (Table 8) indicated a lack of normality of the distribution. Nevertheless, parametric procedures can be used because the sample count is more than 100 [51]. A fixed significance level was established for all tests.

Zmn1 (Figure 12(a)) | Zmn2 (Figure 12(b)) | Zmn3 (Figure 12(c)) | Zmn4 (Figure 12(d)) | Zmn5 (Figure 12(e)) | Zmn6 (Figure 12(f))


Zmn1: index 1; Zmn2: index 2; Zmn3: index 3; Zmn4: index 4; Zmn5: index 5; Zmn6: index 6.

Analysis has shown that index 2 best reflects a player’s engagement, which translates into the player’s preferences. The number of situations that differed from the opinion of the person examined was 9. Of these, 1 opinion should be rejected because the person examined (no. 10) was not interested in any game at all; for 5 opinions, the indicated game differed but the type of game agreed; and the remaining 3 opinions agreed neither with the liked game nor with the type of game.

4. Discussion

The main objective of the study was to develop a method that would allow determining player preferences based on engagement. To this end, the different indices of engagement were evaluated while playing different types of computer games, each lasting 1 minute. As a result, it was determined which index most closely reflects the actual engagement in the game. MATLAB software and survey data were used for the analysis. Among the most important results, we can mention the following:
(1) Table 9 shows the number of opinions for arcade games that matched the index.
(2) Table 10 shows the persons for whom index 2 indicated another game, but it was still an arcade game.
(3) Among the respondents, there were people whose favorite game was a puzzle or tower defense game (Table 11). In these cases, index 2 did not indicate arcade games. Therefore, we can presume that it is also able to detect cases where players do not like arcade games. We cannot state this conclusively, because the sample is too small for a bolder statement.

Index 1 | Index 2 | Index 3 | Index 4 | Index 5 | Index 6

Number of persons | 0 | 18 | 9 | 1 | 0 | 0

Person no.Index


prof1_0: 2D shooter; prof4: flying mushroom; prof5: racing game.

Person no. | Game | Index 2

2 | Puzzle | Tower defense
4 | Tower defense | Puzzle
28 | Tower defense | Puzzle

In order to carry out a test that will allow us to determine whether a person likes arcade games, we should use the test procedure shown in Figure 13. The first stage of the test procedure is the preparation of 3 different types of games. The second stage is to conduct the EEG test using the appropriate electrode configuration, i.e., F3, F4, FC5, FC6, P7, and P8, to calculate the engagement index. The test must last at least 1 minute. The last step is to analyze the data obtained: calculate the average engagement over the whole duration of each game, and then sort the received values from the highest to the lowest. If the highest value belongs to an arcade game, it means that the player likes this type of game.
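The decision rule of this procedure can be sketched as follows; the split of games into arcade and non-arcade is assumed from the survey and Table 11, and all values and names are illustrative.

```python
# Sketch of the decision rule: if the game with the highest mean
# engagement (index 2) is an arcade game, conclude that the participant
# likes arcade games. The arcade/non-arcade split is an assumption
# based on the game descriptions in the article.
ARCADE = {"2D shooter", "3D shooter", "Flying mushroom",
          "Racing game", "Ball control"}

def likes_arcade(mean_index2):
    """mean_index2 maps game name -> mean index 2 over the whole game."""
    best_game = max(mean_index2, key=mean_index2.get)
    return best_game in ARCADE

scores = {"3D shooter": 1.9, "Puzzle": 1.4, "Hexlogic": 1.1}
# the best-scoring game is an arcade game, so the rule returns True
```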

These conclusions should be understood in the context of certain restrictions. First of all, most people in the study liked arcade games; only 5 people said they liked a different type of game. As a result, we cannot fully determine whether there will be any conflict in the case of other types of games. At this stage, it was examined which index should be used for arcade games. The next step will be to prepare other types of games in order to establish the credibility of the selected index and to ensure that the current scores have no anomaly due to the small number of people who do not like arcade games.

5. Conclusion

The results of research aimed at selecting the appropriate index of engagement using EEG to determine whether a person likes arcade games have been presented. In this study, it was decided to define the profile of the player on the basis of the ordering from the most engaging to the least engaging game. Based on these rankings, the optimal indicator is index 2 (theta/alpha), because it best represents the opinions of the respondents.

It should be taken into account that these findings are based on a single type of game and that further research will be needed to extend the methodological approach to assessing which type of game is of greatest interest, not only by analyzing the player’s involvement but also by adding further indices from other categories, such as concentration. Nevertheless, these results confirm the view that index 2 is a strong indicator of enjoyment for some types of games, and this shows real promise for future research with a larger, more diverse set of participants and possibly a different set of games.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.


References

  1. H. Schoenau-Fog, “The player engagement process - an exploration of continuation desire in digital games,” in Proceedings of DiGRA 2011 Conference: Think Design Play, Hilversum, Netherlands, 2011.
  2. M. Filsecker and M. Kerres, “Engagement as a volitional construct,” Simulation & Gaming, vol. 45, no. 4–5, pp. 450–470, 2015.
  3. M. Csikszentmihalyi, Flow: The Psychology of Optimal Experience, Harper Perennial Modern Classics, 1990.
  4. J. Chen, “Flow in games (and everything else),” Communications of the ACM, vol. 50, no. 4, pp. 31–34, 2007.
  5. P. Sweetser and P. Wyeth, “GameFlow,” Computers in Entertainment, vol. 3, no. 3, p. 3, 2005.
  6. M. Lombard and T. Ditton, “At the heart of it all: the concept of presence,” Journal of Computer-Mediated Communication, vol. 3, no. 2, 1997.
  7. R. Tamborini and P. Skalski, “The role of presence in the experience of electronic games,” in Playing Video Games: Motives, Responses, and Consequences, pp. 225–240, 2006.
  8. A. McMahan, “Immersion, engagement, and presence: a method for analyzing 3-D video games,” in The Video Game Theory Reader, pp. 67–86, 2003.
  9. E. Brown and P. Cairns, “A grounded investigation of game immersion,” in CHI ’04 Extended Abstracts on Human Factors in Computing Systems, pp. 1297–1300, New York, NY, USA, 2004.
  10. C. Jennett, A. L. Cox, P. Cairns et al., “Measuring and defining the experience of immersion in games,” International Journal of Human-Computer Studies, vol. 66, no. 9, pp. 641–661, 2008.
  11. B. Costello and E. Edmonds, “A tool for characterizing the experience of play,” in Proceedings of the Sixth Australasian Conference on Interactive Entertainment, pp. 2:1–2:10, New York, NY, USA, 2009.
  12. N. Yee, “Motivations for play in online games,” Cyberpsychology & Behavior, vol. 9, no. 6, pp. 772–775, 2006.
  13. A. K. Przybylski, C. S. Rigby, and R. M. Ryan, “A motivational model of video game engagement,” Review of General Psychology, vol. 14, no. 2, pp. 154–166, 2010.
  14. I. Iacovides, J. Aczel, E. Scanlon, J. Taylor, and W. Woods, “Motivation, engagement and learning through digital games,” International Journal of Virtual and Personal Learning Environments, vol. 2, no. 2, pp. 1–16, 2011.
  15. W. Ijsselsteijn, W. Hoogen, C. Klimmt et al., “Measuring the experience of digital game enjoyment,” in Proceedings of Measuring Behavior, Maastricht, the Netherlands, 2008.
  16. N. Ravaja, T. Saari, M. Salminen, J. Laarni, and K. Kallinen, “Phasic emotional reactions to video game events: a psychophysiological investigation,” Media Psychology, vol. 8, no. 4, pp. 343–367, 2006.
  17. R. Koster and W. Wright, A Theory of Fun for Game Design, Paraglyph Press, 2004.
  18. Unity Technologies, “Unity - Manual: Unity Analytics Overview,” accessed Sep. 02, 2019.
  19. Y.-T. Chiang, C.-y. Cheng, and S. S. J. Lin, “The effects of digital games on undergraduate players’ flow experiences and affect,” in 2008 Second IEEE International Conference on Digital Game and Intelligent Toy Enhanced Learning, pp. 157–159, Banff, BC, 2008.
  20. M. Slater, “Measuring presence: a response to the Witmer and Singer Presence Questionnaire,” Presence: Teleoperators and Virtual Environments, vol. 8, no. 5, pp. 560–565, 1999.
  21. R. M. Martey, K. Kenski, J. Folkestad et al., “Measuring game engagement: multiple methods and construct complexity,” Simulation & Gaming, vol. 45, no. 4–5, pp. 528–547, 2015.
  22. H. O’Brien and E. Toms, “What is user engagement? A conceptual framework for defining user engagement with technology,” Journal of the American Society for Information Science and Technology, vol. 59, no. 6, pp. 938–955, 2008.
  23. E. Ismail, S. N. Sidek, M. R. Khan, and N. A. Jalaludin, “Analysis of engagement factor in trajectory tracking-based experiment,” in 2012 International Conference on Computer and Communication Engineering (ICCCE), pp. 787–791, Kuala Lumpur, 2012.
  24. T. Renshaw, R. Stevens, and P. Denton, “Towards understanding engagement in games: an eye-tracking study,” On the Horizon, vol. 17, pp. 408–420, 2009.
  25. T. Teixeira, M. Wedel, and R. Pieters, “Emotion-induced engagement in Internet video advertisements,” Journal of Marketing Research, vol. 49, 2012.
  26. J. Read, S. MacFarlane, and C. Casey, “Endurability, engagement and expectations: measuring children’s fun,” in Interaction Design and Children, 2009.
  27. S. Lim and B. Reeves, “Computer agents versus avatars: responses to interactive game characters controlled by a computer or other player,” International Journal of Human-Computer Studies, vol. 68, no. 1, pp. 57–68, 2010.
  28. R. Mandryk and K. Inkpen, “Physiological indicators for the evaluation of co-located collaborative play,” in Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work (CSCW ’04), New York, NY, USA, 2004.
  29. A. Karimi and Y. P. Lim, “Children, engagement and enjoyment in digital narrative,” in Proceedings of ASCILITE - Australian Society for Computers in Learning in Tertiary Education Annual Conference 2010, pp. 475–483, 2010.
  30. R. Dale, C. Kehoe, and M. J. Spivey, “Graded motor responses in the time course of categorizing atypical exemplars,” Memory & Cognition, vol. 35, no. 1, pp. 15–28, 2007.
  31. A. T. Kamzanova, G. Matthews, A. M. Kustubayeva, and S. M. Jakupov, “EEG indices to time-on-task effects and to a workload manipulation (cueing),” World Academy of Science, Engineering and Technology, vol. 80, pp. 19–22, 2011.
  32. T. McMahan, I. Parberry, and T. D. Parsons, “Evaluating player task engagement and arousal using electroencephalography,” Procedia Manufacturing, vol. 3, pp. 2303–2310, 2015.
  33. K. C. Ewing, S. H. Fairclough, and K. Gilleade, “Evaluation of an adaptive game that uses EEG measures validated during the design process as inputs to a biocybernetic loop,” Frontiers in Human Neuroscience, vol. 10, p. 223, 2016.
  34. A. Vourvopoulos, S. Bermudez i Badia, and F. Liarokapis, “EEG correlates of video game experience and user profile in motor-imagery-based brain-computer interaction,” The Visual Computer, vol. 33, no. 4, pp. 533–546, 2017.
  35. A. T. Pope, E. H. Bogart, and D. S. Bartolome, “Biocybernetic system evaluates indices of operator engagement in automated task,” Biological Psychology, vol. 40, no. 1–2, pp. 187–195, 1995.
  36. F. G. Freeman, P. J. Mikulka, L. J. Prinzel, and M. W. Scerbo, “Evaluation of an adaptive automation system using three EEG indices with a visual tracking task,” Biological Psychology, vol. 50, no. 1, pp. 61–76, 1999.
  37. C. Berka, D. Levendowski, M. Lumicao et al., “EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks,” Aviation, Space, and Environmental Medicine, vol. 78, 5 Supplement, pp. B231–B244, 2007.
  38. M. Smith and A. Gevins, “Neurophysiologic monitoring of mental workload and fatigue during operation of a flight simulator,” in Proceedings of SPIE - The International Society for Optical Engineering, vol. 5797, 2005.
  39. F. Yamada, “Frontal midline theta rhythm and eyeblinking activity during a VDT task and a video game: useful tools for psychophysiology in ergonomics,” Ergonomics, vol. 41, no. 5, pp. 678–688, 2010.
  40. “2D Platformer - Asset Store,” accessed Sep. 02, 2019.
  41. Ehthesham, “Drag And Drop Puzzle Lite - Asset Store,” accessed Sep. 02, 2019.
  42. “HexLogic Game - Asset Store,” accessed Sep. 02, 2019.
  43. “Survival Shooter Tutorial [LEGACY] - Asset Store,” accessed Sep. 02, 2019.
  44. “Tower Defense Toolkit (TDTK) Free - Asset Store,” accessed Sep. 02, 2019.
  45. “Fungus - Asset Store,” accessed Sep. 02, 2019.
  46. Finalboss, “Crazy Ball Complete Game Template (free tutorial game) - Asset Store,” accessed Sep. 02, 2019.
  47. V. Jurcak, D. Tsuzuki, and I. Dan, “10/20, 10/10, and 10/5 systems revisited: their validity as relative head-surface-based positioning systems,” NeuroImage, vol. 34, no. 4, pp. 1600–1611, 2007.
  48. D. Plass-Oude Bos, B. Reuderink, B. van de Laar et al., “Brain-computer interfacing and games,” in Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction, D. S. Tan and A. Nijholt, Eds., pp. 149–178, Springer London, London, 2010.
  49. M. Hosťovecký and B. Babusiak, “Brain activity: beta wave analysis of 2D and 3D serious games using EEG,” Journal of Applied Mathematics, Statistics and Informatics, vol. 13, no. 2, pp. 39–53, 2017.
  50. M. Chaouachi and C. Frasson, “Mental workload, engagement and emotions: an exploratory study for intelligent tutoring systems,” Intelligent Tutoring Systems, vol. 7315, 2012.
  51. A. Ghasemi and S. Zahediasl, “Normality tests for statistical analysis: a guide for non-statisticians,” International Journal of Endocrinology and Metabolism, vol. 10, no. 2, pp. 486–489, 2012.

Copyright © 2020 Konrad Biercewicz et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
