I’m a Master of Arts now!
After two years of university, six months working on my thesis and an internship at Goodgame Studios, I can finally call myself a Master of Arts. Sure, in retrospect I’d rather have a degree that says Master of Science, but I’m not complaining. Instead I’d like to write a little bit about my master’s thesis. Unfortunately it was written in German, so I can’t quote anything, but I’ll try to capture the essence in English. The title, given by my professor, was – roughly translated – Dynamic difficulty adjustment in games, controlled via player emotion recognized by RealSense, embedded in Unity.
The title pretty much sums up what I did: I tried to adjust the difficulty of games using Unity and Intel’s RealSense camera. When I had the idea for the thesis, I was planning to use the RealSense SDK, which offered emotion detection at that point. Unfortunately, by the time I started working on it, Intel had removed the emotion detection from its SDK and I had “only” face detection to rely on. While this shifted my focus a little away from the question “How can emotion-based DDA be integrated into different game genres?”, it gave me the chance to explore the psychology behind emotions a little more, which was really interesting. I’ll start with a small overview of emotions, talk about RealSense, and then go into how I used emotions and RealSense to create two games that were controlled by emotions.
You look stressed! … I think
I based my thesis on the work of Paul Ekman – just like about everyone else working with emotions in any way. He did a lot of research to settle the question of whether emotions and their expressions are universal or differ from culture to culture. While some scientists still have their doubts, most accept his findings. According to Ekman, there are six emotions with universal facial expressions: anger, sadness, fear, surprise, happiness and disgust. For my thesis, none of these seemed like a good fit; I was thinking more of states like stress and boredom. While Ekman mentioned stress in the context of both disgust and anger, there is very little scientific research on the expression of boredom. In the end I settled on the following indicators for my emotion evaluation: yawning and little to no facial movement representing boredom, frowning and compressed lips representing stress, and smiling and brow raising as neutralizers that moved the emotion value I was computing back toward the neutral midpoint.
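The mapping from indicators to a single emotion value could be sketched roughly like this. This is a minimal Python illustration of the idea described above, not the actual thesis code; the function name, the weights and the decay factor are my own illustrative assumptions:

```python
def update_emotion(value, indicators,
                   stress_weight=0.2, boredom_weight=0.2, neutral_decay=0.5):
    """Return the new emotion value after one detection frame.

    `value` is a scalar in [-1, 1]: negative means bored, positive means
    stressed, 0 is the neutral midpoint. `indicators` is a dict of booleans
    for the facial indicators detected in this frame (hypothetical keys).
    """
    if indicators.get("frowning") or indicators.get("lips_compressed"):
        value += stress_weight          # stress indicators push upward
    if indicators.get("yawning") or indicators.get("face_still"):
        value -= boredom_weight         # boredom indicators push downward
    if indicators.get("smiling") or indicators.get("brow_raised"):
        value *= neutral_decay          # neutralizers pull back toward 0
    return max(-1.0, min(1.0, value))   # clamp to [-1, 1]
```

Calling this once per detection frame lets repeated indicators accumulate, so a single frown barely registers while sustained frowning drives the value toward the stressed end of the scale.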
Trust your senses
Capturing those indicators was done with the RealSense SDK and the front-facing RealSense camera. RealSense is a technology by Intel that combines three different cameras in one: a 1080p HD camera, an infrared camera, and an infrared laser projector work together to make different applications possible. To name a few, RealSense is capable of sensing depth, measuring real-life sizes and tracking motion. Because of these diverse capabilities, RealSense has been used in a gaming context. There aren’t too many games using it, since the camera itself is not as common as normal webcams, but those that I tried are fun to play. In Warrior Wave, for example, the player uses their hand to guide soldiers to their destination. Another game that I haven’t played myself, but that came up a lot during my research, was Nevermind. Not only does the game use the RealSense camera (among other technologies) to enhance the experience, it uses it to change the game: it doesn’t adjust the difficulty based on boredom, but it does make the game harder when the player is stressed.
Let me use your feelings
The idea of dynamic difficulty adjustment (DDA) in games isn’t new. It is based on the observation that games that are either too easy or too hard are not as enjoyable as games with the right degree of challenge. Since every player has a different skill set, everyone experiences the difficulty of a game differently, and DDA can be used to make up for that. Games that use DDA have various techniques to measure the performance of the player and adjust the difficulty accordingly. But they do not take into consideration the fact that some players like more challenge, while others enjoy games more if they go a little easy on them. To find out what type of player is sitting in front of the game, you need insight into their thoughts and feelings. And that’s exactly where my thesis comes in. The idea was to make a game harder if the player seemed bored and easier if they seemed stressed. To test whether my concept was working, I created two games that were tested by informed and uninformed testers. One was Tetris, where I only tampered with the falling speed of the blocks. To mask the DDA, the game also got faster over time, but the emotion-based adjustments were more significant. For the second game I wanted something a little more complex and used the Shooter from one of the Unity tutorials. Instead of making anything faster or slower in a game that was created to be very fast and only ends when you lose, I decided to just add health packs when the player gets too stressed.
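The two adjustment rules could be sketched like this, assuming an emotion value in [-1, 1] where negative means bored and positive means stressed. The thresholds, the baseline speed ramp and both function names are my own illustrative choices, not the values used in the thesis:

```python
def tetris_fall_interval(base_interval, elapsed_s, emotion):
    """Seconds between block drops: faster over time, much faster when bored.

    The gentle time-based ramp masks the emotion-driven adjustment, which
    kicks in only when the player looks bored (emotion below -0.3).
    """
    interval = base_interval - 0.005 * elapsed_s   # baseline speed-up ramp
    if emotion < -0.3:                             # bored: speed up noticeably
        interval *= 0.7
    return max(interval, 0.1)                      # never drop below a floor


def should_spawn_health_pack(emotion, threshold=0.5):
    """Shooter rule: ease off by spawning a health pack when stressed."""
    return emotion > threshold
```

In a Unity project these decisions would run inside the game loop and feed into the existing spawn and timing logic; the sketch only shows the decision itself.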
There is more to say
Unfortunately I was only able to get six testers, and while they all played both games – some of them with DDA, some without – the results weren’t really conclusive. There was a slight tendency for the games with emotion detection to be more enjoyable, but it wasn’t enough to definitively say that expression detection is a huge improvement for games. What played into this was that my expression detection is far from perfect and the RealSense did not always detect all faces correctly. In the early phase of my thesis, my mum and I had a lot of fun because the RealSense kept detecting her mouth as her nose. While this caused a lot of laughter, it invalidated all indicators that relied on the mouth. There is also the problem that not every player is willing to be filmed while playing, no matter whether the gathered data is saved anywhere or not.
Regardless of whether my expression detection was good enough or whether the camera worked as intended, the idea of DDA based on player emotions is a really interesting one. And luckily no one has to depend on my solution for the detection of expressions. While I was planning my thesis, Affectiva, a leading company for emotion measurement, released a trial version of a Unity SDK that measures user emotion. Unfortunately I only found out about it when my own work was already quite advanced, so I didn’t try their plugin for my thesis. But from what I’ve read, it gets rid of most of the problems I experienced and gives game developers who are interested in using player emotions the chance to try out how emotion can best be used to adjust game difficulty.
I really enjoyed the months I spent on the thesis, and I might continue to look into DDA based on player emotions in the future. But in the next few weeks I’m going to concentrate on my new job at a small game studio in Cologne.
If you want to know anything else about my thesis, feel free to ask!