STARCRAFT II + SCIENCE: StarCraft's March Into Academia
Posted by CrushDog5 (Canada, joined March 2010). Last edited: 2011-09-12 19:18:05.





As the director of the Cognitive Science Lab (http://cslab.psyc.sfu.ca/) and a professor at Simon Fraser University (http://www.psyc.sfu.ca/people/), I lead the team of researchers behind the SkillCraft project, the first large-scale study of StarCraft 2 expertise. The research team currently has eight members from the Cognitive Science Lab, a mix of graduate students and undergrads from Computing, Psychology and Cognitive Science, and we have put well over 1,000 hours into the project. Our goal is to make SkillCraft the largest expertise study ever conducted. In this article I'll give an overview of the project, describe the methods we are using, and talk about what I see in the future for the scientific study of StarCraft 2.





Expertise



Scientists have been studying expertise for decades, and we now know a great deal. We know that expertise in a complex skill requires about 10,000 hours of deliberate practice. We know that deliberate practice, in which effort is made to improve, typically by doing tasks that target specific sub-skills (for example, playing scales on the piano), is far better than simply playing a lot. We know that the differences between the top experts in a field are correlated most strongly with hours of deliberate practice, and that there is very little empirical support for the idea that some experts are more 'talented' than others. Talent may have something to do with who gets involved in a particular skill or who progresses past the novice stage - after all, no one likes to do things they're bad at - but at the highest levels of achievement, practice is the most important factor by far. We also know that extensive practice at a skill changes the nature of your cognitive and perceptual processing in ways that improve your performance. Sometimes this generalizes to other tasks, such as improved visual processing in FPS gamers, and sometimes it doesn't, as in the case of a chess expert's improved memory for chess positions.



Annoying Limitations



There are some limitations to typical studies of expertise, however. For example, it's not uncommon to see a study with only 20 participants. When doing experimental work, more is definitely better, and small studies often need to be replicated before firm conclusions can be drawn. Another problem is that studies tend to have only a few levels of expertise, such as novice, intermediate, and expert. Together, these limitations make it difficult to see how the various components of a complex skill develop, and in particular, how they might interact with each other. These problems arise because it is expensive to gather data from experts in real-world tasks. Take, for example, tennis. Measuring the speed of a tennis serve is complicated, requires special equipment, and takes a lot of setup time. Measuring the speed of two hundred tennis serves from players across a wide range of skills is beyond the resources of most university research labs. And serve speed is only one of many interesting variables you'd want to look at. What about foot speed, reaction time, and any of a dozen other interesting components of tennis skill? That is not even mentioning the difficulty of finding enough participants at different skill levels; even in a major city there are only a handful of tennis pros.





Starcraft Saves the Day



As it turns out, the way to solve all these research problems is sitting on your computer with the extension ‘.SC2replay’. The replay file overcomes the data collection problem because it automatically, instantly, and effortlessly collects a wealth of data about your play. SC2 records every command you and your opponent make: every time you select a probe, every time you build a tank, and every time you fungal a mineral line. The replay file, then, is a timestamped list of every action you perform during the entire game. This list is fed back into the game engine to produce the replay that you actually watch when you review your games.
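To make this concrete, here is a toy sketch of what a parsed replay boils down to: a timestamped list of commands. The field names, frame numbers, and the parsing itself are invented for illustration; the real .SC2Replay file is a binary format that needs a dedicated decoder.

```python
from dataclasses import dataclass

# Hypothetical, simplified representation of a parsed replay event.
# Field names are illustrative, not the actual .SC2Replay format.
@dataclass
class GameEvent:
    frame: int   # SC2 runs at roughly 16 game frames per in-game second
    player: int  # player slot (1 or 2 in a 1v1)
    action: str  # e.g. "Select", "Train", "Build"
    target: str  # the unit, building, or ability the action refers to

# A toy excerpt: the opening seconds of a game from player 1's side.
events = [
    GameEvent(frame=0,   player=1, action="Select", target="Probe"),
    GameEvent(frame=8,   player=1, action="Train",  target="Probe"),
    GameEvent(frame=180, player=1, action="Build",  target="Pylon"),
]

def seconds(frame: int, fps: int = 16) -> float:
    """Convert a game frame to an in-game timestamp in seconds."""
    return frame / fps

for e in events:
    print(f"{seconds(e.frame):5.2f}s  P{e.player}  {e.action} {e.target}")
```

Every analysis described below starts from a list like this one: the game engine replays the same list to reconstruct the match, and we replay it to reconstruct the player.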



A replay contains tons of interesting information. It includes selections and deselections, attack commands, build commands, hotkey definitions and hotkey use. From this data, we can look at a great many things: multitasking, APM, map awareness, scouting, and more can all be tracked from the information in a replay file. One of the big changes from SC:BW replays is that they now include camera movements. Players can now see a first-person view, and we can now study how you move the main screen to gather information from around the map. For us, a lab focusing on information access and use, this is very interesting data. The speed at which a player can switch to a new view, perform some action, and switch views again constitutes what we would call a perception-action cycle, and reveals a great deal about the cognitive capacities of the player. We can use such information to learn how cognitive processing changes as a function of expertise, and what the limits of real-time information processing are in the context of a complex cognitive-motor task.
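As a rough illustration of the idea, assuming a simplified event stream (the event format and frame numbers below are invented for the example), a perception-action cycle can be measured as the gap between a camera move and the next command issued:

```python
FPS = 16  # approximate game frames per in-game second

# (frame, kind) pairs: "camera" = screen move, "command" = any game action.
# Toy data: three look-then-act cycles of varying speed.
timeline = [
    (100, "camera"), (104, "command"),
    (230, "camera"), (242, "command"),
    (400, "camera"), (406, "command"),
]

def pac_latencies(timeline):
    """Frames elapsed between each camera move and the next command."""
    latencies, last_camera = [], None
    for frame, kind in timeline:
        if kind == "camera":
            last_camera = frame
        elif kind == "command" and last_camera is not None:
            latencies.append(frame - last_camera)
            last_camera = None  # one latency per camera move
    return latencies

lat = pac_latencies(timeline)
print(f"mean perception-action latency: {sum(lat) / len(lat) / FPS:.3f}s")
```

How that mean latency shrinks as players climb the leagues is exactly the kind of question the replay data lets us ask at scale.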



Processing replay information is far from easy, however. Calculating a meaningful APM, for example, involves filtering out spamming, which is most common in the early game but can happen at any time. Making a bunch of zerglings can look like spamming to a computer program that is looking to strip out repeated actions, so a lot of testing has to happen before we can be sure we are capturing what we want to capture. There are also frustrating omissions. For instance, pathing is handled by the game engine, so we cannot tell for sure where units actually go, only where they are supposed to end up. We don't know for sure when units die, because dying isn't a command. Are you late on your chronoboost timing, or did your nexus get sniped? It is tricky to tell. Despite the complexity of the analysis, there are a large number of factors that are reliably captured, and we will be focusing on those at first.
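A minimal sketch of one possible spam-filtering heuristic (the threshold and event format here are assumptions for illustration, not our actual pipeline) shows both the idea and the hazard described above: a sustained mash of identical commands collapses to a single effective action, which is exactly why a legitimate round of zergling production needs careful handling.

```python
FPS = 16       # approximate game frames per in-game second
MIN_GAP = 4    # repeats closer together than ~0.25s are treated as spam

def effective_apm(events, game_frames):
    """APM after collapsing rapid-fire repeats of the same action.

    `events` is a list of (frame, action) tuples sorted by frame.
    A repeat is kept only if at least MIN_GAP frames have passed since
    the previous occurrence of that same action (kept or not), so a
    continuous mash counts as one effective action.
    """
    kept, last_seen = [], {}
    for frame, action in events:
        prev = last_seen.get(action)
        if prev is None or frame - prev >= MIN_GAP:
            kept.append((frame, action))
        last_seen[action] = frame
    minutes = game_frames / FPS / 60
    return len(kept) / minutes

# Ten "Train Zergling" clicks mashed within ~0.6s collapse to one
# effective action over a one-minute game (raw APM would be 10).
spammy = [(i, "Train Zergling") for i in range(10)]
print(effective_apm(spammy, game_frames=960))  # → 1.0
```

Note that this naive rule would also throttle a real larva round, where ten rapid "Train Zergling" commands are genuine play; that is precisely the kind of false positive our testing has to catch.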







A Groundbreaking Study



Because data collection is already done by the game, and because data analysis can be automated, it becomes feasible to collect and analyze expertise data on a scale never seen before in this field of research: not just from 20 people, or even 200 people, but from 2,000 or even 20,000 people. Having a data set this large addresses all the limitations one normally sees in expertise research. We can make very reliable estimates because we have so many data points. We can get a finer-grained picture of the development of expertise because we have replays from seven different leagues, as well as professional players. The combination of these two factors allows us to get a better understanding of the rate of development of various skills with respect to each other. This is not a trivial question. It's often assumed that all subcomponents of expertise get better at the same rate, for example that your ability to scout for expansions improves as your APM goes up. But it may turn out that scouting ability only improves once your APM reaches a certain threshold. It may also be that the biggest APM improvements happen somewhere in the middle of the skill continuum, or that large APM improvements occur only after players master hotkeys. Perhaps the internalization of specific timings (e.g., chronoboost) occurs only late in the skill curve. These examples present a picture of interdependent sub-skills. Our study has the potential to discover such interdependencies, something that a study with 3 levels of expertise and 20 people could never do.
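Here is a toy illustration, with entirely fabricated numbers, of the kind of threshold pattern a league-stratified data set could reveal: profile each league on two sub-skills and see whether they move in lockstep or not.

```python
from statistics import mean

# Fabricated records for illustration: (league, APM, scouting actions/min).
players = [
    ("Bronze", 40, 0.2), ("Bronze", 55, 0.3),
    ("Gold",   90, 0.4), ("Gold",  110, 0.5),
    ("Master", 180, 1.6), ("Master", 210, 1.9),
]

def league_profile(players):
    """Mean APM and scouting rate per league."""
    by_league = {}
    for league, apm, scout in players:
        by_league.setdefault(league, []).append((apm, scout))
    return {
        lg: (mean(a for a, _ in rows), mean(s for _, s in rows))
        for lg, rows in by_league.items()
    }

for lg, (apm, scout) in league_profile(players).items():
    print(f"{lg:7s} APM={apm:6.1f}  scouting/min={scout:.2f}")
```

In this made-up data, scouting barely moves between Bronze and Gold despite APM doubling, then jumps at Master: the threshold-style interdependency described above, which only shows up when you have enough skill levels to see the curve.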



The Real World



SC2 is interesting to study because it is complex, has a nice slow learning curve, involves memory, decision-making and perceptual processes, and involves interacting with information of varying relevance in a rich GUI. As a result, findings from our study can be useful for understanding real-world tasks in which these processes are essential. A good example is emergency management. The SC2 interface is surprisingly similar to software designed for command and control centers in emergency management, where the goal is to deploy emergency personnel (firefighters, paramedics and police) to crisis areas while protecting strategic locations (bridges, reservoirs, etc.). Understanding the cognitive processes underlying resource management in StarCraft 2 could help us understand how experts and novices make analogous decisions in emergencies, and may one day lead to insights into how disaster response can be improved. Studies like ours can also help system designers. Specialized software systems do not have large user bases, and seldom do users have thousands of hours of practice. Studying SC2 can give designers some sense of what expert use of such a system would look like. Finally, understanding which cognitive, motor, and perceptual skills change at which points over training can also be very beneficial for optimizing training regimes for a variety of real-world tasks.



Future Investigations



Since beginning our project we have been contacted by several other scientists, and we are now discussing collaborations with researchers at Temple University (on multitasking) and Brown University (on eye-tracking). I know of some replay analysis of harvested SC:BW games by a researcher at UC San Diego, and of work on SC2 players' enhanced cognitive abilities by colleagues at the University of Texas at Austin. This suggests to me that there is both tremendous interest and tremendous potential in SC2 research.



As for future projects from the Cognitive Science Lab, we are planning to pursue more replay-analysis studies: joint attention and coordination studies of team games, and longitudinal studies that track individual players' progress over many months (Bronze players, save your replays, we'll want them all!). We are also looking at doing work with eye-tracking. We have been in contact with MLG and may bring our eye-trackers to a tournament sometime next year. We also have an interest in investigating different training methods, for example, discovering the best way to train someone to look at the minimap.







How Science Can Help Starcraft



There are definitely players who don’t care about science at all. They aren’t interested in eSports, or the SC2 community, or in anything beyond just having fun; and, of course, that’s totally OK. But when I see 10,000 players tune in live for the Day[9] Daily; when I see that Husky’s casts get 70,000 views in one day; when a fellow gamer with a PhD in particle physics PMs me offering to help analyze data; when I hear Star Girl’s casts and watch my own kids’ enthusiasm for playing; and when I see people organizing BarCraft events all across North America that are so popular they are covered by the Wall Street Journal; well, it’s just so obvious that there is a huge, vibrant, intelligent community of dedicated gamers who care about gaming and its future.



But we need to realize that our vision clashes with most people’s understanding of gaming. If you walk around my neighborhood at 4pm you can hear a cacophony of poorly tuned pianos making their way, in fits and starts, through the Harry Potter theme. Parents pay $1000 a year per child for piano lessons, and will encourage, cajole, badger and berate their kids into practicing for an hour a day, or more. Why aren’t parents bugging their kids to memorize TvT builds, or practice their 4-gate, or watch replays from their mandatory daily laddering session, or write an essay about how they can improve their game? Diligently mastering StarCraft 2 develops fine motor skills and strategic thinking; it trains both planning and time-critical decision-making; it helps develop mental toughness; it encourages reflection and analysis; and it offers the myriad benefits of any serious pursuit. Yet these rational, caring parents diligently limit their kids’ “screen time” to an hour a week.



Beyond the scientific benefits, the marriage of Science and StarCraft can help change people’s misconceptions about gaming. The SkillCraft expertise project adopts, as its fundamental assumption, that StarCraft experts are TRUE experts who, because of their dedication, hard work and amazing skill, are worth studying. Whenever I publish a paper, present work at a conference, chat with colleagues, talk to a reporter, write a grant, make a presentation to industry, or recruit a student, I’ll have to convince someone that StarCraft is interesting and worth studying. I’ll have to describe the game, the community, and the professional scene. I’ll have to talk about the dexterity required, the strategy, the split-second decisions, and the resource management. In other words, I’ll be educating people about what playing StarCraft is really like. And I’ll be doing it, not as a gamer, but as a Cognitive Science professor at a respected university with a PhD and a nationally funded research program. I won’t be alone, either; as interest in researching SC2 blossoms, every scientist on every research project will be doing exactly the same thing.



If you support SC2, support SC2 + Science.



Participate

http://skillcraft.ca/



Other Info



Original TL thread

Reddit AMA

One hour interview with InfestedMrT for rCraftGaming on Twitch

(They do King of the Hill tournaments for all levels, you should check them out)
