A Serious Game for the CIA? MSU Alum Dr. Yu-Hao Lee Is Helping Connect the Dots


Dr. Yu-Hao Lee

While cognitive bias can affect everyone, it is especially dicey among intelligence agents. Dr. Yu-Hao Lee, a Michigan State University Media and Information Studies Ph.D. alum, is currently working with a team at the University of Oklahoma on a game to reduce cognitive bias among intelligence agents. For Yu-Hao, this is a match made in heaven, landing squarely in his field of research: the motivation and psychology of serious games. The two-part game, MACBETH I and MACBETH II, is being developed for the Intelligence Advanced Research Projects Activity (IARPA), the research arm of the intelligence community, which includes the CIA.

Lee explains that the project evolved after 9/11, when intelligence analysts came together to figure out what had happened and whether anyone had caught the warning signs. They concluded that intelligence analysts may have trouble removing implicit cognitive biases from their work. Even though they need to make quick decisions, unconscious mental shortcuts sometimes lead to errors. There are many well-documented forms of cognitive bias. For example, “confirmation bias” is the tendency to pay more attention to evidence that supports or confirms your expectations. “Anchoring bias” is the tendency to place too much weight on initial clues. To address the problem, IARPA contracted experts in the fields of psychology and games. And so MACBETH, or Mitigating Analyst Cognitive Bias by Eliminating Task Heuristics, was born.

Catchy name, huh?

MACBETH I has a feel similar to Clue, but revolves around terrorist attacks instead of a murder. The player’s goal is to collect evidence to identify the suspect, weapon of choice, and location. Each plot in MACBETH is based on real terrorist attacks. In the first iteration of the game, Lee’s role was testing: checking whether the game successfully accomplished its cognitive bias reduction goals. As a strategy game, intelligence analysts loved it. But when Lee and his team tested the game on 2,000 undergraduates, they found that many of the students thought MACBETH was really hard and faced a steep learning curve. This led to MACBETH II, which Lee helped design with an interdisciplinary group of faculty and students. The group split into six teams, one for each of the biases given by IARPA, each focused on understanding the cause of the bias, how to mitigate it, and how to build mitigation mechanics into the game. The teams then converged to discuss their findings.

Screenshots of MACBETH

In the second iteration of the game, players control an agent who still collects clues, but instead of requiring players to manage multiple screens, the game became more action-based and streamlined, using only one window and directing the field agent (the player) to different cargo ships, factory complexes, and so on. As the agent gives feedback on what information to gather, the player determines whether the clues contain bias. While the game seems much easier, Lee and his team have yet to test it for effectiveness; still, they have high hopes based on their study of MACBETH I. In that study, they compared the cognitive bias of intelligence analysts who watched a standard IARPA training video with a second group of incoming intelligence analysts who played the game. “Immediately after training,” Lee recounts, “both were equally as good at reducing bias. But eight weeks later, the effects of the game were still there, but not for the video.” For the intelligence community, this is a big deal: it shows that games are more effective than current training, and in effect, MACBETH is changing the way intelligence analysts make decisions and take action.

“For people who study games, it’s not surprising. Games are more motivating. When you watch a video, you’re not really thinking too much.”

Thinking back on the development process, Lee recalls the large amount of teamwork across disciplines. The group took cognitive bias theories and discussed how to work them into the game and make them “game-like instead of psychology-like.” After the game designers built a version of the game, the research team received it for testing. Applying his knowledge of serious game design and research, Lee acts as the link between these two teams, helping leverage each group’s strengths. “For me,” Lee says, “it’s a really great match for what I do, and theoretically applies what I learned at MSU.”

Lee hopes the game could also benefit the general population. While he wants to meet the agency’s demands and long-term goals, reaching a broader audience was part of the reason for testing on college students. Everyone has cognitive biases, so Lee believes it would be great if the game could teach others outside the intelligence community. Overall, Lee looks back on this project with fondness as he moves on to continue his career at the University of Florida as an assistant professor, taking his experience with games, emergent technology, and psychology to the Sunshine State.
