The 1993 video game, in which a player fights the menace of hell, has become a benchmark for AI research thanks to its simple 3D maps and its potential for different styles of play.
This competition is particularly interesting because these AIs play Doom the way an actual human would: by “looking” at the screen (actually, the screen buffer), evaluating what’s happening, making decisions and then sending inputs to the game as though they were using a mouse and keyboard, as Engadget points out.

There were two “tracks” for agents to compete in, offering very different challenges. In Track 1, the AIs were armed with rocket launchers on a map they had seen before and played twelve 10-minute matches. The goal was simple: kill each other. They could heal themselves with Medikits and gather more ammunition. In Track 2, each AI was dropped into a map it had never seen before. There were three different maps (four 10-minute matches on each) with weapons scattered across them, so the bots not only had to cope with unfamiliar terrain but also had to decide how the different weapons would affect the potential outcome of a match.
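That screen-buffer loop is exposed to programmers by the open-source ViZDoom research platform the competition is built on. The sketch below is a minimal illustration, not any team’s actual agent: it assumes the `vizdoom` Python package and its bundled `basic.cfg` example scenario, and uses a random action where a competitor would plug in a trained policy that reads the pixels.

```python
import random
from vizdoom import DoomGame

# Set up the game from one of the example scenario configs shipped with ViZDoom.
game = DoomGame()
game.load_config("basic.cfg")  # adjust this path to wherever the package's scenarios live
game.init()

# basic.cfg exposes three buttons: move left, move right, attack.
actions = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

for episode in range(5):
    game.new_episode()
    while not game.is_episode_finished():
        state = game.get_state()
        frame = state.screen_buffer        # the raw pixels the agent "looks" at
        action = random.choice(actions)    # a trained policy on `frame` would go here
        reward = game.make_action(action)  # send the chosen input back to the game
    print("Episode", episode, "reward:", game.get_total_reward())

game.close()
```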
Facebook, which competed under the team name “F1”, took home the (figurative) gold medal in the Track 1 challenge by winning 10 of its 12 matches. Intel, with its “IntelAct” bot, won the more difficult Track 2 challenge, also taking 10 of 12 games, but by a much bigger margin than Facebook’s win. Even though Facebook and Intel took the overall prizes, there were other impressive performances too: three standout bots, Arnold, Clyde and Tuho, came from students.