Off Topic

If I Was a Gamer, I’d Join the Army!

The US Army was looking for a new kind of recruit—someone with dexterity, skill, and patience. So it released a video game.

On July 4, 2002—less than a year after the September 11 attacks and the subsequent invasion of Afghanistan, and about nine months before President George W. Bush told the nation on live television that the United States was starting Operation Iraqi Freedom—the US army did a strange thing: it released a video game.

The army’s turn toward gaming came on the heels of a manpower crisis. Recruitment, which had become a necessity after the end of the draft in 1973, had reached a 30-year low in 1999. The armed forces were the victims of a successful economy: with decreasing unemployment, increasing private-sector wages, and rising rates of college attendance, potential high-quality recruits were looking elsewhere when considering how to spend their lives.

The effectiveness of traditional recruitment techniques also seemed to be declining. In the early 2000s, the recruitment playbook relied on in-person contact or exposure through traditional media: TV advertising, visits to schools, traveling military installations, student groups, direct mail, telephone calls, targeted visits by recruiters with leads, and websites with information about joining the military.

Between 1980 and 2000, the percentage of male American high school students who said they “definitely wouldn’t” enlist voluntarily rose from 40 percent to 60 percent, according to a 2003 recruitment study from the National Research Council. Compounding the issue, the percentage of people who said they did have an interest in enlisting remained stagnant, despite a 318 percent increase in advertising spending between 1993 and 2000. For the army to meet its demanding goals—80,000 new recruits in 2000—it couldn’t just target those who were already interested: it would have to change the fundamental ideas young Americans had about it. To do that, it needed something new.

Enter Lieutenant Colonel Casey Wardynski. In 1999, Wardynski, director of the Office of Economic and Manpower Analysis at West Point, realized that video games could provide a means of reaching out to prospective recruits. Games were a medium that could be less costly and more impactful than traditional forms of advertising, such as print and television ads. As Wardynski would later put it to the New York Times in 2002 with succinct military candor, “Gaming tends to be very interesting to young Americans”—exactly the population the army was desperate to connect with. And so, America’s Army was born.

I was 12 years old in 2002 and infatuated with the idea of the military, although I had no real connection to the army beyond my interests in history and video games. I liked playing first-person shooters, having graduated from console games such as GoldenEye and Perfect Dark to more sophisticated titles such as the dystopian Deus Ex. In that sense, America’s Army was a perfect fit for a preteen, and it had two even stronger things going for it: one, I’d heard it was actually good, attracting positive reviews and press; and two, it was free, available for download online and via CD at recruiting stations and through the mail. For a 12-year-old in the early 2000s, a free video game was basically a miracle.

I downloaded the game at my house with a friend, and we took turns playing. The play was slow, but serious; there was a sense of consequence to your actions, and a lack of immediate respawn—once you died, you had to wait until the next round to play again. I would continue to play America’s Army both with that friend and on my own, and even if my leftward turn in high school put to rest any lingering military aspirations, the game had served its purpose: it put the army’s messaging in front of a potential recruit.

America’s Army was far from the military’s first brush with video games. In fact, as Corey Mead outlines in his 2013 book War Play: Video Games and the Future of Armed Conflict, this relationship extends back to the birth of the medium. “For several decades, from the 1960s to the early 1990s,” Mead writes, “the armed forces took the lead in financing, sponsoring, and inventing the specific technology used in video games.” The purpose of this engagement stemmed from the military’s traditional role as one of the main institutions financing and fostering technological development. In video games, the Army saw a tool with potential for any number of different situations: battlefield simulation, readiness for nuclear war and larger tactical engagements, and training.


First came Spacewar!, developed by MIT grad students in 1962 using Pentagon funding, in which two pixelated spaceships dogfight around a gravity-weighted star. It became so popular, Mead writes, that “Stanford University’s Computer Studies Department had to initiate a ‘no Spacewar! during business hours’ policy.” In the early 1980s, the army reached out to Atari about adapting its game Battlezone to help train soldiers to use the Bradley Fighting Vehicle, a three-man infantry fighting vehicle developed in response to advances in Soviet technology; the result was Army Battlezone, which, though completed, was never actually put into practice.

Military modifications of popular games continued with Marine Doom in 1996, a variation on the hit first-person shooter Doom with an emphasis on teamwork. Like Army Battlezone, it didn’t become an official part of training, but it demonstrated three important assets that video games could provide to the military. First, there was the cost savings. Video games were relatively cheap: SIMNET, a combat simulator built by military contractors that could physically boom and shake and rumble, cost $140 million, whereas Marine Doom cost just $25,000. (America’s Army would cost $7.6 million to develop, which was still only one-third of 1 percent of the army’s marketing budget that year.)

Second, there was the proficiency. The army recognized that gaming demanded skills that were increasingly desirable in recruits. “Modern high-tech warfare was increasingly fought through electronic and digital interfaces resembling video games,” Mead writes, and “the belief that soldiers needed only the skills required to comprehend field and weapons manuals was superseded by the drive for digital expertise, for the highly advanced information-processing capabilities that video games supposedly promote.”

Finally, video games offered a new audience, giving the army a way to reach and influence prospective recruits before they had closed their minds to the idea of enlisting. Wardynski had noticed that his sons were infatuated with first-person-shooter video games, in which the player occupies the perspective of a character with a gun and the gameplay involves shooting other characters with guns, who are controlled by AI or by other players. (First-person perspective is a staple of 2003’s Call of Duty, which takes place on the battlefields of World War II.) As the army and Department of Defense had realized through their research, first-person-shooter games offered obvious parallels with modern military service: the constant threat of unseen insurgents, the need to assess lots of information at once, and the dexterity required to handle increasingly complex weapons.

As war itself began to resemble the interface of a video game in the 1990s, game designers were being influenced by another medium: film. As Edwin Evans-Thirlwell, who wrote a comprehensive history of the first-person shooter for PC Gamer magazine, explains, Steven Spielberg’s interest in games was piqued after working on his 1998 film Saving Private Ryan. And, of course, there was Tom Clancy, “who was the US military’s foremost pop culture advocate for decades, and whose books and their film adaptations turn on making a spectacle of military technology and procedure—a tactic Call of Duty, Battlefield, and Halo have bought into,” Evans-Thirlwell told me. “Both of these men cofounded extremely successful studios to work on simulations and shooters that have shaped the tastes of generations of designers.”

Medal of Honor, developed by the Spielberg-cofounded DreamWorks Interactive and released on October 31, 1999, put its player in the role of an OSS operative during World War II. It marked a drastic shift away from the cartoonish violence of Doom and Wolfenstein 3D, probably the most emblematic first-person shooters up to that point. The game’s realism and historical authenticity were met with controversy, however. Released six months after the Columbine school shooting, it became part of the national debate about video games, even as DreamWorks toned down the violence, removing any blood or gore. This was the environment in which America’s Army was developed, and Wardynski and his team took special care to ensure that it wouldn’t be met with the same kind of response.

The fundamental gameplay of America’s Army at its launch wasn’t terribly different from that of other popular first-person shooters at the time, especially ones that could be played online, such as 2000’s Counter-Strike. Players were organized into two teams, with objectives that included guarding or capturing parts of a map, or protecting VIPs. But Wardynski and his developers implemented a number of features that made America’s Army unique. First, before players could go online and compete with others, they had to finish a number of basic-training stages modeled closely on real-life basic training. Second, once online, players could never inhabit the roles of enemies; each team appeared to itself as the army and saw the other team as a kind of ambiguous, terrorist force. Third, unlike the dizzyingly fast-paced Counter-Strike and 1999’s graphic Unreal Tournament—whose underlying Unreal engine America’s Army was built on—the army built a certain degree of realism into its game: players operated in squads; they used weapons that the actual army uses, the functions of which—from accuracy to recoil to appearance—were closely modeled on their real-life equivalents; and, unlike in many games, players couldn’t just soak up bullets. A few shots and players were toast, at least until one of their buddies with medic training could revive them. As it would in reality, running or fast movement greatly decreased accuracy, which slowed the pace of the game: players had to line up their shots with care, seeking cover and taking their time. And, fourth, the game displayed messaging and information about the army, offering links to find out more online.


“What we do is present information that then lets [players] know what the next step would be: there’s videos that play when you’re loading a map, there’s text that’s displayed when you’re loading at the end of a round with general information about the army,” says Daniel Kolenich, the current executive producer of the Army Game Studio, which develops games, comics, and apps meant to illustrate life in the army. “The purpose of the game is to give people information about the army that’s coming from the official source.”

America’s Army was designed to appeal to boys 14 and older, with the idea that it might seed the notion of joining the army in the minds of its audience, before their teenage years drew their attention elsewhere. And the idea, at least as Wardynski conceived of it, wasn’t just to make the army seem appealing: it was to educate potential recruits about what a career in the army might actually look like. At the same time, it would target gamers, a population whose problem-solving skills and digital literacy were increasingly valued by the military.

Wardynski believed that better-informed recruits who had already had a glimpse into army life would be less likely to drop out after enlisting, cutting into the then-13.7 percent attrition rate reported by the New York Times. At a $15,000 cost to recruit each soldier, that was a lot of money wasted. “By letting young people ‘test-drive’ both basic training and actual battle,” Mead writes, “Wardynski saw America’s Army as a way to weed out those who might drop out later at vastly greater expense to the government.”

The game was meant to be a corrective to the run-and-gun style of the first-person-shooter genre, presenting a far more team-oriented, slower-paced version of soldiering than its counterparts did. “They didn’t want the game to be, ‘You go hide behind a rock and become healthy’—they wanted it to be as realistic as they could, showing how vulnerable you are physically,” says Robertson Allen, who wrote America’s Digital Army: Games at Work and War. But the game raises the question of what realism means in a virtual environment. For America’s Army, realism is a reflection of how the army sees itself, its function, and its tools.

“[The developers] have to keep it inside the realm of the army values. We’re not there to try to outdo some of the games that are out there,” says Russel Patishnock, marketing strategist at the office of the Assistant Secretary of the Army. “It’s never been the intent of America’s Army to be graphic … If we show a weapon, it’s an accurate look at that weapon, and if that weapon fires, it’s an accurate sound.”

In fact, the game was designed specifically not to be graphic. “They deliberately made it a Teen-rated game, which means that you can’t have dismemberment, gore—just a puff of blood, nothing else,” Allen says. “[They’re] showing what actually happens if you send the army to war: [they’re] sanctioned to perform violence on people, and that’s part of what the army does.”

Sixteen years after it was introduced, America’s Army is one of the cornerstones of the army’s public reputation, with 15 million registered players and over 278 million hours logged; a 2008 study by two MIT researchers found that “30 percent of all Americans age 16 to 24 had a more positive impression of the army because of the game and, even more amazingly, the game had more impact on recruits than all other forms of army advertising combined.”

Over the years, the game has gone through a few different iterations, each updating the graphics and engine while maintaining a focus on the squad-based, authentic details of the original. It’s also had to deal with its fair share of criticism. In 2008, the ACLU alleged that the game violated UN protocols regarding the military recruitment of children; the decision to bring the game into some middle and high schools was met with raised eyebrows; and even veterans criticized its use as part of a recruitment center at a Pennsylvania mall in 2009.

For Evans-Thirlwell, the success of America’s Army fits squarely into a greater narrative that has played out in gaming. “One troubling effect is the normalization and legitimization of certain questionable ideologies or value sets, like the innate heroism of soldiering, the desirableness of military weapons themselves, and the god-given right of any ‘developed world’ army to intervene wherever it damn well pleases,” he says. “In many late-’90s action games, guns were just goofy ways of interacting with abstract spaces and entities; now, they and the fidelity of their operation have become the whole point of the experience, as though the purpose of the game were to sell you on the gun.”

This notion has been reinforced by the success of the Call of Duty franchise, a consistent best seller in the first-person-shooter genre and one that spans a number of historical epochs, from World War II to hypothetical future combat; between its debut in 2003 and January 2016, it had sold over 250 million copies across its various releases—and that was before the more recent Call of Duty: Infinite Warfare and the wildly popular Call of Duty: WWII.

Journalist Simon Parkin has looked at the ways realism, when it comes to weaponry, can function as effective advertising for arms manufacturers; in a 2013 article for Eurogamer, he talked with gun manufacturers who found that gamers who encounter their products in video games are more likely to buy them in real life. An advertisement is only realistic insofar as that realism serves the goals of the advertiser—and, as the army has said over and over, America’s Army is an ad.

Mead writes of the many struggles Wardynski had with the game’s original developers over balancing faithfulness to army values and an overall educational tone with gameplay that would be as fun and engaging as possible. In 2007, Wardynski expanded the game into a touring concept called the Virtual Army Experience, a life-sized, physical version of America’s Army that traveled around the country and allowed civilians to play—but instead of using a game controller, they could handle simulated rifles and ride on a simulated Humvee. Today the VAE is an inflatable tent of nearly 20,000 square feet that can be brought to any NASCAR race, air show, or music festival, featuring replicas of M4 rifles, Black Hawk helicopters—and 75 computers running the latest iteration of America’s Army. In 2009, it attracted criticism from then-congressman Dennis Kucinich for giving “participants as young as 13 years old a naïve and unrealistic glimpse into the world of soldiering.”

In 2010, Wardynski retired from the army and was hired as the chief financial officer of Aurora Public Schools in Colorado. He was then appointed superintendent of Huntsville City Schools in Alabama, where he served for five years. In early June of this year, he was nominated by the Trump administration to become the assistant secretary of the army for Manpower and Reserve Affairs, a role that oversees recruiting, the readiness and quality of life of soldiers, and the overall condition of the army—a sign of how successful America’s Army has been. Since Wardynski’s departure from the military, it seems as if the game has come to be dictated more by gameplay than by the experiential elements he prioritized.

“The key difference in the newer one is a smaller, faster-paced environment to the game,” says Kolenich, the executive producer of the Army Game Studio. “America’s Army traditionally was a slower, more methodical, longer mission time than other titles such as Call of Duty and Unreal Tournament, but with Proving Grounds [the latest iteration of the game] we wanted to shorten those engagement times and focus on the smaller, faster missions, which seem a lot more fun.”

Beyond America’s Army, video games are coming to occupy a more important place in the military. Virtual reality is an increasingly common tool for treating PTSD, with those affected inhabiting virtual versions of Iraq and Afghanistan in order to process their trauma. And while America’s Army reflects a fundamental picture of the army’s real on-the-ground presence, other aspects of war are coming to look more like gaming, as exemplified by the “shock and awe” invasions and occupations of Afghanistan and Iraq. As the military’s need for drone pilots increases, gamers could provide a rich population from which to draw—a process that has already begun. And this time, the groundwork has been done for the recruiters: if war is starting to look like a video game, then nobody has to make a video game that looks like war.
