The dream of the cop is second sight.
If law enforcement could predict where and when a crime was going to occur, or who was going to commit it, they could get around the physical limits of reality: time is an arrow, and space too vast to saturate. With the knowledge of the perfect spot at the perfect time to catch the burglar as she begins to jimmy the window, or stop the shooter as he takes aim, order could be perpetually maintained, rather than laboriously restored after each new mess is made. Omniscience to the point of prescience might seem like a big ask, but American police are nothing if not ambitious—and over the last 15 years, police forces across the country have started gathering and analyzing more and more data on their jurisdictions, and on the people who live within them, in an effort to digitally open their third eye and see into the about-to-be. Predictive policing, and the data-gathering and surveillance that enable it, is now in effect in most major American cities.

Like many imagined sci-fi futures, this one was brought into reality under the warm sun of Los Angeles, a city that has served as the incubator and proving ground for predictive policing since 2005, when a paleoanthropologist at UCLA named Jeffrey Brantingham was looking for a change. He had built his career on studying the movement patterns of prehistoric societies, analyzing the deposits of their stone tools scattered around the Eurasian plains to infer how they hunted, and how their fortunes rose and fell as climate and resources shifted over time. But Brantingham had grown weary of hand axes and long-dead subjects, and began to look around for a richer vein of data.
He soon struck gold in two sets of statistics linked to law enforcement at home and abroad: the Iraq Body Count, which strove to record the time and location of civilian deaths over the course of the ongoing war, and crime statistics from Southern California police departments, which had been recording the place and time of reported crimes since the CompStat system of crime tracking first swept the nation in the '90s. The timing was ripe for funding high-tech security research. As IED deaths in Iraq continued their relentless climb and the Second Battle of Fallujah, the bloodiest of the entire war, demonstrated the strength of a growing insurgency, the Department of Defense and the National Science Foundation were opening up their coffers to researchers looking to analyze the rising body count abroad and criminal acts here at home.
Brantingham assembled an analytics team—Andrea Bertozzi and Lincoln Chayes, two mathematicians at UCLA, plus George Tita, a UC Irvine criminologist—and wrote up a proposal in 2005 for federal funding of a multi-year research project, centered on the idea that violent death and urban crime can be modeled as natural phenomena, tracked in space and time according to a knowable pattern that ripples through everything from American cities to war-torn countries. That was the theory, at least.
Those patterns are called “self-exciting point processes,” and strange though it sounds to say that a business so relentlessly human as armed robbery or mass murder could be mapped independent of motive or personality or ideology—and predicted based on location, not any of the parties involved—the idea was a logical extension of “hot spot” policing, which says that crimes cluster in the same place, day after day. Self-exciting point processes can take many forms, depending on the algorithm and the operating metaphor. In those early days, Brantingham and Bertozzi and their collaborators and grad students tried a little bit of everything. Did violence move like a fluid, splashing across the map? Did death more closely match life, in the way that bacteria multiplied in a petri dish? Or maybe car thieves thought like Neolithic hunter-gatherers, wandering the LA plains in search of injured Honda Civics, separated from the herd. “These are different lenses you can use to look at the world,” Brantingham says, “but all of these models are based on a common set of principles, like a statement of criminal physics.” The premise is that crime, broadly defined, occurs when a “motivated offender” encounters a “suitable target” in the absence of “effective security,” and that such conditions arise in the normal course of everyone’s daily routines. “You don’t need a special theory of motive, or criminals, to understand where and when crime occurs,” Brantingham says, “which makes it very amenable to mathematical modeling.” Yet, in this model, humans don't even enter the data—since these projections only take place, time, and type of crime into account, they can tell you when and where a crime might happen, but nothing about who might commit it.
When a postdoc named George Mohler, who had recently finished his Ph.D. in math at UC Santa Barbara, joined the research group around 2010, he brought with him a new metaphor based on his prior research, with an algorithm to match: maybe crime and violent death rumble through the landscape like earthquakes and their aftershocks, clustering and echoing off the first major temblor.
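In statistical terms, the aftershock metaphor is what's known as a Hawkes, or self-exciting, process: each grid cell has a constant background rate of crime, and every new event temporarily raises the expected rate of follow-on events nearby, decaying over time. A minimal illustrative sketch in Python (the parameter values, grid cells, and function names here are invented for illustration; PredPol's production model and tuning are proprietary):

```python
import math

def cell_intensity(event_times, now, mu=0.1, alpha=0.5, beta=0.3):
    """Hawkes ('self-exciting') intensity for one grid cell: a constant
    background rate mu plus an exponentially decaying boost from every
    past event -- each crime raises short-term risk, like an aftershock."""
    return mu + sum(alpha * beta * math.exp(-beta * (now - t))
                    for t in event_times if t < now)

def rank_cells(events_by_cell, now, top_k=3):
    """Score each grid cell and return the top_k 'boxes' to flag for
    patrol -- the shape of a PredPol-style daily prediction."""
    scores = {cell: cell_intensity(times, now)
              for cell, times in events_by_cell.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Toy data: burglary timestamps (in days) for three hypothetical cells.
events = {
    "cell_A": [1.0, 2.5, 9.0, 9.5],  # older events plus a recent pair
    "cell_B": [0.5, 3.0],            # old, sparse -- near background rate
    "cell_C": [8.8, 9.2, 9.8],       # fresh cluster -> highest intensity
}
print(rank_cells(events, now=10.0, top_k=2))  # ['cell_C', 'cell_A']
```

Ranking cells by intensity and flagging the top handful each shift is, in essence, how a day's 500-foot "boxes" get chosen.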
“You don’t need a special theory of motive, or criminals, to understand where and when crime occurs.” —Jeffrey Brantingham
By chance, the earthquake model worked. If you fed the algorithm a chunk of past burglary data, it could spit out a map highlighting the small sliver of the city where a quarter of the next day's burglaries would occur, split up into an array of squares measuring 500 feet to a side. The idea, based on decades of research on crime prevention, was that police could park themselves in those boxes for some percentage of their shifts and either catch the people who were predicted to be committing crimes in those zones or, by the mere presence of a police cruiser, deter any would-be criminals from taking action.

The team published their results in the Journal of the American Statistical Association in 2011, and soon after, Mohler got a call from the Santa Cruz PD: they wanted to try his crime-prediction model out in the field. “We were never thinking, ‘let’s create an algorithm and get the police to use it,’” Bertozzi says. The team had been up front with the ambitious LAPD lieutenant assigned as a liaison to the researchers, Sean Malinowski, that their goal was to do basic science research on the data, not serve as an incubator for a functional predictive policing program. That goal changed as soon as Santa Cruz reported a 27 percent drop in burglaries in the first month of using Mohler's predictions. The LAPD then piloted a similar program in its Foothill Division, which hugs the northeast edge of the San Fernando Valley—after four months of following the earthquake algorithm's boxes, the department claimed that property crime was down 13 percent, while the rest of the city had seen a half-percent uptick. Later that year, in 2011, Brantingham and Mohler turned their algorithm into a business called PredPol, which has since sold its predictive software to over 60 police departments across the U.S. and the UK. Today, Brantingham says that PredPol has started to push the boundaries of its initial focus.
The research team assumed that only property crimes like burglary and theft could really be successfully modeled, both because they're far more common than violent crime, yielding a richer data set, and because the idea of a thief opportunistically grazing on a fixed set of low-security houses and cars more neatly fits within the target-offender framework. Surprisingly, though, Brantingham says that they've found robberies—intimate crimes of intimidation or violence—to be even more regularly predictable than simple property crimes. “We never thought that would be the case,” he says, “but that's the great thing about science.”
Even though the Santa Cruz PD’s 2011 efforts made it the first force to put PredPol into effect, the LAPD had been primed to begin predictive policing for almost a decade, ever since Bill Bratton became LA’s top cop in 2002.
Bratton first gained national recognition in 1994, when he was brought in as New York City’s police commissioner, a reform hire in the wake of a series of corruption scandals. To combat internal corruption (and the department's PR issues), he almost immediately implemented CompStat, a system of meticulously measuring police success or failure based on changes in weekly precinct-level crime statistics. This early data-based policing model made Bratton a law-enforcement star, and CompStat has since become a standard component of American police management. On the back of that reputation, Bratton was brought out to LA as a reformer in the wake of the LAPD’s massive Rampart Division scandal, which had revealed pervasive corruption, brutality, and civil rights violations across the city, and which led the Department of Justice to enter into a consent decree with the police department. (A consent decree is a common tool of the DoJ, essentially a contract in which both parties agree to enact a certain series of changes—weed out racist practices, split up a monopoly, desegregate a district’s schools—under court supervision, without admitting guilt or liability.)
At the local level, one of the terms of the consent decree required the LAPD to computerize its force so that officers could collect and report data on all of their routine activities. This came at a time when, across the nation post-9/11, the federal government was relaxing privacy laws and spending freely to turn local law enforcement agencies into counterterrorism operations. And Bratton, with his predilection for quantification and his history of using data as a tool for internal turnaround, was ready to ride that wave. In 2008, toward the end of his tenure, Bratton cemented his data-centric legacy in LA when he reached out to an old acquaintance named Craig Uchida, who ran a Washington, D.C.-based consulting and research firm called Justice & Security Strategies, Inc., to see if he'd want to work with the LAPD to figure out how predictive policing might work. “I was intrigued, mainly because it tied into the use of big data, which was just starting to emerge in criminal justice,” Uchida says. “And it coincided with my interest in baseball sabermetrics—you know, Moneyball.” Again, the timing wasn’t a coincidence. The 2008 recession hit local law enforcement budgets hard, and common cop sense dictated that a sharp economic downturn would lead to a spike in crime (it never came to pass—crime has steadily decreased nationwide since 1990, and 2010 marked a 40-year low). Around the country, police responded by looking for ways to do more with less, and predictive policing promised to make staffing more efficient and to automate some of the intuition about patrol areas that officers accumulate over years. Working with Lt. Malinowski and Chief of Detectives Charlie Beck (who have since risen through the ranks to Deputy Chief and Chief of Police, respectively), Uchida surveyed the field of potential vendors for place-based predictive policing and served as an LAPD liaison to Brantingham's research at UCLA.
But at the same time, he began working on a pair of initiatives that would prove much more powerful, and more troubling, than the first wave of prediction.
Operation LASER (Los Angeles Strategic Extraction and Restoration), an LAPD program developed by Uchida and first implemented in 2012, doesn't aim to predict where and when a crime might happen, but who might commit it. Absent reasonable suspicion or probable cause, the goal is to rank residents of Los Angeles by their likelihood of involvement in a future violent crime, and to distribute that information to officers and detectives in the form of a “Chronic Offender Bulletin”—a most-wanted poster of people flagged for their potential to commit a hypothetical crime. By 2011, the LAPD was piloting it in the Newton division, south of Downtown LA. By mid-2018, it’s slated to expand to 16 of the department's 21 divisions. Only the surface details of Operation LASER’s core mechanisms have been revealed to the public, but one of the few LAPD documents outlining the program, co-authored by Uchida in late 2012, makes its central logic clear:
“The program is analogous to laser surgery, where a trained medical doctor uses modern technology to remove tumors or improve eyesight. First, the area is carefully diagnosed: Who are the offenders, and where and when are they involved in criminal activity? Plans are then developed to remove offenders from an area with minimal invasiveness and minimal harm to the people and areas around them. Extraction of offenders takes place in a “non-invasive” manner (no task forces or saturation patrol activities), and the result produces less disruption in neighborhoods.”
The same document lays out the math used to determine which residents become classed as cancers on their communities, but it’s difficult to know whether further criteria also factor in. If nothing else, parole or probation status, prior handgun arrests, convictions for violent crimes, and gang membership each count for five points, and each “quality police contact” in the past two years, measured by internal police documents called Field Interview cards, counts for an additional point. Even from that sketchy outline, LASER has two clear feedback loops built into its calculations. Gang membership is a notoriously nebulous category in the eyes of the LAPD, which has placed almost 9,000 people under gang injunctions since 2000, making it illegal for them to be out in public past a curfew, congregate in large groups, or wear certain types of clothes—blue shirts, for instance, or a Dodgers jersey. (7,300 people were removed from the list last year, an action prompted, in part, by an ACLU lawsuit arguing that the injunctions violated due process. Furthermore, the CalGang database that the injunction rolls were based on, and which is also used to justify harsher sentencing and deny access to social services, was found to include 42 “gang members” who had been added to the database before their first birthday.) But the Field Interview cards are where the circular logic picks up speed. Officers are expected to write one up every time they interact with a person, and just seeing a person walking down the street as you drive by can qualify as a reportable “interaction.” These cards can go into as much detail as the officer writing them wants—place and time and name of the person, sure, but also what they were wearing, who they mentioned, whether they seemed agitated, or suspicious, or up to no good.
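Taken at face value, the point scheme described in that 2012 document is trivial arithmetic, and the Field Interview feedback loop is visible in a few lines of code. A minimal sketch (the field names and the function are invented for illustration; any further criteria or weighting the LAPD applies have not been made public):

```python
# Point values per the 2012 LAPD outline: four five-point criteria,
# plus one point per "quality police contact" (Field Interview card)
# in the past two years. Field names here are illustrative only.
FIVE_POINT_FLAGS = ("parole_or_probation", "prior_handgun_arrest",
                    "violent_crime_conviction", "gang_membership")

def chronic_offender_score(record):
    """record: dict of booleans for the five-point flags, plus
    'police_contacts_2yr', a count of Field Interview cards."""
    score = sum(5 for flag in FIVE_POINT_FLAGS if record.get(flag))
    return score + record.get("police_contacts_2yr", 0)

# The feedback loop in miniature: identical histories, but one person
# has been "interacted with" a dozen times by patrol officers.
a = {"gang_membership": True, "police_contacts_2yr": 0}
b = {"gang_membership": True, "police_contacts_2yr": 12}
print(chronic_offender_score(a), chronic_offender_score(b))  # 5 17
```

Because officers decide when to file a card, the contact count is entirely police-generated: the more a person is stopped, the higher they score, and the higher they score, the more they are stopped.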
Uchida explains that the whole point of the program is to give officers the information they need to keep tabs on “chronic offenders.” “If you see him, stop him and see what he’s up to,” Uchida says, “because in our view these guys are the most active and engaged in criminal behavior.”
Both PredPol and LASER, the place- and person-based models of predictive policing, have drawn scrutiny and criticism from a wide range of sources. Critiques of the programs themselves focus on issues of privacy, transparency, and bias—how the underlying data is collected, how it is analyzed, and whether the outcome is just a tech-enhanced version of the same policing practices that are so often riddled with human error and racist thinking. Andrew Ferguson, whose 2017 book The Rise of Big Data Policing is the most comprehensive survey of data-based predictive policing practices to date, parses the programs into distinct categories. He characterizes PredPol and place-based programs like it—since Brantingham’s initial success, companies like HunchLab, IBM, and CivicScape have entered the market for place-based crime prediction, all with different algorithms and levels of complexity—as relatively unproblematic, since they rely on police report data, which mostly reflects calls for service initiated by victims or witnesses, not the observations of police themselves. Person-based policing, the next category in Ferguson’s taxonomy, is a much stickier ball of wax. “Any time you have arrests as part of the input for your predictive system,” Ferguson says in a phone interview, “you’re really only predicting what police are doing.” Systems that rank people’s level of criminality based on information like arrests, traffic stops, or even just police observation, all of which can be initiated and recorded by police officers at will, create an endless feedback loop, in which a person is deemed suspicious because the cops find them suspicious, which then proves that the person is worthy of further suspicion. Whether you describe the outcome as persistent policing based on professional intuition and expertise, or as biased harassment reinforced by mindless math, these ostensibly predictive systems seem to paper a futuristic facade over age-old police tactics.
Some critics of the trend towards predictive policing, however, reject the idea that any police model of future crime, whether place- or person-based, can escape the weight of history. “Criminal behavior that takes place in poor communities in cities is more likely to be observed,” explains Sam Adler-Bell, and thus more likely to be reported and fed into PredPol, regardless of the real distribution of crime across a city. Adler-Bell, a policy associate at The Century Foundation, connects this to what he calls the “nature of the architecture of capitalism.”
“This is just another opportunity for them to use technology to mask racism, while making themselves look like they've reformed.” —Jamie Garcia
“People who are poor live their lives in public, or in spaces where they’re near other people all the time,” he says. “The difference between living in a house with a yard and a fence and living in an apartment building is the difference between the ‘criminal behavior’ you engage in being visible to others and it not being reported at all.” Locally, members of the Stop LAPD Spying Coalition, a group of citizen activists, have been trying to push back against and raise awareness of the development of these programs since 2011. “Every ten years or so, law enforcement comes up with a new way to reinvent themselves,” says Jamie Garcia, an oncology nurse who has become the Coalition’s in-house expert on predictive policing practices. “PredPol claims to not be targeting people, but police have always targeted certain communities—this is just another opportunity for them to use technology to mask racism, while making themselves look like they’ve reformed.” Adler-Bell echoes Garcia's sentiment. “Algorithmic policing is a very literal manifestation of what we mean by structural inequality—what looks like a neutral evaluation of facts on the ground is really the accumulated effect of a history of segregation and discriminatory police scrutiny and generational poverty.” In August 2017, a two-and-a-half-year study of the culture of predictive policing in the LAPD, conducted by Sarah Brayne, a sociology professor at UT Austin, found that traffic citations also enter into the LASER calculations, giving police another discretionary way to up a person’s risk rating at will. In anonymous interviews, police told Brayne bluntly that the new system was a fancy way of shoring up the systems of subjective profiling that police had been using for ages, without having to state their biases outright. One anonymous officer explained to Brayne that the point-based system lets law enforcement officials continue to profile Angelenos while technically complying with federal law.
“They say you shouldn’t create a—you can’t target individuals especially for any race,” the officer said. “We didn’t want to make it look like we’re creating a gang depository of just gang affiliates or gang associates…We were just trying to cover and make sure everything is right on the front end.” In other words, even to the people using these new tools, all the techno-talk of LASER’s Chronic Offender Bulletins seems like a fancy way of updating the gang databases, which are, in turn, a fancy way of updating the older practice of assuming anyone who isn’t white is a criminal. But it’s proven difficult to confirm to what degree the LASER bulletins are biased. In May of 2017, the Coalition submitted a Public Records Act request to the LAPD, asking the department to reveal details like the race, gender, location, and qualifying criteria for the people the LAPD deem “Chronic Offenders” as part of Operation LASER, but the department won't release the information (the ACLU is currently suing the department over its systematic failure to comply with the California transparency law). Despite weeks of attempted contact, the one designated LAPD spokesperson on the department's use of predictive policing, Captain Jeff Nolte of the Rampart Division, could not find the time to respond to my questions on the subject before this story’s deadline.
As Uchida was developing Operation LASER in 2009, he also began to introduce the data-management system created by Palantir into the LAPD. Palantir, founded in 2004 by the Trump-supporting tech billionaire Peter Thiel and named after the magical seeing stones of The Lord of the Rings, was initially funded by In-Q-Tel, the CIA’s venture capital arm, and found its first clients in the national security and surveillance industry. Its power lies in its ability to integrate a vast array of data sets—from gang databases and conviction records to massive datasets that cover most members of the public, like health histories, DMV records, and every kind of financial transaction—into one searchable database, blurring jurisdictional lines in the process. With Palantir’s software running on an in-car laptop, an officer can bring up the name of an Angeleno placed on an Operation LASER Chronic Offender Bulletin and immediately see a startlingly complete picture of his data-based life. According to Brayne’s study of the LAPD, a Palantir query gives cops not just typical police data like convictions, arrests, and Field Interview card entries, but also friends, lovers, and credit scores, hospital records and parking lot receipts, contact lens rebates and the last time the Chronic Offender ordered a pizza from Papa John’s. Thanks to the LAPD’s citywide network of automated license plate readers, the same search can also call up a map of the Chronic Offender’s movements around town.
Eye of Sauron, indeed. Palantir’s integration into the life of the LAPD came with a bundle of funding from the National Institute of Justice in 2009, and allowed the department's data analysis and intelligence-gathering efforts to connect with a growing constellation of federal and private-sector surveillance databases stored at a facility in the LA suburb of Norwalk called a “fusion center”—one of a national network of data clearinghouses established by the Department of Homeland Security. (A 2012 Senate report described these clearinghouses as financially wasteful and useless in the pursuit of greater homeland security, noting that much of the data is “shoddy” and gathered by “potentially illegal” means.) Brayne found the widespread adoption of these powerful tools to search ever-expanding databases more disturbing than either place- or person-based predictive policing, which ultimately just amplify and computerize longstanding police practices. With Palantir, a cop today can learn more about any of us with a single click than a cop 10 years ago might have after months of tracking and surveillance—and since the cop of 2008 would have needed a warrant to conduct so thorough a search, Brayne notes that this fundamentally warps the application of Fourth Amendment protections. Uchida insists that the program, which he is still paid to administer with the LAPD, is on the up-and-up, citing the fact that it was cleared by the City Attorney before being put into practice in 2011 (though it’s worth noting that the City Attorney also approved the widespread gang injunctions, before rolling them back years later over issues of accuracy and constitutionality). “All of these things about surveillance and privacy issues don’t get at the core of what Palantir helps us do, which is really reduce crime,” Uchida says.
“The suspicions about surveillance or anything else really have never come into play with the LAPD—I think they're so aware of constitutional rights and civil rights and privacy issues that using the data for illegal or immoral means is not something that they would even tolerate.”
Leaving aside the integrity of the LAPD’s moral compass, what little third-party research has been carried out on predictive policing and ever-widening warrantless surveillance has found very little effect on crime reduction. Despite the initial enthusiasm around Santa Cruz’s 27 percent drop, or the LAPD’s reported 22 percent drop in homicides in the Newton Division after implementing Operation LASER, RAND Corporation reports have found that similar place-based programs in Louisiana and person-based programs in Chicago had vanishingly small impacts on crime rates. A 2015 study conducted by much of the original PredPol gang, including Sean Malinowski of the LAPD, found that for every 1,000 minutes (almost 17 hours) patrol officers spent each week sitting in the geographic zones the algorithm flagged as at risk for future crime, one fewer crime occurred. Over the course of an average LA cop’s week, and given the crime load of an average LAPD division, that amounts to a 7.4 percent reduction in crime. There’s a chance that LA is simply doing it better, but in the absence of further study and further transparency on the part of the LAPD, it's difficult to know for sure. And any estimate of the programs’ effectiveness is moot if beat cops don’t even have the time to sit in their boxes, or file Field Interview cards, or noodle around in Palantir’s databases. According to the LAPD’s own goals, patrol officers are supposed to spend 60 percent of their time responding to calls for service, respond to those calls within seven minutes, and use the rest of their time for “proactive policing,” whether that means PredPol box-watching, tracking Chronic Offenders from the LASER list, or just walking around and saying hi to people.
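The 2015 study's headline numbers can be unpacked with back-of-the-envelope arithmetic; the weekly crime baseline below is a hypothetical figure, chosen only to show how one fewer crime per 1,000 patrol minutes translates into a single-digit percentage reduction:

```python
# From the 2015 study: ~1,000 minutes per week of patrol time inside
# the predicted boxes corresponded to roughly one fewer crime that week.
MINUTES_PER_CRIME_PREVENTED = 1000

weekly_box_minutes = 1000        # a division hitting that weekly dose
baseline_crimes_per_week = 13.5  # hypothetical division-wide average

prevented = weekly_box_minutes / MINUTES_PER_CRIME_PREVENTED
reduction = prevented / baseline_crimes_per_week
print(f"{prevented:.0f} crime(s) prevented, {reduction:.1%} reduction")
```

The percentage, in other words, depends entirely on how much crime a division sees to begin with, and on officers actually logging those minutes in the boxes.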
“The nightmare vision of a totalitarian surveillance society already exists—in poor communities, and for Muslims, and immigrants, and people of color in general.” —Adler-Bell
Robert Harris, an 18-year veteran of the LAPD currently working as a director at the Los Angeles Police Protective League, the local cop union, says that current staffing levels for patrol officers are so low that the time available for proactive policing is approaching zero percent. “If you want these programs to be effective, and not just another acronym, you’ve got to staff them to give them a chance to succeed,” Harris says. He notes that the department recently orchestrated a wave of managed attrition, to shift personnel from special units back out onto patrol in an attempt to at least keep response times down, but that officers were still mostly “chasing the radio” all day. “On top of that shell game, PredPol and data are not going to fix the issues in our communities,” Harris says. “It’s just striving to check a box, and it doesn't do anything to improve the public perception of what we do.” Researchers and community activists note that the police presence in the Black and Latino neighborhoods that have always been surveilled and hotspotted by the LAPD hasn’t palpably changed, despite all the laserbeam rhetoric. The LAPD has led the nation in killings for four years running, fatally shooting residents at more than three times the per-capita rate of the NYPD, and edging out Chicago, which has a higher murder rate than New York and LA combined. A UCLA professor named Kelly Lytle-Hernández has shown that, from 2010 to 2015, the LAPD and LA County Sheriff’s Department collectively spent more than $500,000,000 jailing residents of South Central LA, an area that encompasses many of the pilot divisions for both PredPol and Operation LASER, most commonly for drug or DUI charges. “It would be naive of us to speak about predictive policing in isolation,” says Hamid Khan, the director of the Stop LAPD Spying Coalition. 
“It’s had a long trajectory, but even going back to broken windows policing, the goal has been to criminalize entire communities.” Joyti Chand, a resident of Baldwin Village, the poor, mostly Black neighborhood (made famous in the 2001 Denzel Washington movie Training Day) also known as “The Jungle,” says that the motivating force of the LAPD seems to be aiding gentrification, rather than following the data. “In my community, overpolicing is creating an unsafe environment,” Chand says. “Black residents are being pushed out. It’s attrition through enforcement.”
Despite the current trends, Andrew Ferguson holds out hope that systems as powerful as Palantir’s could be turned to good. If the agencies using this data are democratically accountable, and the methods of analysis are transparent, he believes that “bright data” could overtake “black data” to transform the world. “Predictive analytics are just about identifying risk,” Ferguson says, “and the solution to dealing with that risk doesn’t have to involve policing.” Brantingham, too, points to the example of Kent, the county in southeast England that’s become a PredPol client: “Once a quarter, they hand out their prediction maps to all city workers—waste management folks, police officers, firefighters, elderly services—and everyone gets out in the box and engages the problems they see there.” “Police action always needs to be accountable,” Brantingham adds. “The predictive policing box doesn’t change that accountability structure at all.” In late December 2017, New York City created a task force with the power to investigate the algorithms that city agencies use, in an attempt to fight hard-coded bias, and other cities, like Seattle, Washington, and Somerville, Massachusetts, are creating stronger systems of citizen police oversight (in many cities, LA included, the civilian police commission can only make recommendations to the department). But the federal funding for data-based policing, and the police-department market it creates for technology companies like PredPol and Palantir, most of which sell their products exclusively to law enforcement agencies, incentivize police to keep the big data to themselves. “Even though progressive police chiefs understand you can’t police your way out of problems,” Ferguson says, “they would have to turn down the money, and send it to city services instead.” “We often talk about these theoretical futures in which we would live under some kind of Big Brother-type surveillance regime,” Adler-Bell says.
“But the nightmare vision of a totalitarian surveillance society already exists—in poor communities, and for Muslims, and immigrants, and people of color in general—we’re all subject to it, but not equally so.” The sci-fi novelist William Gibson famously wrote that “the future is already here—it’s just not evenly distributed yet.” The quote is often read optimistically—someone out there has a jetpack, or a Bitcoin fortune, and soon we’ll all be crypto jetpack millionaires. But the same holds true for a darker tomorrow, in which the language of high-tech objectivity obscures the reality of a pervasive, constant, and almost entirely secret network of surveillance that can track your every move. At least in “The Minority Report,” the famous Philip K. Dick story about crime prediction, the “precogs” actually had psychic powers. In 2018, we’ve just got a nation of cops clicking around on their laptops whenever they’ve got a minute to kill.