Kyle Schurman
Oct 10, 2011

Can technology predict who has criminal intent?

Revelations that a federal government agency is testing a Minority Report-style “pre-crime” detection system have sparked an emotional debate, pitting those who believe in high-tech solutions against human-rights activists.


Pity the poor people who fit into both categories.


The Electronic Privacy Information Center (EPIC) has obtained documents showing that the U.S. Department of Homeland Security (DHS) has been testing a high-tech system that uses behavioral scanning techniques and, DHS claims, can predict which people intend to commit crimes in the near future.


The pre-crime prediction system, called FAST (Future Attribute Screening Technology), has drawn comparisons to the “precogs” from the 2002 movie Minority Report, starring Tom Cruise. The precogs were mutated humans who could “see” crimes before they happened, allowing law enforcement authorities to arrest those who would have committed the crimes. The upside was that criminals would be arrested before they acted, saving lives.


The potential downside was that, if the precogs made a mistake, an innocent person would be arrested. The worry that a system like FAST could make similar mistakes is what has human-rights activists concerned about the steps DHS has taken.


The idea of a system like FAST actually working is undoubtedly interesting, especially from a high-tech perspective. FAST involves no mutated humans; instead, it feeds measurements taken from a subject into a computing algorithm that estimates how likely that person is to commit a crime in the near future.


The data the system uses is similar to what’s collected during a lie detector test, including heart rate, body movements and breathing patterns. However, FAST collects far more data, and more complex data, than a standard lie detector test.


For example, the system measures eye movements, pheromones, changes in body heat and fluctuations in the voice. In addition, a person’s ethnicity, job status, medication use, age, health condition and gender would be collected. The algorithm then combines all of these measurements to estimate the likelihood that the subject may be preparing to commit a crime.
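
DHS hasn’t revealed how FAST actually weighs these inputs, but a minimal sketch can show the general shape of such a scoring step. Here is one hypothetical way remote-sensor readings could be fused into a single risk score using a simple logistic model; every feature name, unit and weight below is invented for illustration, not taken from FAST’s documentation:

```python
import math

# Hypothetical sensor readings for one screening subject. FAST's real
# features, units and weights are not public; everything here is invented.
readings = {
    "heart_rate_bpm": 96,
    "breathing_rate_bpm": 22,
    "gaze_shifts_per_min": 40,
    "skin_temp_delta_c": 0.8,     # change in body heat during questioning
    "voice_pitch_variance": 1.4,  # fluctuation in the voice
}

# Invented weights standing in for a trained model's coefficients.
weights = {
    "heart_rate_bpm": 0.02,
    "breathing_rate_bpm": 0.05,
    "gaze_shifts_per_min": 0.03,
    "skin_temp_delta_c": 0.9,
    "voice_pitch_variance": 0.6,
}
BIAS = -6.0  # keeps the score low for typical readings

def risk_score(readings):
    """Fuse the weighted sensor features into a single 0-to-1 score."""
    z = BIAS + sum(weights[name] * value for name, value in readings.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing

print(f"risk score: {risk_score(readings):.2f}")
```

A real system would presumably learn its weights from the volunteer trials rather than hard-coding them, but the basic pattern of many noisy signals collapsing into one number is the part that matters for the debate.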


The system takes these measurements without making physical contact with the subject, using a variety of high-def video cameras. Law enforcement would ask each subject a few questions during a basic screening process, while the FAST system collects the data and makes its calculations.


Although the exact technical details of the FAST system haven’t been revealed, it appears many of the basic information-gathering processes are similar to those used in lie-detector tests. With that information run through a computing algorithm, authorities could then single out people who may require additional questioning.
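
That “additional questioning” step ultimately reduces to a threshold decision on the score. Continuing the hypothetical sketch above (the cutoff value is likewise invented, not something disclosed by DHS):

```python
# Hypothetical screening decision; the cutoff is invented, not from FAST.
SECONDARY_SCREENING_THRESHOLD = 0.4

def flag_for_questioning(score):
    """Refer a subject to secondary screening when the score crosses the cutoff."""
    return score >= SECONDARY_SCREENING_THRESHOLD

# Scores as a fused-sensor model like the sketch above might produce them.
for score in (0.12, 0.44, 0.71):
    verdict = "refer for questioning" if flag_for_questioning(score) else "cleared"
    print(f"score {score:.2f}: {verdict}")
```

Where that cutoff sits is exactly the policy question: set it low and many innocent people get pulled aside; set it high and real threats slip through.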


Such a system sounds far from perfect, and an extensive series of tests would be required before anyone could take such a technology seriously, no matter how much it resembles something from a great sci-fi movie.
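
Some quick, entirely hypothetical arithmetic shows why “far from perfect” matters so much for a screening system. When genuine threats are rare, even an impressively accurate detector buries its true hits under false alarms (none of these numbers come from DHS or the FAST documentation):

```python
# Back-of-envelope base-rate arithmetic with invented numbers; nothing
# here comes from DHS or the FAST documentation.
travelers = 1_000_000        # people screened
actual_offenders = 10        # genuine threats among them (rare by assumption)
sensitivity = 0.99           # fraction of real threats the system flags
false_positive_rate = 0.01   # fraction of innocent people wrongly flagged

true_flags = actual_offenders * sensitivity
false_flags = (travelers - actual_offenders) * false_positive_rate

print(f"true hits:    {true_flags:,.0f}")   # ~10
print(f"false alarms: {false_flags:,.0f}")  # ~10,000
innocent_share = false_flags / (true_flags + false_flags)
print(f"share of flagged people who are innocent: {innocent_share:.1%}")
```

Under those assumptions, roughly 99.9 percent of the people the system flags would be innocent, which is the mathematical version of the human-rights complaint.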


FAST’s development began in 2007, and DHS recently began running tests of the system. Participants in the testing were volunteers, but those worried about the loss of rights (where does the presumption of innocence fit into the FAST system?) are more concerned about the ramifications of even testing such a system. After all, running tests is the first step toward implementation.


The human-rights concerns are certainly legitimate. How can someone be convicted of a crime that he is only “thinking” of committing? Can anyone actually be predisposed to committing crimes? If someone is preparing to commit a crime, couldn’t she simply choose not to go through with it at the last minute? The technology behind FAST doesn’t really answer those questions about rights.


Ultimately, DHS could use a system like FAST at airports or sporting events, trying to pinpoint people who might be on the verge of committing a crime. At this point, though, DHS says FAST is only in the research phase and that it has no plans to deploy the technology. That might satisfy critics for now, but the only way to fully alleviate their concerns is to guarantee that the high-tech, sci-fi aspects of FAST are infallible, which will be nearly impossible.


In Minority Report, the system that was thought to be perfect was shown to have flaws, as the visions of the three precogs didn’t always agree on the future. When one precog disagreed with the other two, its vision was called the “minority report” and was discarded. The flaw was that the minority report might have been the correct vision of the future, meaning law enforcement could arrest people who ultimately would not have committed the crime.


FAST’s documentation doesn’t list a “minority report” among its specifications, but if sci-fi movies are to be believed, high-tech solutions to society’s problems nearly always carry unintended, even sinister, consequences. If any of FAST’s developers bear a resemblance to Tom Cruise, conspiracy theorists will be ready and waiting.