Israeli Company Claims Its Software Can Look At Your Face And Determine If You're A Terrorist Or Murderer
from the no-it-can't dept
There is a regular experience I have that I assume is common for anyone who works in the technology industry: I will often hear non-technical people make claims about a specific kind of technology that are wildly overstated. To clarify, I am technically proficient in only the barest sense, meaning I understand the underlying processes well enough to explain how things work, but not to implement them. To those without even that barest understanding, I can see how technology can simply seem like magic. That opens the door for others who know better to take advantage.
Enter into the conversation Israeli startup Faception, which claims its facial recognition software can look at your features and determine whether you're a terrorist, pedophile, or criminal.
An Israeli start-up says it can take one look at a person's face and realize character traits that are undetectable to the human eye. Faception said it's already signed a contract with a homeland security agency to help identify terrorists. The company said its technology also can be used to identify everything from great poker players to extroverts, pedophiles, geniuses and white-collar criminals.
"We understand the human much better than other humans understand each other," said Faception chief executive Shai Gilboa. "Our personality is determined by our DNA and reflected in our face. It's a kind of signal."
The practice of trying to figure out human tendencies and traits from facial features isn't new. It's been going on for centuries under the term "physiognomy", and its history is long and dubious. The overwhelming consensus is that it's nonsense, with anecdotal evidence of its successes being more the result of confirmation bias and self-fulfilling prophecy than anything else. One might remember the mean-faced man who later robbed him at gunpoint, but not think as often about the kind-faced man who bilked him out of his life savings with a confidence scheme. Alternatively, one might decide someone has an untrustworthy face, treat that person badly, and take it as confirmation of the original theory when that person reacts negatively to such treatment.
What Faception claims to do is put the job in the hands of software, as though that solves the problem. Except software is constructed by humans, with all of their biases, and in this case is explicitly built to make the same kinds of judgments humans make, which turns the whole thing into a self-defeating enterprise. Even Gilboa concedes that his claims of super-accuracy shouldn't be relied on.
The danger lies in the computer system's imperfections. Because of that, Gilboa envisions governments considering his findings along with other sources to better identify terrorists. Even so, the use of the data is troubling to some.
"The evidence that there is accuracy in these judgments is extremely weak," said Alexander Todorov, a Princeton psychology professor whose research includes facial perception. "Just when we thought that physiognomy ended 100 years ago. Oh, well."
And the examples of how this type of software learns to identify traits often leave much to be desired as well.
There are challenges in trying to use artificial intelligence systems to draw conclusions such as this. A computer that is trained to analyze images will only be as good as the examples it is trained on. If the computer is exposed to a narrow or outdated sample of data, its conclusions will be skewed. Additionally, there's the risk the system will make an accurate prediction, but not necessarily for the right reasons.
Domingos, the University of Washington professor, shared the example of a colleague who trained a computer system to tell the difference between dogs and wolves. Tests proved the system was almost 100 percent accurate. But it turned out the computer was successful because it learned to look for snow in the background of the photos. All of the wolf photos were taken in the snow, whereas the dog pictures weren't.
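To make that failure mode concrete, here's a minimal, hypothetical sketch, using purely synthetic data rather than anything from the system Domingos described (and certainly nothing from Faception): a simple classifier trained on "images" in which every wolf happens to be photographed against snow will ace its own tests while learning nothing about wolves at all.

```python
# Minimal sketch of a spurious-correlation failure (synthetic data, illustrative only).
# Assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
pixels = 64  # each tiny "image" is just 64 pixel values

def make_image(is_wolf: bool, snowy: bool) -> np.ndarray:
    img = rng.normal(0.3, 0.05, pixels)           # generic dark background
    if snowy:
        img[:48] = rng.normal(0.9, 0.05, 48)      # bright "snow" pixels
    # the animal itself occupies the last 16 pixels and barely differs by class
    img[48:] = rng.normal(0.55 if is_wolf else 0.50, 0.05, 16)
    return img

# Training set mirrors the anecdote: every wolf photo is snowy, no dog photo is.
X = np.array([make_image(is_wolf=w, snowy=w) for w in [True] * 100 + [False] * 100])
y = np.array([1] * 100 + [0] * 100)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))      # ~1.0, looks magical

# A dog photographed in snow gets confidently called a wolf:
# the model learned "snow", not "wolf".
dog_in_snow = make_image(is_wolf=False, snowy=True).reshape(1, -1)
print("dog in snow classified as:", "wolf" if clf.predict(dog_in_snow)[0] else "dog")
```

Near-perfect accuracy on its own data, a confident "wolf" verdict for a dog standing in snow, and no understanding of either animal: that's the shape of the problem.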
In that latest example, you see how a particular technology can appear almost magical in its accuracy until you get under the hood and see what actually happened. There's no magic here, just software clever enough to separate wolves from dogs in a way that didn't look at the wolves or the dogs at all. We can't tell whether Faception's technology works the same way because, of course, Gilboa's company isn't sharing what's under the hood. Given the history of this practice, however, one feels safe rebutting the company's claim that it can tell you're a terrorist by looking at your face with a simple: "No, you can't."
And that's not even digging into the ethics of using this "technology." Faception claims it has already signed an agreement with an unidentified nation's homeland security organization. Were governments to start taking law enforcement action based on questionable technology that hasn't been vetted for accuracy, well, then we'd have delved into the kind of PreCrime dystopia typically reserved for bad fiction.
Software isn't magic. It strains the mind to believe that one company could undo centuries of consensus on physiognomy, just like that. Because it almost certainly can't.
Filed Under: criminal, facial recognition, terrorist
Companies: faception