Chinese students build a computer that knows if you’re a criminal…from your face?

06:05, 10 January 2017. Source: Techly

Like a more crooked version of the Voight-Kampff test from Blade Runner, a new machine learning paper from a pair of Chinese researchers has delved into the controversial task of letting a computer decide on your innocence. Can a computer know if you’re a criminal just from your face? In their paper ‘Automated Inference on Criminality using Face Images’, published on the arXiv pre-print server, Xiaolin Wu and Xi Zhang from China’s Shanghai Jiao Tong University investigate whether a computer can detect if a human could be a convicted criminal just by analysing his or her facial features.

© Getty Images

We’ve all had that moment. Even those of us lucky enough to live in more hip, swinging, tolerant societies sometimes look at our fellow human beings and think, ‘hell, that kid looks shifty!’

Maybe they’ve pulled on a balaclava in the middle of a meal, or they’re running determinedly out of a supermarket in a big coat with lots of pockets. They might just be a six-year-old boy, in which case it’s fairly certain they’ve just done something at least a little bit naughty.

Imagine for a second, though, that there were a way of learning how likely someone is to commit a crime based on nothing more than their facial features. In what, it really must be said, sounds like the basic plot of an early-2000s sci-fi film, two Chinese researchers have set out to answer precisely that question.

While researching their recently released paper, Xiaolin Wu and Xi Zhang of Shanghai Jiao Tong University fed still images of 1,856 people, almost half of whom were convicted criminals, into a computer in order to digitally analyse their facial features.

The idea here was to lessen the impact of human bias through automating much of the process. The results were interesting, to say the least. According to the paper’s conclusion, “By extensive experiments and vigorous cross validations, we have demonstrated that via supervised machine learning, data-driven face classifiers are able to make a reliable inference on criminality.”

While it may seem more logical to assume that criminals have certain facial features in common that make them stand out, in fact the opposite proved true: the non-criminal data set had more in common, while the criminal faces showed wider variation when assessed on factors such as “lip curvature, eye inner corner distance and the so-called nose-mouth angle.”


Therefore it’s more accurate to say that, rather than discovering what a typical criminal looks like, the research actually discovered what a typical non-criminal looks like.
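The paper itself doesn’t ship code, but measurements like the ones quoted above are straightforward geometry computed over detected facial landmarks. Here is a minimal sketch of two of the named features; the landmark coordinates below are invented for illustration, and a real pipeline would get them from a facial landmark detector.

```python
import numpy as np

# Hypothetical 2D facial landmarks (x, y) for a single face; in practice
# these would come from a landmark detector. All values here are made up.
landmarks = {
    "left_eye_inner":  np.array([112.0, 130.0]),
    "right_eye_inner": np.array([148.0, 131.0]),
    "nose_tip":        np.array([130.0, 170.0]),
    "mouth_left":      np.array([110.0, 200.0]),
    "mouth_right":     np.array([150.0, 201.0]),
}

def eye_inner_corner_distance(lm):
    """Euclidean distance between the two inner eye corners."""
    return np.linalg.norm(lm["right_eye_inner"] - lm["left_eye_inner"])

def nose_mouth_angle(lm):
    """Angle at the nose tip subtended by the two mouth corners, in degrees."""
    a = lm["mouth_left"] - lm["nose_tip"]
    b = lm["mouth_right"] - lm["nose_tip"]
    cos_theta = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print(eye_inner_corner_distance(landmarks))  # ~36.0 pixels
print(nose_mouth_angle(landmarks))           # ~66.5 degrees
```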

In order to ensure the validity of the results, the researchers used four ‘classifiers’, including logistic regression, a method that models the relationship between one binary variable (in this case, whether the subject is a convicted criminal) and several metric variables (in this case, their facial features). All four classifiers produced fairly similar results.
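To make the ‘classifier’ idea concrete, here is a minimal sketch of a logistic-regression classifier evaluated with cross-validation, in the spirit of the paper’s set-up. The feature vectors and labels are random stand-ins, not the study’s data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in data: each row holds three facial measurements for one person
# (e.g. lip curvature, eye inner corner distance, nose-mouth angle), and
# each label is the binary variable (1 = convicted criminal, 0 = not).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))      # invented measurements
y = rng.integers(0, 2, size=200)   # invented labels

# Cross-validation: repeatedly train on part of the data and score on the
# held-out remainder, so the classifier is judged on faces it hasn't seen.
scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(scores.mean())  # ~0.5 here, since random features carry no signal
```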

In the case of this particular study, one of the more interesting aspects has been the reaction to it, with Motherboard reporting on some fairly heated criticism of the research. Since the paper was published, several flaws have been pointed out, not least in this Hacker News thread. One of them is the idea that, since the algorithms were designed by people subject to human bias, those same biases would in fact be programmed into the machine.

The problem is that, as Microsoft learned last year with the embarrassment surrounding its misanthropic chatbot Tay and her strange and vaguely terrifying views about Ricky Gervais, programmes such as these are actually quite adept at identifying, and acting upon, human biases within a data set, rather than removing them from the equation.
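A toy demonstration of that criticism, using entirely synthetic numbers: if the way a data set was collected leaves an artefact that correlates with the label (critics suggested, for example, differences between ID photos and other photo sources), a classifier will happily score well by learning the artefact rather than anything about criminality.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
y = rng.integers(0, 2, size=n)  # labels assigned completely at random

# Genuine facial measurements: pure noise, with no relation to the label.
facial_features = rng.normal(size=(n, 3))

# A collection artefact (hypothetical: photo source differs by class),
# which agrees with the label 90% of the time.
artefact = np.where(rng.random(n) < 0.9, y, 1 - y).reshape(-1, 1)

X = np.hstack([facial_features, artefact])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
print(clf.score(X_test, y_test))  # ~0.9: the "accuracy" is just learned bias
```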

All of which means we’re thankfully quite a long way from accidentally being locked up, as soon as someone takes a photo of us at an angle where our lips are curved the wrong way, in order to stop us robbing the corner shop in the future.

