Drawing on databases of images collected from an online dating site, a new study conducted at Stanford University concludes that faces carry information about sexual orientation. This information is not available to visual inspection by ordinary perceivers. But it can be extracted by powerful, pattern-recognizing machines (“deep neural networks” or DNNs).
According to the study … a DNN was 91 percent accurate in determining sexual orientation from photos of men and 83 percent accurate with women. Humans, given the same images to inspect, had a much lower level of accuracy (on the order of 20 percentage points lower).
You might have thought that there is something silly, and even offensive, about the idea that you, or a machine, might read off sexual preference from pictures. Indeed, it’s about as silly and offensive as the idea that you could judge a person’s politics or intelligence or liability to criminal behavior on a similar basis.
But this is a bullet the study’s authors are happy to bite.
It’s just a matter of time, they seem to believe, before DNNs can crack the face code. This will doubtless be a development with broad civil liberties and privacy implications.
What it won’t do, I suspect, is settle any deeper questions about nature versus nurture.
The GLP aggregated and excerpted this blog/article to reflect the diversity of news, opinion, and analysis. Read full, original post: Can A Machine Tell Whether You Are Gay?