[Michal Kosinski] decided to show that it was possible to use facial recognition analysis to detect something intimate, something “people should have full rights to keep private.”
After considering atheism, he settled on sexual orientation.
Whether he has now created “A.I. gaydar,” and whether that’s even an ethical line of inquiry, has been hotly debated over the past several weeks, ever since a draft of his study was posted online.
Presented with photos of gay men and straight men, a computer program was able to determine which of the two was gay with 81 percent accuracy, according to Dr. Kosinski and co-author Yilun Wang’s paper.
The software extracts information from thousands of facial data points, including nose width, mustache shape, eyebrows, corners of the mouth, hairline and even aspects of the face we don’t have words for. It then turns the faces into numbers.
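The pipeline described above — locating facial landmarks and converting them into numbers — can be sketched in miniature. The landmark names, coordinates, and chosen measurements below are invented for illustration and are not the study's actual features; real systems detect hundreds or thousands of points automatically from an image:

```python
from math import dist  # Euclidean distance between two points

# Hypothetical (x, y) landmark coordinates, in pixels, for one face.
# A real face-recognition system would detect these automatically.
landmarks = {
    "nose_left":   (88, 120),
    "nose_right":  (112, 120),
    "mouth_left":  (80, 150),
    "mouth_right": (120, 150),
    "brow_left":   (70, 80),
    "brow_right":  (130, 80),
}

def face_to_vector(pts):
    """Turn named landmark points into a plain numeric feature vector."""
    return [
        dist(pts["nose_left"], pts["nose_right"]),    # nose width
        dist(pts["mouth_left"], pts["mouth_right"]),  # mouth width
        dist(pts["brow_left"], pts["brow_right"]),    # brow span
    ]

features = face_to_vector(landmarks)
print(features)  # [24.0, 40.0, 60.0]
```

Once a face is reduced to such a vector, a classifier can be trained on it like any other numeric data — which is what makes the approach so easy to repurpose, and so hard to contain.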
Dr. Kosinski’s algorithm, by comparison, picked correctly 71 percent of the time for women and 81 percent of the time for men. When the computer was given five photos of each person instead of just one, accuracy rose to 83 percent for women and 91 percent for men.
Regardless of effectiveness, the study raises knotty questions about perceptions of sexual orientation.
The GLP aggregated and excerpted this blog/article to reflect the diversity of news, opinion, and analysis. Read full, original post: Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine