Troublingly, some modern AI applications are delving into physiognomy, a set of pseudoscientific ideas that first appeared thousands of years ago with the ancient Greeks. In antiquity, the Greek mathematician Pythagoras decided whether to accept students based on whether they “looked” gifted. To the philosopher Aristotle, a bulbous nose denoted an insensitive person and a round face signaled courage.
Recent research has tried to show that political leaning, sexuality, and criminality can be inferred from pictures of people’s faces.
Political orientation. In a 2021 article in Nature’s Scientific Reports, Stanford researcher Michal Kosinski found that, using open-source code and publicly available data and facial images, facial recognition technology can judge a person’s political orientation accurately 68 percent of the time, even when controlling for demographic factors. In this research, the primary algorithm learned the average face for conservatives and liberals and then predicted the political leanings of unknown faces by comparing them to those reference images. Kosinski wrote that his findings about AI’s abilities have grave implications: “The privacy threats posed by facial recognition technology are, in many ways, unprecedented.”
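The comparison step described above resembles a nearest-centroid classifier over face embeddings. A minimal sketch of that idea, using made-up toy vectors in place of real facial features (the function names and data below are illustrative assumptions, not the paper’s actual code):

```python
import numpy as np

def average_face(embeddings):
    # The "average face" for a group: the mean of its embedding vectors.
    return np.mean(embeddings, axis=0)

def predict_orientation(face, ref_a, ref_b, labels=("conservative", "liberal")):
    # Assign the label whose average face is closer by cosine similarity.
    def cos(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    return labels[0] if cos(face, ref_a) >= cos(face, ref_b) else labels[1]

# Toy 3-d "embeddings" standing in for extracted facial features.
group_a = np.array([[1.0, 0.2, 0.1], [0.9, 0.1, 0.0]])
group_b = np.array([[-1.0, 0.1, 0.3], [-0.8, 0.0, 0.2]])
ref_a = average_face(group_a)
ref_b = average_face(group_b)
print(predict_orientation(np.array([0.8, 0.1, 0.1]), ref_a, ref_b))
# prints "conservative" (the unknown face is nearer the first reference)
```

The sketch also makes the critique concrete: the classifier will latch onto whatever correlations separate the group averages, whether or not they reflect anything meaningful about the person.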
While the questions behind this line of inquiry may not immediately trigger alarm, the underlying premise still fits squarely within physiognomy: predicting personal traits from facial features.