The study was published in the Journal of Personality and Social Psychology and first reported in The Economist.
The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using ‘deep neural networks’, sophisticated mathematical systems that learn to analyze visual features from large datasets.
The research found that gay men and women tended to have gender-atypical features, expressions and grooming styles: gay men appeared more feminine, and gay women more masculine. The data also identified other trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads than straight women.
According to the study, human judges accurately identified orientation only 61% of the time for men and 54% of the time for women. The software performed far better: when it reviewed five images per person, it was correct 91% of the time with men and 83% with women. The authors argue this is because “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain.”
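Part of the gap between one image and five comes from simple statistics: aggregating several noisy per-image judgments (for example, by majority vote) tends to be more accurate than any single judgment. The sketch below simulates this with a hypothetical classifier that is right 70% of the time on a single image; the per-image accuracy and the voting scheme are illustrative assumptions, not details of the study's actual model.

```python
import random

def classify_one(true_label, accuracy, rng):
    # Hypothetical per-image classifier: returns the true label
    # with probability `accuracy`, otherwise the wrong one.
    return true_label if rng.random() < accuracy else 1 - true_label

def classify_many(true_label, accuracy, n_images, rng):
    # Majority vote over n independent per-image predictions.
    votes = [classify_one(true_label, accuracy, rng) for _ in range(n_images)]
    return 1 if sum(votes) > n_images / 2 else 0

rng = random.Random(0)
trials = 20000
single = sum(classify_one(1, 0.7, rng) == 1 for _ in range(trials)) / trials
multi = sum(classify_many(1, 0.7, 5, rng) == 1 for _ in range(trials)) / trials
print(f"one image: {single:.2f}, five images: {multi:.2f}")
```

With a 70% per-image accuracy, a five-image majority vote is correct about 84% of the time (the exact binomial value is roughly 0.837), which is the same qualitative effect the study reports, though its real numbers come from a learned model rather than this toy simulation.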
The authors suggested that the findings provide strong support for the theory that sexual orientation stems from exposure to certain hormones before birth; in other words, that people are born gay and being queer is not a choice.
The research has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.