New research has suggested that Artificial Intelligence (AI) can accurately tell whether someone is gay or straight based on photos of their faces. The same research suggests that machines can have significantly better “gaydar” than humans.
The machine intelligence tested in the research was trained on a sample of more than 35,000 facial images that men and women had publicly posted on a US dating website.
The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using ‘deep neural networks’ – sophisticated mathematical systems that learn to analyze visuals by training on large datasets.
The research found that gay men and women tended to have gender-atypical features, expressions and grooming styles. This means that gay men appear more feminine and gay women appear more masculine. Not only that, the data also identified certain other trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.
The study also found that human judges performed much worse than the algorithm at identifying sexual orientation.
Human judges accurately identified orientation only 61% of the time for men and 54% of the time for women. The software, by contrast, was even more successful when it reviewed five images per person, classifying correctly 91% of the time for men and 83% for women. According to the authors, this is because “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain.”
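Part of the gain from reviewing several photos is simple statistics: each photo yields a noisy score, and averaging scores across photos of the same person cancels some of the noise. The toy simulation below (not the study’s code; all numbers are made up for illustration) shows the effect by comparing single-photo decisions with decisions averaged over five photos.

```python
import random
import statistics

random.seed(42)

def simulated_accuracy(n_people, photos_per_person):
    """Each photo gives a noisy score centred on the person's true class
    (+1 or -1); the decision thresholds the mean score at zero."""
    correct = 0
    for _ in range(n_people):
        true_class = random.choice([0, 1])
        mean = 1.0 if true_class == 1 else -1.0
        scores = [random.gauss(mean, 2.0) for _ in range(photos_per_person)]
        decision = 1 if statistics.fmean(scores) > 0 else 0
        correct += (decision == true_class)
    return correct / n_people

acc_one = simulated_accuracy(20000, 1)   # one photo per person
acc_five = simulated_accuracy(20000, 5)  # average over five photos
```

Averaging five independent scores shrinks the noise by a factor of roughly the square root of five, so the five-photo accuracy comes out well above the single-photo accuracy in this simulation.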
The authors suggested that the findings provide strong support for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice.
The research has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.