Facial analysis appears to be the next big thing in advanced technology. From artificial intelligence algorithms that analyse faces to infer a person’s interests or profession, to systems that claim to identify whether someone is a terrorist, the world is seeing remarkable developments in facial recognition technology.
And Michal Kosinski’s latest work, which aims to detect people’s sexual orientation through facial analysis, is seemingly one of the most significant (and dangerous) developments so far.
An illustration of the facial-analysis technology, similar to what was used in the experiment (Image: The Guardian)
A draft of his study, “A.I. Gaydar”, was recently posted online and has since become a major point of debate. The research suggests that machines can have an essentially better “gaydar” than humans.
This study, which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and between gay and straight women 74% of the time, has raised a flurry of questions about physiognomy, the biological origins of sexual orientation, and the ethics of facial-detection technology.
Around 35,000 facial images were used in the experiment (Image: The New York Times)
According to the research, gay people tend to have “gender-atypical” features, grooming styles and expressions, with gay men appearing more feminine and gay women appearing more masculine.
While the human judges performed quite badly compared to the algorithm, correctly identifying orientation for just 61% of men and 54% of women, the software was exceptionally successful, recognizing gay men and women with a whopping 91% and 83% accuracy respectively when given several images per person. The authors wrote in the report:
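To make the reported percentages concrete: an accuracy figure like “81%” is simply the fraction of cases a classifier labels correctly. A minimal sketch, using invented toy labels and predictions rather than the study’s actual model or data:

```python
# Illustrative only: how a classification accuracy figure is computed.
# The labels and predictions below are made-up toy data, not from the study.

def accuracy(true_labels, predicted_labels):
    """Fraction of predictions that match the true labels."""
    assert len(true_labels) == len(predicted_labels)
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    return correct / len(true_labels)

true_labels      = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]  # hypothetical ground truth
predicted_labels = [1, 0, 1, 0, 0, 0, 1, 1, 1, 0]  # hypothetical classifier output

print(f"Accuracy: {accuracy(true_labels, predicted_labels):.0%}")  # prints "Accuracy: 80%"
```

In this toy example the classifier gets 8 of 10 cases right, giving 80% accuracy; the study’s headline numbers are the same kind of aggregate statistic computed over many thousands of images.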
“Faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain.”
However, this study was limited to homosexual and heterosexual people; bisexual and transgender people were not part of the study. Moreover, the study was limited to white participants, as, according to the researchers, it is hard to analyse the facial features of people of colour.
While the study marks a one-of-a-kind advance in science and technology, many nevertheless regard it as derogatory and regressive. Countries where homosexuality is labelled a crime could use this technology to identify people and curb their freedom of expression. If used by spouses to check whether their partners are closeted, or by teenagers running the algorithm on themselves or their peers, the technology could prove very dangerous for society.
According to Kosinski, the study may well damage his reputation, but he has no regrets whatsoever. His reasoning is captured in his own question:
“The question is, can you live with yourself if you knew it’s possible and you didn’t let anyone know?”