Recently, a new Artificial Intelligence (AI) technology has been developed that guesses whether a person is straight or gay based on their photos. The discovery of this algorithm by researchers at Stanford University has raised serious ethical questions about the technology: what it is capable of, what it implies about the biological origins of sexual orientation, and how it could be misused against the LGBT community.
This computer algorithm distinguished between gay and straight men 81% of the time, and between gay and straight women 74% of the time. To test the algorithm, the researchers turned to a US-based dating website, sampling around 35,000 facial images of men and women that had been publicly posted on the site. Using a mathematical system called a "deep neural network", the AI analyzed the images against these large data sets. The accuracy grew significantly when the software was given five images of a person rather than one, rising to 91% for men and 83% for women.
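One way to see why accuracy rises with more photos is to treat each image as an independent, noisy prediction and combine them. The sketch below is a simplified illustration under that independence assumption, not the study's actual method (the real system pooled facial features from a neural network rather than taking votes); the 81% figure is the single-image accuracy for men reported above.

```python
import random

random.seed(0)

def classify_image(true_label, accuracy=0.81):
    """Hypothetical per-image classifier: returns the correct label
    with probability `accuracy` (the 81% single-image figure for men)."""
    return true_label if random.random() < accuracy else 1 - true_label

def classify_person(true_label, n_images):
    """Majority vote over several images of the same person."""
    votes = [classify_image(true_label) for _ in range(n_images)]
    return 1 if sum(votes) > n_images / 2 else 0

def accuracy(n_images, trials=20000):
    """Estimate overall accuracy by simulation."""
    correct = sum(classify_person(1, n_images) == 1 for _ in range(trials))
    return correct / trials

# Accuracy from a single image vs. five images of the same person:
# the five-image estimate comes out noticeably higher.
print(accuracy(1))
print(accuracy(5))
```

Under perfect independence the five-image majority vote would actually exceed the 91% the study reported, which suggests that multiple photos of the same person are partly redundant; still, the sketch captures the intuition that more images mean a more confident guess.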
Humans, by comparison, performed far worse on the same test. Shown the facial images, they identified sexual orientation correctly only 61% of the time for men and 54% of the time for women.
The Basis of Research
The research rested on the finding that gay men and women tended to show more gender-atypical features. Put simply, gay men appeared more feminine, while gay women tended to show more masculine characteristics. Gay men tended to have narrower jaws, longer noses, and larger foreheads than straight men; for women, the findings were exactly the opposite, with gay women having larger jaws and smaller foreheads than straight women. According to the research, people are born gay rather than choosing to be, their sexual orientation shaped by exposure to certain hormones before birth.
However, an AI that judges people based on their looks is a matter of concern for all of us.