As artificial intelligence becomes increasingly advanced, we're hearing more warnings about its potential dangers. Elon Musk thinks AI could be the cause of World War 3, but, based on a recent study from Stanford University, these systems could pose a more immediate threat to people's privacy.
Michal Kosinski and Yilun Wang's algorithm can accurately guess whether a person is straight or gay from a photo of their face. This was achieved using a sample of over 35,000 images publicly posted on a US dating site, with equal numbers of gay and straight subjects (orientation determined from their profiles) and of men and women.
A deep neural network extracted features from the huge dataset, identifying trends that helped it predict a person's sexual orientation. Gay men tended to have "gender-atypical" features, expressions and grooming styles, meaning they appeared more feminine. They often had narrower jaws, longer noses and larger foreheads than straight men. The opposite was true for gay women, who generally had larger jaws and smaller foreheads.
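To make the pipeline concrete, here is a minimal sketch of the general approach the paper describes: a deep network reduces each face photo to a feature vector (an "embedding"), and a simple logistic classifier is then trained on those vectors. This is not the authors' code; the random vectors and synthetic labels below are stand-ins for real face embeddings and profile labels, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim = 200, 32
X = rng.normal(size=(n, dim))                    # stand-in "face embeddings"
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # synthetic 0/1 labels

# Plain logistic regression trained by gradient descent on log-loss.
w = np.zeros(dim)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))   # predicted probability per photo
    w -= 0.1 * X.T @ (p - y) / n   # gradient step

accuracy = ((p > 0.5) == y).mean()
print(round(accuracy, 2))
```

The point of the split design is that the heavy lifting (turning pixels into informative features) is done once by a pretrained network, while the final classifier stays simple and interpretable.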
The system was eventually able to distinguish between gay and straight men 81 percent of the time, and between gay and straight women 74 percent of the time, from a single photo. When it was shown five images of the same person, accuracy rose to 91 percent for men and 83 percent for women. By contrast, human judges picked the correct orientation only 61 percent of the time for men and 54 percent for women.
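The jump from one photo to five follows from basic probability: if each photo independently gives the right call with some probability, combining several photos is right more often. The study actually averaged classifier scores across images, but a toy majority-vote simulation (an assumption made here for simplicity, including the independence of photos) shows the same effect:

```python
import random

random.seed(0)

def majority_accuracy(p, k, trials=100_000):
    """Chance a majority of k independent per-photo calls is correct,
    if each call is correct with probability p."""
    hits = 0
    for _ in range(trials):
        correct_calls = sum(random.random() < p for _ in range(k))
        hits += correct_calls > k // 2
    return hits / trials

one = majority_accuracy(0.81, 1)   # single photo: roughly 81 percent
five = majority_accuracy(0.81, 5)  # majority of five: noticeably higher
print(round(one, 2), round(five, 2))
```

In reality a person's photos are correlated (same face, similar styling), so the real gain is smaller than independence would predict, which matches the study's 81-to-91-percent improvement rather than a larger one.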
The authors argue the findings support the theory that the same hormones that shape a fetus's developing bone structure also play a part in determining sexual orientation, meaning that people are born gay – it's not a choice. Additionally, the lower accuracy for women could suggest that female sexual orientation is more fluid.