Two prominent LGBT groups lashed out at researchers who found that facial recognition software could deduce whether a person was gay or heterosexual.
Contents:
- 'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
- THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
- WHY STANFORD RESEARCHERS TRIED TO CREATE A 'GAYDAR' MACHINE
'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analysing algorithms could distinguish between photographs of gay and straight people.
THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
In a paper published last year, Kosinski and a Stanford computer scientist, Yilun Wang, reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy. Presented with two pictures – one of a gay person, the other straight – the algorithm was trained to distinguish the two in 81% of cases involving images of men and 74% of photographs of women. Human judges, by contrast, were able to identify the straight and gay people in 61% and 54% of cases, respectively.
WHY STANFORD RESEARCHERS TRIED TO CREATE A 'GAYDAR' MACHINE
“I was just shocked to discover that it is so easy for an algorithm to distinguish between gay and straight people,” Kosinski tells me. Neither did many other people, and there was an immediate backlash when the research – dubbed “AI gaydar” – was previewed in the Economist magazine.