Two prominent LGBT groups lashed out at researchers who found that facial recognition software could deduce whether a person was gay or heterosexual.
Contents:
- 'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
- THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
- WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analysing algorithms could distinguish between photographs of gay and straight people. In a paper published last year, Kosinski and a Stanford computer scientist, Yilun Wang, reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy.
THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
Presented with two pictures – one of a gay person, the other straight – the algorithm was able to distinguish between them in 81% of cases involving images of men and 74% of photographs of women. Human judges, by contrast, were able to identify the straight and gay people in 61% and 54% of cases, respectively.
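The figures above come from a pairwise task: shown one photo of a gay person and one of a straight person, the judge (human or algorithm) must say which is which, so accuracy is the fraction of pairs answered correctly and chance performance is 50%. A minimal sketch of how such pairwise accuracy could be computed from per-image classifier scores (the function name and all score values here are hypothetical, not taken from the study):

```python
def pairwise_accuracy(scores_gay, scores_straight):
    """Fraction of (gay, straight) photo pairs in which the model
    assigns the higher 'gay' score to the gay person's photo."""
    correct = 0
    total = 0
    for g in scores_gay:
        for s in scores_straight:
            total += 1
            if g > s:
                correct += 1
    return correct / total

# Hypothetical classifier scores (higher = model leans 'gay')
scores_gay = [0.9, 0.7, 0.8]
scores_straight = [0.2, 0.75, 0.4]
print(pairwise_accuracy(scores_gay, scores_straight))  # 8 of 9 pairs correct
```

Under this protocol a score of 81% means the model ranked the correct photo higher in 81% of such pairs, which is a weaker claim than correctly labelling 81% of individual photos.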
“I was just shocked to discover that it is so easy for an algorithm to distinguish between gay and straight people,” Kosinski tells me.
WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
Neither did many other people, and there was an immediate backlash when the research – dubbed “AI gaydar” – was previewed in the Economist magazine. There was also anger that Kosinski had conducted research on a technology that could be used to persecute gay people in countries such as Iran and Saudi Arabia, where homosexuality is punishable by death.