Researchers and LGBT groups clash over facial recognition tech that supposedly spots gay people.
Contents:
- ROW OVER AI THAT 'IDENTIFIES GAY FACES'
- 'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
- IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
- AI CAN TELL IF YOU'RE GAY: ARTIFICIAL INTELLIGENCE PREDICTS SEXUALITY FROM ONE PHOTO WITH STARTLING ACCURACY
- THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
ROW OVER AI THAT 'IDENTIFIES GAY FACES'
Image source, Stanford University. Image caption: The study created composite faces judged most and least likely to belong to homosexuals.

A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups. The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by humans. The work has been accused of being "dangerous" and "junk science", but the scientists involved say these are "knee-jerk" reactions. Details of the peer-reviewed project are due to be published in the Journal of Personality and Social Psychology.

For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website. They used between one and five of each person's pictures and took people's sexuality as self-reported on the dating site. The researchers said the resulting software appeared to be able to distinguish between gay and heterosexual men and women.
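As a rough illustration of the pipeline described here – a classifier trained on face-derived features against self-reported labels – the sketch below may help. It is not the authors' code: the embed_face helper, the synthetic data, and the choice of logistic regression are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def embed_face(photo: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a pretrained face-embedding network;
    a real pipeline would map a cropped face image to a feature vector."""
    return photo.ravel()[:128]

# Synthetic stand-ins for the dating-site photos and the self-reported
# labels (1 = gay, 0 = heterosexual) described in the article.
photos = [rng.normal(size=(16, 16)) for _ in range(1000)]
labels = rng.integers(0, 2, size=1000)

X = np.stack([embed_face(p) for p in photos])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")  # ~0.5 on random data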
In one test, when the algorithm was presented with two photos where one picture was definitely of a gay man and the other heterosexual, it was able to determine which was which 81% of the time. With women, the figure was 71%. "Gay faces tend to be gender atypical," the researchers said. "Gay men had narrower jaws and longer noses, while lesbians had larger jaws."
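The two-photo test amounts to checking, over many (gay, straight) photo pairs, how often the model scores the gay photo higher; computed over all pairs, that hit rate equals the AUC. A minimal sketch of the scoring, assuming only that the classifier outputs a numeric "gay" score per photo:

```python
def pairwise_accuracy(scores_gay, scores_straight):
    """Fraction of (gay photo, straight photo) pairs in which the model
    assigns the higher 'gay' score to the gay photo -- the quantity the
    study reports as 81% for men and 71% for women. Ties count as half."""
    wins = sum(
        1.0 if g > s else 0.5 if g == s else 0.0
        for g in scores_gay
        for s in scores_straight
    )
    return wins / (len(scores_gay) * len(scores_straight))

# Toy scores: well separated, so the model "wins" every pair.
print(pairwise_accuracy([0.9, 0.7, 0.6], [0.4, 0.3]))  # 1.0
```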
"But their software did not perform as well other suatns, cludg a tt which was given photos of 70 gay men and 930 heterosexual asked to pick 100 men "most likely to be gay" missed 23 of s summary of the study, the Enomist - which was first to report the rearch - poted to several "limatns" cludg a ncentratn on whe Amerins and the e of datg se pictur, which were "likely to be particularly revealg of sexual orientatn". "This rearch isn't science or news, but 's a scriptn of bety standards on datg s that ignor huge segments of the LGBTQ (lbian, gay, bisexual, transgenr and queer/qutng) muny, cludg people of lour, transgenr people, olr dividuals, and other LGBTQ people who don't want to post photos on datg s, " said Jim Halloran, chief digal officer of Glaad, a media-monorg body. "The reckls fdgs uld serve as a weapon to harm both heterosexuals who are accurately outed, as well as gay and lbian people who are suatns where g out is dangero.
'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
"The 'subtle' differenc uld be a nsequence of gay and straight people choosg to portray themselv systematilly different ways, rather than differenc facial appearance self, " said Prof Benedict Jon, who ns the Face Rearch Lab at the Universy of was also important, he said, for the technil tails of the analysis algorhm to be published to see if they stood up to rmed cricism.
IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
Computers detected gay people from photos of faces with up to 91 percent accuracy.
Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analysing algorithms could distinguish between photographs of gay and straight people. In the paper, published last year with the Stanford computer scientist Yilun Wang, Kosinski reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy. Presented with two pictures – one of a gay person, the other straight – the algorithm was trained to distinguish the two in 81% of cases involving images of men and 74% of cases involving photographs of women.
Human judges, by contrast, were able to identify the straight and gay people in 61% and 54% of cases, respectively.
AI CAN TELL IF YOU'RE GAY: ARTIFICIAL INTELLIGENCE PREDICTS SEXUALITY FROM ONE PHOTO WITH STARTLING ACCURACY
“I was just shocked to discover that it is so easy for an algorithm to distinguish between gay and straight people,” Kosinski tells me. Many other people were taken aback too, and there was an immediate backlash when the research – dubbed “AI gaydar” – was previewed in the Economist magazine.
THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
There was also anger that Kosinski had conducted research on a technology that could be used to persecute gay people in countries such as Iran and Saudi Arabia, where homosexuality is punishable by death. His findings are consistent with the prenatal hormone theory of sexual orientation, he says, which argues that the levels of androgens foetuses are exposed to in the womb help determine whether people are straight or gay.