People around the world face violence and inequality—and sometimes torture, even execution—because of who they love, how they look, or who they are. Sexual orientation and gender identity are integral aspects of our selves and should never lead to discrimination or abuse. Human Rights Watch works for lesbian, gay, bisexual, and transgender people's rights, and with activists representing a multiplicity of identities and issues. We document and expose abuses based on sexual orientation and gender identity worldwide, including torture, killing and executions, arrests under unjust laws, unequal treatment, censorship, medical abuses, discrimination in health and jobs and housing, domestic violence, abuses against children, and denial of family rights and recognition. We advocate for laws and policies that will protect everyone’s dignity. We work for a world where all people can enjoy their rights fully.
Contents:
- ROW OVER AI THAT 'IDENTIFIES GAY FACES'
- 'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
- THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
- FACIAL HINTS SHARPEN PEOPLE'S 'GAYDAR'
- IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
- THIS PSYCHOLOGIST’S “GAYDAR” RESEARCH MAKES US UNCOMFORTABLE. THAT’S THE POINT.
ROW OVER AI THAT 'IDENTIFIES GAY FACES'
Researchers and LGBT groups clash over facial recognition tech that supposedly spots gay people.
Image caption: The study created composite faces judged most and least likely to belong to homosexuals (image source: Stanford University).

A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups. The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by human observers. The work has been accused of being "dangerous" and "junk science", but the scientists involved say these are "knee-jerk" reactions. Details of the peer-reviewed project are due to be published in the Journal of Personality and Social Psychology.

For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website. They used between one and five of each person's pictures and took people's sexuality as self-reported on the dating site. The researchers said the resulting software appeared to be able to distinguish between gay and heterosexual men and women.
In one test, when the algorithm was presented with two photos – one picture definitely of a gay man and the other of a heterosexual man – it was able to determine which was which 81% of the time. For women, the figure was 71%.
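To make that pairwise test concrete, here is a minimal sketch (hypothetical code and toy scores, not the study's own) of how such a figure can be computed: for every (gay, straight) photo pair, check whether a classifier scores the gay person's photo higher. That pair-ranking fraction is the AUC statistic.

```python
import itertools
import random

def pairwise_accuracy(gay_scores, straight_scores):
    """Fraction of (gay, straight) photo pairs in which the classifier
    scores the gay person's photo higher; ties count half. This
    pair-ranking fraction equals the AUC statistic."""
    pairs = list(itertools.product(gay_scores, straight_scores))
    wins = sum(1 for g, s in pairs if g > s)
    ties = sum(1 for g, s in pairs if g == s)
    return (wins + 0.5 * ties) / len(pairs)

# Toy scores standing in for a model's probability-of-being-gay outputs;
# the distributions are invented purely for illustration.
random.seed(0)
gay = [random.gauss(0.6, 0.15) for _ in range(200)]
straight = [random.gauss(0.4, 0.15) for _ in range(200)]
print(f"pairwise accuracy (AUC): {pairwise_accuracy(gay, straight):.2f}")
```

Note that this measures ranking in forced two-photo comparisons, not the ability to label an arbitrary face correctly.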
'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
"Gay fac tend to be genr atypil, " the rearchers said. "Gay men had narrower jaws and longer nos, while lbians had larger jaws. "But their software did not perform as well other suatns, cludg a tt which was given photos of 70 gay men and 930 heterosexual asked to pick 100 men "most likely to be gay" missed 23 of s summary of the study, the Enomist - which was first to report the rearch - poted to several "limatns" cludg a ncentratn on whe Amerins and the e of datg se pictur, which were "likely to be particularly revealg of sexual orientatn".
"This rearch isn't science or news, but 's a scriptn of bety standards on datg s that ignor huge segments of the LGBTQ (lbian, gay, bisexual, transgenr and queer/qutng) muny, cludg people of lour, transgenr people, olr dividuals, and other LGBTQ people who don't want to post photos on datg s, " said Jim Halloran, chief digal officer of Glaad, a media-monorg body. "The reckls fdgs uld serve as a weapon to harm both heterosexuals who are accurately outed, as well as gay and lbian people who are suatns where g out is dangero. "The 'subtle' differenc uld be a nsequence of gay and straight people choosg to portray themselv systematilly different ways, rather than differenc facial appearance self, " said Prof Benedict Jon, who ns the Face Rearch Lab at the Universy of was also important, he said, for the technil tails of the analysis algorhm to be published to see if they stood up to rmed cricism.
Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analysing algorithms could distinguish between photographs of gay and straight people.
THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
In a paper published last year, Kosinski and a Stanford computer scientist, Yilun Wang, reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy.
Presented with two pictures – one of a gay person, the other straight – the algorithm was trained to distinguish the two in 81% of cases involving images of men and 74% of photographs of women.
Human judges, by contrast, were able to identify the straight and gay people in 61% and 54% of cases, respectively. “I was just shocked to discover that it is so easy for an algorithm to distinguish between gay and straight people,” Kosinski tells me. Many others were shocked too, and there was an immediate backlash when the research – dubbed “AI gaydar” – was previewed in the Economist magazine.
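For scale, here is a small illustrative snippet (not drawn from any of the articles' code) that puts the quoted machine and human figures on one axis: random guessing in a forced two-photo comparison would be right 50% of the time, so each result can be read as points above that chance baseline.

```python
# Pairwise accuracies as quoted just above; 0.5 is the chance baseline.
rates = {"machine (men)": 0.81, "machine (women)": 0.74,
         "humans (men)": 0.61, "humans (women)": 0.54}
for who, acc in rates.items():
    print(f"{who}: {acc:.0%} correct, {acc - 0.5:+.0%} above chance")
```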
FACIAL HINTS SHARPEN PEOPLE'S 'GAYDAR'
There was also anger that Kosinski had conducted research on a technology that could be used to persecute gay people in countries such as Iran and Saudi Arabia, where homosexuality is punishable by death. His findings are consistent with the prenatal hormone theory of sexual orientation, he says, which argues that the levels of androgens foetuses are exposed to in the womb help determine whether people are straight or gay. “Thus,” he writes in his paper, “gay men are predicted to have smaller jaws and chins, slimmer eyebrows, longer noses and larger foreheads...
He becomes prickly when I press him on Russia, pointing to its dire record on gay rights. (A couple of days later, Kosinski tells me he has checked his slides; in fact, he says, he didn’t tell the Russians about his “AI gaydar”.) Even though there are examples throughout history, from the ancient Greeks to the 18th century, of people practising physiognomy – basically, judging a person’s character or lifestyle by their facial features – a recent study from Stanford University gives a modern-day version to contemplate: computers determining if a person is gay or straight through facial-detection technology.
Using photos from a dating website, the study found that a computer algorithm was correct 81% of the time when it was used to distinguish between straight and gay men, and accurate 74% of the time for women. One pattern the machine detected in the study was that gay women and men typically had “gender-atypical” grooming styles, features and expressions – gay men appeared more feminine and gay women more masculine.
IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
Another trend the machine identified was that gay women tend to have larger jaws and smaller foreheads than straight women, while gay men had larger foreheads, longer noses and narrower jaws than straight men.
As quoted in an article by The Guardian on the subject, Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, said: “It’s certainly unsettling.” (Are all men with gender-atypical expressions gay?) Unsurprisingly, that original work kicked up a massive fuss at the time, with many sceptical that computers, which have zero knowledge or understanding of something as complex as sexuality, could really predict whether someone was gay or straight from their fizzog.
THIS PSYCHOLOGIST’S “GAYDAR” RESEARCH MAKES US UNCOMFORTABLE. THAT’S THE POINT.
The Stanford eggheads behind that first research – Yilun Wang, a graduate student, and Michal Kosinski, an associate professor – even claimed that not only could neural networks suss out a person’s sexual orientation, their algorithms had an even better gaydar than humans. Notably, straight women were more likely to wear eye shadow than gay women in Wang and Kosinski’s dataset. Straight men were more likely to wear glasses than gay men.