The famous AI gaydar study was repeated – and, no, it can't tell if you're straight or not just from your face • The Register


In an era when homosexuality was considered a medical disorder in Australia, a device dubbed the "penis lie detector" was used alongside aversion therapy to "treat" gay men.


WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE

Researchers and LGBT groups clash over facial recognition tech that supposedly spots gay people.

Those seeking same-sex partners were classified as gay; those seeking opposite-sex partners were assumed to be straight. The 300,000 images were whittled down to 35,000 that showed faces clearly and met certain criteria. All were white, the researchers said, because they could not find enough dating profiles of gay minorities to generate a statistically valid sample. The images were cropped further and then processed through a deep neural network, a layered mathematical system capable of identifying patterns in vast amounts of data.

Kosinski said he did not build his tool from scratch, as many suggested; rather, he began with a widely used facial analysis program to show just how easy it would be for anyone to pull off something similar.

Image: Kosinski and Yilun Wang.

The software extracts information from thousands of facial data points, including nose width, mustache shape, eyebrows, corners of the mouth, hairline and even aspects of the face we don't have words for. Both humans and the machine were given pairings of two faces, one straight and one gay, and asked to pick who was more likely to be gay. The human participants, who were procured through Amazon Mechanical Turk, a supplier for digital tasks, were advised to "use the best of your intuition."

And when the tool was challenged with other scenarios, such as distinguishing between gay men's Facebook photos and straight men's online dating photos, accuracy dropped to 74 percent. There is also the issue of false positives, which plague any prediction model aimed at identifying a minority group, said William T. A facial scan that is 91 percent accurate would misidentify 9 percent of straight people as gay; in the example above, that's 85 people. The software would also mistake 9 percent of gay people for straight people.
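That false-positive arithmetic can be checked with a short sketch. The pool of 1,000 people and the 5% base rate are assumptions chosen to be consistent with the "85 people" figure quoted above, not numbers taken from the study itself:

```python
# Base-rate check for a "91% accurate" classifier.
# Assumed figures (not from the study): 1,000 people, 5% of whom are gay.
pool = 1000
base_rate = 0.05
accuracy = 0.91                 # assume 91% accuracy for both classes

gay = int(pool * base_rate)     # 50 gay people in the pool
straight = pool - gay           # 950 straight people

false_positives = int(straight * (1 - accuracy))  # straight people flagged as gay
true_positives = gay - int(gay * (1 - accuracy))  # gay people correctly flagged
flagged = false_positives + true_positives        # everyone the scan flags

print(false_positives)              # 85 straight people mislabelled as gay
print(true_positives / flagged)     # ~0.35: most flagged people are straight
```

The point of the sketch is that with a small minority group, even a high per-class accuracy leaves the majority of people flagged as gay actually being straight.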

He noted in an email that "the algorithms were only trained and tested on white, American, openly gay men (and white, American, presumed straight comparisons)," and therefore probably would not have broader applicability. Regardless of effectiveness, the study raises knotty questions about perceptions of sexual orientation.

ROW OVER AI THAT 'IDENTIFIES GAY FACES'

Have you ever wondered if you might be gay or not? Well, this app can't actually help you with that, but it is a great gag for clubs and parties! Gay Detector is a fake fingerprint scanning app that tries to determine how gay you are based on a mock thumbprint scan. It is meant for prank and entertainment purposes. In fact, you can even customize the response that the app gives, and play the ultimate prank on your friends! The fun will never end, and you might even end up learning something about yourself or your friends. Gay Detector is a free Progressive Web App (PWA).

Even many experts who are supportive of the theory said they could not see how a study of self-selected dating photos made the case that gay people have gender-atypical faces, let alone a theory that attributes those distinctive features to prenatal hormone exposure. A version of this article appears in print in Section D, Page 1 of the New York edition with the headline: The 'Gaydar' Machine Causes an Uproar.

Image source: Stanford University. Image caption: the study created composite faces judged most and least likely to belong to homosexuals.

A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups. The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by human observers. The work has been accused of being "dangerous" and "junk science", but the scientists involved say these are "knee-jerk" reactions.

Details of the peer-reviewed project are due to be published in the Journal of Personality and Social Psychology.

For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website. They used between one and five of each person's pictures and took people's sexuality as self-reported on the dating site. The researchers said the resulting software appeared to be able to distinguish between gay and heterosexual men and women. In one test, when the algorithm was presented with two photos where one picture was definitely of a gay man and the other heterosexual, it was able to determine which was which 81% of the time. For women, the figure was 71%.

But their software did not perform as well in other situations, including a test in which it was given photos of 70 gay men and 930 heterosexual men. When asked to pick the 100 men "most likely to be gay", it missed 23 of them. In its summary of the study, the Economist, which was first to report the research, pointed to several "limitations", including a concentration on white Americans and the use of dating site pictures, which were "likely to be particularly revealing of sexual orientation".

"This research isn't science or news, but it's a description of beauty standards on dating sites that ignores huge segments of the LGBTQ (lesbian, gay, bisexual, transgender and queer/questioning) community, including people of colour, transgender people, older individuals, and other LGBTQ people who don't want to post photos on dating sites," said Jim Halloran, chief digital officer of Glaad, a media-monitoring body.

THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, IT CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE

Currently, American gay people believe they have a unique ability to pick each other out of a crowd (often termed "gaydar" ["gay" + "radar"]). This was established through a nationwide Internet-mediated survey (n = 460). To test for the presence of this ability in gay men, the researcher asked self- …

"The 'subtle' differences could be a consequence of gay and straight people choosing to portray themselves in systematically different ways, rather than differences in facial appearance itself," said Prof Benedict Jones, who runs the Face Research Lab at the University of Glasgow. It was also important, he said, for the technical details of the analysis algorithm to be published to see if they stood up to informed criticism.

Unsurprisingly, that original work kicked up a massive fuss at the time, with many skeptical that computers, which have zero knowledge or understanding of something as complex as sexuality, could really predict whether someone was gay or straight from their fizzog. The Stanford eggheads behind that first research – Yilun Wang, a graduate student, and Michal Kosinski, an associate professor – even claimed that not only could neural networks suss out a person's sexual orientation, their algorithms had an even better gaydar than humans.

"Moreover, this entire line of thought is premised on the idea that there is value to be gained in working out why 'gay face' classifiers might work – value in further describing, defining and setting out the methodology for any tinpot dictator or bigot with a computer who might want to oppress queer people."

HOW THE COLD WAR 'FRUIT MACHINE' TRIED TO DETERMINE GAY FROM STRAIGHT


Even though there are examples throughout history, from the ancient Greeks to the 18th century, of people practising physiognomy, basically judging a person's character or lifestyle from their facial features, a recent study from Stanford University gives a modern-day version to contemplate: computers determining if a person is gay or straight through facial-detection technology. The study drew on photos from a dating website and found that a computer algorithm was correct 81% of the time when it was used to distinguish between straight and gay men, and accurate 74% of the time for women.

One pattern the machine detected in the study was that gay women and men typically had "gender-atypical" features, expressions and "grooming styles": gay men appeared more feminine and gay women more masculine. Another trend the machine identified was that gay women tend to have larger jaws and smaller foreheads than straight women, while gay men had larger foreheads, longer noses and narrower jaws than straight men.

As quoted in an article by The Guardian on the subject, Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, said, "It's certainly unsettling."

OTTAWA: How the Cold War 'fruit machine' tried to determine gay from straight. A relic of Ottawa's efforts to root out homosexuals in the military and public service during the Cold War is back in the news following a class-action lawsuit from LGBT individuals who lost their jobs. It's not fiction, although it sounds like something straight out of a dystopian novel. The so-called "fruit machine" was a homosexuality detection system, commissioned by the Canadian government during the Cold War and developed largely by a psychologist at Carleton University in Ottawa, to keep LGBT people out of the public service or military.

THE GAY-DETECTING FRUIT MACHINE

Posts about gay detectors written by timalderman

While the machine is long gone, its legacy is back in the news after the federal government was hit with a class-action lawsuit this week from former public servants who lost their jobs because of their sexual orientation. Gay and lesbian civil servants were driven out of the Canadian military and public service beginning in the 1950s, but the practice continued after homosexuality was removed from the Criminal Code in 1969. At the time, homosexuals were perceived by the government as weak, unreliable and potentially disloyal.

GAYDAR: VISUAL DETECTION OF SEXUAL ORIENTATION AMONG GAY AND STRAIGHT MEN

During those years, Canada's campaign to eliminate all homosexuals from the military, police, and the civil service was particularly broad and unforgiving, with the Royal Canadian Mounted Police (RCMP) compiling files on over 9,000 suspected homosexuals. A similar device called a plethysmograph, which connected directly to the subject's genitals, was used for similar purposes after the fruit machine was retired, but the Canadian government eventually put a halt to the RCMP's anti-homosexual activities.

'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY


PROFESSORS CREATE ‘GAYDAR’ MACHINE WITH AI

To test for the presence of this ability in gay men, the researcher asked self-identified gay and straight male participants to view a series of unfamiliar men on videotape and determine the sexual orientation of each.

The higher overall accuracy of gay men demonstrated a trend-level difference from their straight cohorts, although it fell short (primarily due to small sample size) of the p < 0.05 threshold.

Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analysing algorithms could distinguish between photographs of gay and straight people. In a paper published last year, Kosinski and a Stanford computer scientist, Yilun Wang, reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy.

Presented with two pictures – one of a gay person, the other straight – the algorithm was trained to distinguish the two, succeeding in 81% of cases involving images of men and 74% of photographs of women.

A STANFORD SCIENTIST SAYS HE BUILT A GAYDAR USING “THE LAMEST” AI TO PROVE A POINT

Photograph: Jason Henry/The Guardian.

Neither did many other people, and there was an immediate backlash when the research, dubbed "AI gaydar", was previewed in the Economist magazine. There was also anger that Kosinski had conducted research on a technology that could be used to persecute gay people in countries such as Iran and Saudi Arabia, where homosexuality is punishable by death. His findings are consistent with the prenatal hormone theory of sexual orientation, he says, which argues that the levels of androgens foetuses are exposed to in the womb help determine whether people are straight or gay.

Just as society is moving towards viewing sexual orientation as a fluid spectrum, two Stanford University professors have found face recognition technology can categorize people as ‘gay’ or ‘straight’ based on a single picture.

THIS 'PENIS LIE DETECTOR' HELPED DOCTORS CONDUCT GAY AVERSION THERAPY

Last week, The Economist published a story around Stanford Graduate School of Business researchers Michal Kosinski and Yilun Wang's claims that they had built artificial intelligence that could tell if we are gay or straight based on a few images of our faces.

"Those two findings should not be conflated," Jim Halloran, GLAAD's chief digital officer, wrote in a statement claiming the paper could cause harm by exposing methods to target gay people. On the other hand, LGBTQ Nation, a publication focused on issues in the lesbian, gay, bisexual, transgender, queer community, disagreed with GLAAD, saying the research identified a potential threat.

TAG ARCHIVE: GAY DETECTORS

Kosinski asserted in an interview with Quartz that, regardless of the methods of his paper, his research was in service of the gay and lesbian people he sees under siege in modern society. He says his work stands on the shoulders of research that has been happening for decades; he's not reinventing anything, just translating known differences about gay and straight people through new technology. A classifier like this, the authors say, should focus on permanent features, like the length of the nose or the width of the face, because those stay consistent when recognizing a face. They say this is necessary because, since the pictures are from a dating site, gay and straight people might use different facial expressions, camera angles, or poses to seem attractive based on the sex they're trying to attract.

It's an idea backed by sociologists: if your claim is that facial structure is different for gay people and straight people, then you definitely don't want something as fleeting as a facial expression to confuse that.

Image: a heat map of where the algorithm looks to detect signs of homosexuality (Kosinski and Wang).

But some AI researchers doubt that VGG-Face is actually ignoring expression and pose, because the model isn't being used for its intended purpose, which is simply to identify people.

GAYDAR

For instance, they found that gay and straight men wore different facial hair, and gay women were more prone to wear baseball caps. But this is just one critique: we don't know for sure that it is true, and it's impossibly difficult to find out given the information provided in the paper, or even if you had the algorithm.

But this potential explanation, based on the test of another AI researcher, throws doubt on the idea that VGG-Face can be used as a perfect oracle to detect something about a person's facial features while ignoring confounding details.

Image: the algorithm generated the "optimally" gay and straight faces (Kosinski and Wang).

The second aspect of this research, apart from the algorithm, is the data used to train the facial-recognition system. The company could have disclosed the technological or cultural biases inherent in the data for researchers to review. Either way, it's unclear how images of people taken from dating websites and sorted into only gay and straight categories accurately represent their sexuality. The authors also assume that men looking for male partners and females looking for female partners are gay, but that's a stunted, binary distillation of the sexual spectrum sociologists today are trying to understand.

To measure only in terms of gay or straight doesn't accurately reflect the world, but instead forces a human construct onto it, a hallmark of bad science. To that, Kosinski says the study was conducted within the confines of what users reported themselves to be looking for on the dating site, and back to the point that someone using this maliciously wouldn't split hairs over whether someone was bisexual or gay. What about the numbers? Despite the 91% accuracy reported in other news outlets, that number comes only from one very specific scenario: the algorithm is shown five pictures each of two different people who were looking for the same or opposite sex on the dating site, and told that one of them is gay.

UGANDA CAN NOT AFFORD TO BUY ITS 'GAY DETECTOR' MACHINE

Estimates from the paper put roughly 7% of the population as gay (Gallup says 4% identify as LGBT as of this year, but 7% of millennials), so from a random draw of 100 people, seven would be gay. Then they tell the algorithm to pull the top 100 people most likely to be gay from the full 1,000. The algorithm does; but only 43 of those people are actually gay, compared to the entire 70 expected to be in the sample of 1,000. While accuracy is a measure of success, Kosinski said he didn't know if it was ethically sound to create the best algorithmic approach, for fear someone could replicate it, instead opting to use off-the-shelf software. In reality, this isn't an algorithm that tells gay people from straight people.
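Using only the figures quoted above (70 gay people in a pool of 1,000, a top-100 selection, 43 correct picks), the implied precision and recall can be worked out in a few lines; the variable names are mine, not the paper's:

```python
# Precision and recall for the "top 100 most likely to be gay" test:
# 1,000 profiles, 70 of them gay (7%), the algorithm returns its 100
# most confident picks, and 43 of those are actually gay.
pool, gay_total, selected, hits = 1000, 70, 100, 43

precision = hits / selected   # 0.43: fewer than half the picks are right
recall = hits / gay_total     # ~0.61: 27 of the 70 gay people are missed
chance = gay_total / pool     # 0.07: precision of a random draw

print(precision, recall, chance)
```

The comparison worth making is between precision (0.43) and the 7% chance baseline: the ranking is far better than random, yet still wrong more often than right, which is why headline "accuracy" figures from paired comparisons overstate what the tool can do on a realistic population.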

After reading Kosinski and Wang's paper, three sociologists and data scientists who spoke with Quartz questioned whether the authors' assertion that gay and straight people have different faces is supported by the experiments in the paper. "The thing that [the authors] assert that I don't see the evidence for is that there are fixed physiognomic differences in facial structure that the algorithm is picking up," said Carl Bergstrom, evolutionary biologist at the University of Washington in Seattle and co-author of the blog Calling Bullshit. The study also heavily leans on previous research that claims humans can tell gay faces from straight faces, treating it as an initial benchmark to prove machines can do a better job. "It also shows that people are quite accurate," Konstantin Tskhay, a sociologist who conducted research on whether people could tell gay from straight faces and is cited in Kosinski and Wang's paper, told Quartz in an email. Since we can't say with total certainty that the VGG-Face algorithm hadn't also picked up those stereotypes (that humans see too) from the data, it's difficult to call this a sexual-preference detection tool instead of a stereotype-detection tool. Does the science matter?


Uganda can not afford to buy its 'gay detector' machine - OUT & PROUD AFRICAN LGBTI.
