Stanford University's AI can tell if you're gay or straight from a photo | TechSpot


Contents:

THE INFAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE

Composite faces built by averaging the faces classified as most and least likely to be gay by a computer. The research, by Michal Kosinski and Yilun Wang of Stanford University, claims that a computer algorithm bested humans at distinguishing between a gay person and a straight person when analyzing images from public profiles on a dating website.
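Composites like these are, at their simplest, pixel-wise averages of aligned face photos. Here is a minimal sketch of that averaging step, assuming a folder of same-size, pre-aligned JPEGs; the study's actual pipeline was more involved, using facial landmarks and a deep neural network to extract features.

```python
from pathlib import Path

import numpy as np
from PIL import Image

def composite_face(image_dir: str) -> Image.Image:
    """Pixel-wise average of a set of pre-aligned, same-size face photos."""
    paths = sorted(Path(image_dir).glob("*.jpg"))
    stack = np.stack([
        np.asarray(Image.open(p).convert("RGB"), dtype=np.float64)
        for p in paths
    ])
    # Average across the image axis, then cast back to displayable 8-bit.
    return Image.fromarray(stack.mean(axis=0).astype(np.uint8))

# Hypothetical usage with faces a classifier rated most likely to be gay:
# composite_face("faces/most_likely").save("composite.jpg")
```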

Two gay rights groups, the Human Rights Campaign and GLAAD, called the research "junk science" in a joint press release. In the study's dataset, gay and straight people, male and female, were represented evenly.

STANFORD UNIVERSITY'S AI CAN TELL IF YOU'RE GAY OR STRAIGHT FROM A PHOTO

Using a resulting model made up of the distinguishing characteristics, the program, when shown one photo of a gay man and one of a straight man, was able to identify their sexual orientation 81 percent of the time. Human guessers correctly identified gay and straight faces just 61 percent of the time for men and 54 percent for women. The researchers say in the paper that the results "provide strong support" for the prenatal hormone theory of gay and lesbian sexual orientation.
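It helps to be clear about what those percentages measure: it is a pairwise accuracy, where the judge (human or machine) is shown one gay face and one straight face and must say which is which, so a coin flip scores 50 percent. A minimal sketch of that arithmetic, using invented classifier scores rather than anything from the paper:

```python
import random

def pairwise_accuracy(scores_gay, scores_straight):
    """Fraction of (gay, straight) photo pairs where the model gives the
    gay photo the higher score; 0.5 is chance, ties count as coin flips."""
    correct = 0.0
    total = len(scores_gay) * len(scores_straight)
    for g in scores_gay:
        for s in scores_straight:
            if g > s:
                correct += 1.0
            elif g == s:
                correct += 0.5
    return correct / total

# Invented classifier scores for illustration only; not data from the study.
random.seed(0)
scores_gay = [random.gauss(0.6, 0.2) for _ in range(200)]
scores_straight = [random.gauss(0.4, 0.2) for _ in range(200)]
print(f"pairwise accuracy: {pairwise_accuracy(scores_gay, scores_straight):.2f}")
```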

In the paper, the researchers discuss the study's limitations at some length, including the narrow demographic characteristics of the individuals analyzed -- white people who self-reported to be gay or straight. Such tools present a special threat, said the authors, to the privacy and safety of gay men and women living under repressive regimes where homosexuality is illegal. Alex Bollinger, writing at LGBTQ Nation, wrote a post titled "HRC and GLAAD release a silly statement about the 'gay face' study."

Unsurprisingly, that original work kicked up a massive fuss at the time, with many skeptical that computers, which have zero knowledge or understanding of something as complex as sexuality, could really predict whether someone was gay or straight from their fizzog. The Stanford eggheads behind that first research – Yilun Wang, a graduate student, and Michal Kosinski, an associate professor – even claimed that not only could neural networks suss out a person's sexual orientation, their algorithms had an even better gaydar than humans.

WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE

Notably, straight women were more likely to wear eye shadow than gay women in Wang and Kosinski's dataset. Straight men were more likely to wear glasses than gay men.
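Superficial cues like these can carry a classifier a long way on their own, no facial structure required. A toy simulation, with invented base rates that are not the study's figures, shows how a rule as crude as "the one wearing glasses is the straight one" already beats a coin flip on the pairwise task:

```python
import random

random.seed(1)

# Invented base rates for illustration only -- not the study's figures.
P_GLASSES_STRAIGHT = 0.40
P_GLASSES_GAY = 0.25

def wears_glasses(p):
    return random.random() < p

correct, trials = 0.0, 100_000
for _ in range(trials):
    s = wears_glasses(P_GLASSES_STRAIGHT)  # the straight photo in the pair
    g = wears_glasses(P_GLASSES_GAY)       # the gay photo in the pair
    if s and not g:   # rule fires and happens to be right
        correct += 1.0
    elif s == g:      # both or neither wear glasses: no signal, coin flip
        correct += 0.5

print(f"pairwise accuracy from glasses alone: {correct / trials:.2f}")
```

With these made-up rates the crude rule lands around 57 percent, well above chance, which is why grooming differences in the dataset complicate any claim that the model is reading biology off the face.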

So, does this mean that AI really can tell if someone is gay or straight from their face? If it could, it would mean that biological factors such as a person's facial structure dictate whether someone is gay or not.


