Contents:
- THE INFAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
- STANFORD UNIVERSITY'S AI CAN TELL IF YOU'RE GAY OR STRAIGHT FROM A PHOTO
- WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
THE INFAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
Homo economicus, Evolving.
STANFORD UNIVERSITY'S AI CAN TELL IF YOU'RE GAY OR STRAIGHT FROM A PHOTO
Composite faces built by averaging faces classified as most and least likely to be gay by a computer.

The research, by Michal Kosinski and Yilun Wang of Stanford University, claims that a computer algorithm bested humans at distinguishing between a gay person and a straight person when analyzing images from public profiles on a dating website.
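As a rough illustration of what building such a composite involves (a minimal sketch, not the authors' actual pipeline), the code below pixel-averages a set of face photos, assuming they are pre-aligned, equally sized, and stored in a hypothetical faces/ directory:

    import numpy as np
    from pathlib import Path
    from PIL import Image

    def composite_face(image_paths):
        # Load each pre-aligned face as a grayscale float array.
        stack = np.stack([
            np.asarray(Image.open(p).convert("L"), dtype=np.float64)
            for p in image_paths
        ])
        # Pixel-wise mean across all faces yields the composite.
        mean = stack.mean(axis=0)
        return Image.fromarray(mean.astype(np.uint8))

    # Hypothetical directory of aligned face crops.
    paths = sorted(Path("faces").glob("*.jpg"))
    composite_face(paths).save("composite.png")

In practice the faces must first be detected and aligned (eyes and mouth registered to common coordinates), or the average smears into a blur.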
WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
Two gay rights groups, the Human Rights Campaign and GLAAD, called the research, in a joint press release, "junk science."

Gay and straight people, male and female, were represented evenly in the dataset. Using a resulting model made up of the distinguishing characteristics, the program, when shown one photo of a gay man and one of a straight man, was able to identify their sexual orientation 81 percent of the time.
Human guessers correctly identified straight faces and gay faces just 61 percent of the time for men and 54 percent for women.
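These percentages describe a pairwise test: given one photo from each group, how often does the judge rank the correct one as more likely to be gay? For a model that outputs a score per photo, that pairwise accuracy is exactly the AUC statistic, as in this minimal sketch (the scores here are made up purely for illustration):

    import numpy as np

    def pairwise_accuracy(pos_scores, neg_scores):
        # Fraction of (pos, neg) photo pairs ranked correctly; ties count half.
        wins = 0.0
        for p in pos_scores:
            for n in neg_scores:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(pos_scores) * len(neg_scores))

    # Made-up classifier scores ("probability gay") for two groups of photos.
    rng = np.random.default_rng(0)
    gay = rng.normal(0.6, 0.15, 500)
    straight = rng.normal(0.4, 0.15, 500)
    print(f"pairwise accuracy: {pairwise_accuracy(gay, straight):.2f}")

Note that 81 percent on this pairwise test is a much weaker claim than being "correct 81 percent of the time" on individual photos, especially when the two groups are not equally common in the population.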
The researchers say in the paper that the results "provide strong support" for the prenatal hormone theory of gay and lesbian sexual orientation. They also discuss the study's limitations at some length, including the narrow demographic characteristics of the individuals analyzed: white people who self-reported as gay or straight. Such tools, the authors said, present a special threat to the privacy and safety of gay men and women living under repressive regimes where homosexuality is illegal.