Contents:
- THE INFAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
- STANFORD UNIVERSITY'S AI CAN TELL IF YOU'RE GAY OR STRAIGHT FROM A PHOTO
- WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
THE INFAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE
Composite faces built by averaging the faces classified as most and least likely to be gay by a computer.
The research, by Michal Kosinski and Yilun Wang of Stanford University, claims that a computer algorithm bested humans at distinguishing between a gay person and a straight person when analyzing images from public profiles on a dating website.
STANFORD UNIVERSITY'S AI CAN TELL IF YOU'RE GAY OR STRAIGHT FROM A PHOTO
Two gay rights groups, the Human Rights Campaign and GLAAD, called the research "junk science" in a joint press release. Gay and straight people, male and female, were represented evenly in the dataset. Using a resulting model made up of the distinguishing characteristics, the program, when shown one photo of a gay man and one of a straight man, was able to identify their sexual orientation 81 percent of the time.
WHY STANFORD RESEARCHERS TRIED TO CREATE A ‘GAYDAR’ MACHINE
Human guessers correctly identified straight faces and gay faces just 61 percent of the time for men and 54 percent of the time for women. The researchers say in the paper that the results "provide strong support" for the prenatal hormone theory of gay and lesbian sexual orientation.