Two prominent LGBT groups lashed out at researchers who found that a facial recognition software could deduce whether a person was gay or heterosexual.
IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
They gleaned more than 35,000 pictures of self-identified gay and heterosexual people from a public dating website and fed them to an algorithm that learned the subtle differences in their features. They then showed the software randomly selected face pictures and asked it to guess whether the people in them were gay or straight. The results were unsettling. According to the study, first published last week, the algorithm was able to correctly distinguish between gay and heterosexual men 81 percent of the time, and gay and heterosexual women 71 percent of the time, far outperforming human judges.
Given the prevalence of such technology, the researchers wrote, “our findings expose a threat to the privacy and safety of gay men and women.”
Far from protecting the LGBT community, the groups say, the research could be used as a weapon against gay and lesbian people, as well as heterosexuals who could be inaccurately “outed” as gay. “Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay,” said HRC’s Ashland Johnson, director of public education and research. The groups’ news release was “full of counterfactual statements,” the researchers responded. The study, which was peer reviewed and accepted for publication in the Journal of Personality and Social Psychology, found that an algorithm could differentiate between gay and heterosexual men and women most of the time using a single photograph.