Lesbian, gay, bisexual, and transgender (LGBT) people in Saint Vincent and the Grenadines face bias-motivated violence and discrimination in their daily lives, Human Rights Watch said in a report released today. The legislature should repeal the country's colonial-era laws that criminalize consensual same-sex conduct and pass comprehensive civil legislation prohibiting discrimination based on sexual orientation and gender identity.

The 58-page report, "'They Can Harass Us Because of the Laws': Violence and Discrimination against LGBT People in Saint Vincent and the Grenadines," exposes the physical and verbal assaults, family violence, homelessness, workplace harassment, bullying, and sexual violence that sexual and gender minorities face under the shadow of discriminatory laws. Those responsible for mistreatment include people close to LGBT people – family members, neighbors, coworkers, classmates, and teachers – as well as strangers and police officers.
Contents:
- 'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
- IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
- 'GAY PRIDE WEEK' IN BOSTON: IN-YOUR-FACE DEPRAVITY, VULGARITY, BLASPHEMY -- SPONSORED BY MAJOR POLITICIANS & CORPORATE AMERICA
'I WAS SHOCKED IT WAS SO EASY': MEET THE PROFESSOR WHO SAYS FACIAL RECOGNITION CAN TELL IF YOU'RE GAY
Again, I am not anti-homosexual, and I do not hate anyone based on their gender or identity. Weeks after his trip to Moscow, Kosinski published a controversial paper in which he showed how face-analyzing algorithms could distinguish between photographs of gay and straight people.
In a paper published last year, Kosinski and a Stanford computer scientist, Yilun Wang, reported that a machine-learning system was able to distinguish between photos of gay and straight people with a high degree of accuracy. Presented with two pictures – one of a gay person, the other straight – the algorithm was trained to distinguish the two in 81% of cases involving images of men and 74% of photographs of women. Human judges, by contrast, were able to identify the straight and gay people in 61% and 54% of cases, respectively.
"I was just shocked to discover that it is so easy for an algorithm to distinguish between gay and straight people," Kosinski tells me. Neither did many other people find this easy to accept, and there was an immediate backlash when the research – dubbed "AI gaydar" – was previewed in the Economist magazine.
IS YOUR FACE GAY? CONSERVATIVE? CRIMINAL? AI RESEARCHERS ARE ASKING THE WRONG QUESTIONS
There was also anger that Kosinski had conducted research on a technology that could be used to persecute gay people in countries such as Iran and Saudi Arabia, where homosexuality is punishable by death.
'GAY PRIDE WEEK' IN BOSTON: IN-YOUR-FACE DEPRAVITY, VULGARITY, BLASPHEMY -- SPONSORED BY MAJOR POLITICIANS & CORPORATE AMERICA
His findings are consistent with the prenatal hormone theory of sexual orientation, he says, which argues that the levels of androgens foetuses are exposed to in the womb help determine whether people are straight or gay. "Thus," he wrote in his paper, "gay men are predicted to have smaller jaws and chins, slimmer eyebrows, longer noses and larger foreheads..." He becomes prickly when I press him on Russia, pointing to its dire record on gay rights.
(A couple of days later, Kosinski tells me he has checked his slides; in fact, he says, he didn't tell the Russians about his "AI gaydar".)