A Stanford scientist says he built a gaydar using "the lamest" AI to prove a point


Curious or secretly wondering if you're gay? Find out now – try this very accurate test. Don't keep putting off knowing the truth! End the limbo & live your best life!

Contents:

THE FAMOUS AI GAYDAR STUDY WAS REPEATED – AND, NO, CODE CAN'T TELL IF YOU'RE STRAIGHT OR NOT JUST FROM YOUR FACE

Currently, American gay people believe they have a unique ability to pick each other out in a crowd (often termed "gaydar" ["gay" + "radar"]). This was established through a nationwide Internet-mediated survey (n = 460). To test for the presence of this ability in gay men, the researcher asked self- …

Even though there are examples throughout history, from the ancient Greeks to the 18th century, of people practising physiognomy (basically, judging a person's character or lifestyle from their facial features), a recent study from Stanford University gives a modern-day version to contemplate: computers determining if a person is gay or straight through facial-detection technology. The researchers collected images from a dating website and found that a computer algorithm was correct 81% of the time when it was used to distinguish between straight and gay men, and accurate 74% of the time for women. One pattern the machine detected in the study was that gay women and men typically had "gender-atypical" features, grooming styles, and expressions: gay men appeared more feminine and gay women more masculine.

Another trend the machine identified was that gay women tend to have larger jaws and smaller foreheads than straight women, while gay men had larger foreheads, longer noses and narrower jaws than straight men.

As quoted in an article by The Guardian on the subject, Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, said: "It's certainly unsettling." (Are all men with gender-atypical expressions gay?)

A STANFORD SCIENTIST SAYS HE BUILT A GAYDAR USING "THE LAMEST" AI TO PROVE A POINT

Gay Robot is an AI bot enhanced with ChatGPT

Unsurprisingly, that original work kicked up a massive fuss at the time, with many skeptical that computers, which have zero knowledge or understanding of something as complex as sexuality, could really predict whether someone was gay or straight from their fizzog. The Stanford eggheads behind that first research – Yilun Wang, a graduate student, and Michal Kosinski, an associate professor – even claimed that not only could neural networks suss out a person's sexual orientation, algorithms had an even better gaydar than humans. Notably, straight women were more likely to wear eye shadow than gay women in Wang and Kosinski's dataset.

GAYDAR: VISUAL DETECTION OF SEXUAL ORIENTATION AMONG GAY AND STRAIGHT MEN


Straight men were more likely to wear glasses than gay men. So, does this mean that AI really can tell if someone is gay or straight from their face? It would mean that biological factors such as a person's facial structure would dictate whether someone was gay or not.

“The paper proposes replicating the original 'gay faces' study in a way that addresses concerns about social factors influencing the classifier.

In some countries, homosexuality is illegal, so the technology could endanger people’s lives if used by authorities to "out" and detain suspected gay folk. “Moreover, this entire line of thought is premised on the idea that there is value to be gained in working out why 'gay face' classifiers might work – value in further describing, defining and setting out the methodology for any tinpot dictator or bigot with a computer who might want to oppress queer people.”

Last week, The Economist published a story around Stanford Graduate School of Business researchers Michal Kosinski and Yilun Wang’s claims that they had built artificial intelligence that could tell if we are gay or straight based on a few images of our faces.

GAY ROBOT

Criticism came from those who follow AI research, as well as from LGBTQ groups such as GLAAD (the Gay & Lesbian Alliance Against Defamation). “What their technology can recognize is a pattern that found a small subset of out, white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated,” Jim Halloran, GLAAD’s chief digital officer, wrote in a statement claiming the paper could cause harm by exposing methods to target gay people. On the other hand, LGBTQ Nation, a publication focused on issues in the lesbian, gay, bisexual, transgender and queer community, disagreed with GLAAD, saying the research identified a potential threat.

THE 100% ACCURATE GAY TEST 🏳️‍🌈

Kosinski asserted in an interview with Quartz that, regardless of the methods of his paper, his research was in service of gay and lesbian people that he sees under siege in modern society. He says his work stands on the shoulders of research that has been happening for decades; he’s not reinventing anything, just translating known differences about gay and straight people through new technology. You take some data (in this case, 15,000 pictures of gay and straight people from a popular dating website) and show it to a deep-learning algorithm.

Rather, it should focus on permanent features like the length of the nose or the width of the face, because those stay consistent when recognizing a face. The authors say this is necessary because, since the pictures are from a dating site, gay and straight people might use different facial expressions, camera angles, or poses to seem attractive based on the sex they’re trying to attract.

It’s an idea backed by sociologists, and if your claim is that facial structure is different for gay people and straight people, then you definitely don’t want something as fleeting as a facial expression to confuse that.

[Image: A heat map of where the algorithm looks to detect signs of homosexuality. Credit: Kosinski and Wang]

But some AI researchers doubt that VGG-Face is actually ignoring expression and pose, because the model isn’t being used for its intended purpose, which is simply to identify people.
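The pipeline the article describes (images turned into feature vectors by a pretrained descriptor network such as VGG-Face, then fed to a simple classifier) can be illustrated with a toy sketch. This is not the authors' code and makes several simplifying assumptions: the "embeddings" below are synthetic random vectors standing in for real face descriptors, the labels are artificial, and the classifier is a minimal logistic regression trained by stochastic gradient descent.

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=500):
    """Minimal logistic-regression trainer (per-sample gradient descent).
    X: list of feature vectors (stand-ins for face embeddings), y: 0/1 labels."""
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of label 1
            g = p - yi                        # gradient of the log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Classify one embedding: 1 if the linear score is positive, else 0."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z > 0 else 0

# Synthetic "embeddings": in the real study these would come from a
# pretrained face-descriptor network, not from a random number generator.
random.seed(0)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
y = [1 if x0 + x1 > 0 else 0 for x0, x1 in X]  # toy, linearly separable labels

w, b = train_logistic(X, y)
acc = sum(predict(w, b, x) == yi for x, yi in zip(X, y)) / len(X)
print(f"training accuracy: {acc:.2f}")
```

Note that this toy measures accuracy on its own training data; the accuracies the article reports (81% for men, 74% for women) came from evaluating the real classifier on held-out images, which is a much harder test.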

AI CAN TELL IF YOU'RE GAY: ARTIFICIAL INTELLIGENCE PREDICTS SEXUALITY FROM ONE PHOTO WITH STARTLING ACCURACY

For instance, they found that gay and straight men wore different facial hair, and gay women were more prone to wear baseball caps. But this is just one critique: we don’t know for sure that it is true, and it’s impossibly difficult to find out given the information provided in the paper, or even if you had the algorithm. Still, this potential explanation, based on the test of another AI researcher, throws doubt on the idea that VGG-Face can be used as a perfect oracle to detect something about a person’s facial features while ignoring confounding details.

[Image: The faces the algorithm generated as optimally gay and straight. Credit: Kosinski and Wang]

The second aspect of this research, apart from the algorithm, is the data used to train the facial-recognition system.

The company could have disclosed any technological or cultural bias inherent in the data for researchers to consider. Either way, it’s unclear how images of people taken from dating websites and sorted only into gay and straight categories accurately represent their sexuality.

*BEAR-MAGAZINE.COM* GAY AI DETECTOR

The famous AI gaydar study was repeated – and, no, code can't tell if you're straight or not just from your face • The Register.
