Wednesday, November 22, 2023

Algorithmic Intuition - Gaydar

When my friend A was still going out with women, other friends would sometimes ask if he was gay. An intuitive ability to guess the sexuality of other people is known as gaydar. There have been studies that appear to provide evidence that both humans and computers possess such an ability, although the reliability of this evidence has been challenged. For example, some of these studies have relied on images posted on dating sites, but images that have been crafted and selected for dating purposes may already reflect how a person of a given sexuality wishes to present themselves in that specific context, and may not reflect how the person looks in other contexts.

The latest study claims to assess sexuality from brain waves. This has been criticized as gross and irresponsible (Rae Walker) and as unscientific (Abeba Birhane), continuing a debate that had started with earlier methods of algorithmic gaydar.

More generally, there is considerable disquiet about computers attempting to segment people in this way. For a start, there are many parts of the world where homosexuality not only leads to social disapproval and harassment but also to criminal penalties and other sanctions. However inaccurate the algorithms may be, they might still be used to discriminate against people, or to trigger homophobic actions. Whether someone actually is gay or is a false positive is almost beside the point here; either way, the algorithmic gaydar may result in individual suffering.
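A minimal sketch of the base-rate arithmetic may help here. The numbers below are purely illustrative assumptions, not figures from any of the studies discussed in this post; the point is simply that even a classifier that sounds accurate will wrongly flag large numbers of people when the trait it claims to detect is relatively uncommon, and everyone flagged is exposed to the same risk.

    # Illustrative base-rate arithmetic only -- all numbers are assumptions,
    # not results from any actual gaydar study.

    population = 1_000_000      # assumed number of people screened
    base_rate = 0.05            # assumed proportion who are actually gay
    sensitivity = 0.80          # assumed true positive rate of the classifier
    false_positive_rate = 0.10  # assumed rate at which others are wrongly flagged

    gay = population * base_rate
    not_gay = population - gay

    true_positives = gay * sensitivity
    false_positives = not_gay * false_positive_rate
    flagged = true_positives + false_positives

    print(f"Flagged: {flagged:,.0f}")
    print(f"Correctly identified: {true_positives:,.0f}")
    print(f"Wrongly flagged: {false_positives:,.0f}")
    print(f"Share of flagged who are false positives: {false_positives / flagged:.0%}")

With these assumed numbers, around 135,000 people would be flagged, and roughly 70% of them would be false positives. All of them would face the same consequences.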

Furthermore, these algorithms appear to want to colonize aspects of subjectivity, of the subject's own identity.

  • WyssBernard: I’m not going to accept a machine determination as to what I identify as. ?¿
  • Abeba Birhane: just let people be or let people identify their own sexuality

In an interview with the editor of Wired, Yuval Noah Harari wonders whether an algorithm might have guessed he was gay before he realised it himself. And if an algorithm had been the source of this wisdom about himself, would this not have been incredibly deflating for the ego?

And Lawrence Scott describes how his Facebook timeline started to be invaded by images of attractive men, suggesting that the algorithm had somehow profiled him as being particularly susceptible to these images.


to be continued




Isobel Cockerell, Facial recognition systems decide your gender for you. Activists say it needs to stop (Codastory, 12 April 2021)

Isobel Cockerell, Researchers say their AI can detect sexuality. Critics say it’s dangerous (Codastory, 13 July 2023)

Lawrence Scott, Hell is Ourselves (The New Atlantis #68, Spring 2022, pp. 65-72)

Nicholas Thompson, When Tech Knows You Better Than You Know Yourself (Wired, 4 October 2018)

Wikipedia: Gaydar
