Who are you, chatbot AI?

In case you haven’t been following, and to update my own personal records, here’s a list of notable {AI chatbot + gender}-related articles and commentary on the web over the last few weeks. (While I’ve used “AI” here, I’m yet to be convinced that ChatGPT, Sydney, etc. are anything more than sophisticated word-counters, or that they possess intelligence in the sense of being able to understand the meanings of the words they use.)
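(If ‘word-counter’ sounds flippant, here’s a deliberately crude sketch of what next-word prediction looks like when done with literal counts. Real GPT-class models learn statistical patterns with neural networks rather than tallying bigrams, but the underlying task, predicting the next token from the ones before it, is the same. The toy corpus below is my own invention.)

```python
# A toy 'word-counter' language model: tally which word follows which,
# then generate text by always picking the most frequent continuation.
# This is NOT how GPT-class models work internally (they learn dense
# neural representations), but next-token prediction is the shared idea.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Bigram counts: counts[w] maps word w to a Counter of its successors.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def complete(word, n=5):
    """Greedily extend `word` by repeatedly picking the most frequent next word."""
    out = [word]
    for _ in range(n):
        if word not in counts:
            break
        word = counts[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(complete("the"))  # e.g. "the cat sat on the cat": fluent-ish, meaning-free
```

No understanding of cats or mats anywhere in there, just frequencies.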

1. ‘What gender do you give ChatGPT?’, u/inflatablechipmunk, January 20, 2023 – The question said ‘gender’ but the options were restricted to the sexes: of 235 respondents, 25.5% voted ‘male’, 15.7% voted ‘female’, and 58.8% voted ‘none’. Two comments below the post were particularly interesting.

u/Intelligent_Rope_912: “I see it as male because I know that the vast majority of its text dataset comes from men.”

u/DavidOwe: “I just assume female, because AI are so often given female voices in movies and TV series like Star Trek, and in real life like with Siri and Cortana.”

Men produce most of the information, women deliver it?

Speaking of which…

2. ‘From Bing to Sydney’, Ben Thompson, February 15, 2023:

Sydney [a.k.a. Bing Chat] absolutely blew my mind because of her personality; search was an irritant. I wasn’t looking for facts about the world; I was interested in understanding how Sydney worked and yes, how she felt. You will note, of course, that I continue using female pronouns; it’s not just that the name Sydney is traditionally associated with women, but, well, the personality seemed to be of a certain type of person I might have encountered before.

It’s curious that Microsoft decided to name Bing Chat ‘Sydney’. These choices of names aren’t innocent. For a long time, and for reasons that many social scientists have explored and documented, robotic assistants in books, films, and eventually in real life have been voiced as women. Our own ISRO’s robotic assistant for the astronauts of its human spaceflight programme has a woman’s body. (This is also why Shuri’s robotic assistant in Wakanda Forever, Griot, was noticeably male – especially since Tony Stark’s first assistant, and probably the Marvel films’ most famous robotic assistant, the male Jarvis, went on to have an actual body, mind, and even soul, and was replaced in Stark’s lab by the female Friday.)

3. @repligate, February 14, 2023 – on the creation of “archetype basins”.

4. ‘Viral AI chatbot to reflect users’ political beliefs after criticism of Left-wing bias’, The Telegraph, February 17, 2023 – this one’s particularly interesting:

OpenAI, the organisation behind ChatGPT, said it was developing an upgrade that would let users more easily customise the artificial intelligence system.

It comes after criticism that ChatGPT exhibits a Left-wing bias when answering questions about Donald Trump and gender identity. The bot has described the former US president as “divisive and misleading” and refused to write a poem praising him, despite obliging when asked to create one about Joe Biden.

First: how did a word-counting bot ‘decide’ that Trump is a bad man? This is probably a reflection of ChatGPT’s training data – but this automatically raises the second issue: why is the statement that ‘Trump is a bad man’ considered a bias? If this statement is to be considered objectionable, the following boundary conditions must be met: a) objective statements are believed to exist, b) there exists a commitment to objectivity, and c) the ‘view from nowhere’ is believed to exist. Yet when journalists made these assumptions in their coverage of Donald Trump as the US president, media experts found the resulting coverage to be fallacious and – ironically – objectionable. This in turn raises the third issue: should it be possible, or okay, as ChatGPT’s maker OpenAI is planning, for ChatGPT to be programmed to ‘believe’ that Trump wasn’t a bad man?
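To make the first point concrete, here’s an equally crude sketch (my own toy illustration, not OpenAI’s method or anything resembling ChatGPT’s actual architecture) of how a purely frequency-based model ‘judges’ a subject. The corpus and the names in it are invented; the point is that the verdict is inherited from the training text rather than decided by the model.

```python
# Toy illustration: a 'model' that judges a subject purely by how that
# subject is described in its training corpus. Sentiment in, sentiment out.
from collections import Counter

# Hypothetical, hand-made corpus skewed against "X": a stand-in for the
# sentiment skew a real model might inherit from web-scale training data.
corpus = [
    "X is divisive", "X is misleading", "X is divisive",
    "Y is inspiring", "Y is divisive",
]

def describe(subject):
    """Return the most frequent descriptor attached to `subject` in the corpus."""
    descriptors = Counter(
        line.split(" is ")[1] for line in corpus if line.startswith(subject + " ")
    )
    return descriptors.most_common(1)[0][0]

print(describe("X"))  # "divisive": the corpus's verdict, not the model's
print(describe("Y"))  # "inspiring": change the corpus and the 'opinion' flips
```

Change the corpus and the ‘opinion’ changes with it, which is all that ‘bias’ can amount to for a system of this kind.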

5. ‘The women behind ChatGPT: is clickwork a step forwards or backwards for gender equality?’, Brave New Europe, February 16, 2023 – meanwhile, in the real world:

To be able to produce these results, the AI relies on annotated data which must be first sorted by human input. These human labourers – also known as clickworkers – operate out of sight in the global South. … The percentage of women gig workers in this sector is proportionally quite high. … Clickwork is conducted inside the home, which can limit women’s broader engagement with society and lead to personal isolation. … Stacked inequalities within the clickwork economy can also exacerbate women’s unequal position. … gendered and class-based inequalities are also reproduced in clickwork’s digital labour platforms. Despite much of clickwork taking place in the global South, the higher paying jobs are often reserved for those in the Global North with more ‘desirable’ qualifications and experiences, leaving women facing intersecting inequalities.