Keep the crap going
Have you seen the new ads for Google Gemini?
In one version, just as a young employee is grabbing her fast-food lunch, she notices her snooty boss getting on an elevator. She drops her sandwich, rushes in just as the doors are about to close, and submits her proposal in the form of a thick dossier. The boss asks for a 500-word summary she can consume during her minute-long elevator ride. The employee turns to Google Gemini, which digests the report and spits out the gist, which the employee then regurgitates to the boss's approval. The end.
Isn’t this unsettling? Google isn’t alone either. In May this year, Apple released a tactless ad for its new iPad Pro. From Variety:
The “Crush!” ad shows various creative and cultural objects — including a TV, record player, piano, trumpet, guitar, cameras, a typewriter, books, paint cans and tubes, and an arcade game machine — getting demolished in an industrial press. At the end of the spot, the new iPad Pro pops out, shiny and new, with a voiceover that says, “The most powerful iPad ever is also the thinnest.”
After the backlash, Apple backtracked and apologised, and then produced two ads in November for its Apple Intelligence product showcasing how it could help thoughtless people continue to be thoughtless.
The second video is additionally weird because it seems to suggest that reaching for an AI tool makes more sense than setting a reminder in the calendar app that comes with every smartphone these days.
And Apple is now joined in spirit by Google, because bosses can now expect their subordinates to Geminify their way through work that would otherwise have been tedious, or impossible on punishingly short deadlines, without ever having to ask whether what they consider reasonable to demand of their teammates needs to change. (That includes a dossier of details that will ultimately go unread.)
If AI is going to absorb the shock of someone being crappy to you, will we continue to notice that crappiness and demand that they change, or will we, as Apple and Google now suggest, blame ourselves for not using AI to become crappy ourselves? To quote from a previous post:
When machines make decisions, the opportunity to consider the emotional input goes away. This is a recurring concern I’m hearing about from people working with or responding to AI in some way. … This is Anna Mae Duane, director of the University of Connecticut Humanities Institute, in The Conversation: “I fear how humans will be damaged by the moral vacuum created when their primary social contacts are designed solely to serve the emotional needs of the ‘user’.”
These AI tools have blossomed into all sorts of applications, and millions of people around the world use them for all manner of tasks. But even if the ads don't pigeonhole the tools, they reveal how their makers, Apple and Google, think about what the tools bring to the table and where these companies believe their value lies. To Google's credit at least, its other ads in the same series are much better (see here and here for examples), but it needs to actively cut down on supporting, let alone promoting, the idea that crappy behaviour is okay.