Would you let a robot wash you? This question was posed by the artist group Blast Theory as they investigated the ethics of artificial intelligence in healthcare systems. The Brighton-based collective stars in “AI: Who’s Looking After Me?” at Science Gallery London (until January 20), which brings together doctors, patients, artists, and scientists to explore key questions surrounding AI and healthcare.
“In a care setting, being washed by a robot is a realistic possibility,” Blast Theory’s Matt Adams told Artnet News. “There’s this tension where you might not want a robot to do something so intimate; you want human contact. But the opposite argument is that it’s better to have a robot wash you so you don’t have to deal with another person’s embarrassment; you have some privacy. There are these tensions between what impersonal and private mean.”
Fear and suspicion of AI are growing, raising questions of privacy, artistic authenticity, and human redundancy. The exhibition avoids easy resolutions, exploring the tangled benefits and risks of artificial intelligence in contemporary life. “AI is here,” Siddharth Khajuria, director of Science Gallery London, told Artnet News. “It’s not dystopian or promising for the future. It is present and messy.”
The gallery, which is linked to King’s College London, brings together different fields of knowledge. “We need to bring together different perspectives to tackle increasingly difficult societal issues,” Khajuria said. “The projects that feel messy in the best sense are collaborations between patient groups, medical engineers, and artists. When you encounter them, it’s hard to know whose imagination directed or shaped them.”
Projects include sound artist Wesley Goatley’s immersive installation of ancient voice assistants and Fast Familiar’s exploration of the romantic potential of a machine that learned all about love from the internet. For Vine, Dr. Oya Celiktutan, head of the Social AI and Robotics Lab in King’s Department of Engineering, collaborated with soft robotics studio Air Giants and King’s students Jeffrey Chong, Theodore Lamarche, and Bowen Liu. The result is a “cuddly” robot that interacts emotionally with visitors.
“I’m interested in non-verbal communication between people,” Celiktutan told Artnet News. “I’m interested in how we can mimic that with robots so they can communicate clearly and build trust with humans. This robot really bears no resemblance to a human, but with this basic form it can communicate and connect using non-verbal movements.”
Contrary to the violent robot stereotype often seen in movies, Vine invites trust and touch. “One of the big questions is, ‘What can we do to make a robot more accessible?’” Chong said. “Also, what can a robot do that makes you trust it and want to interact with it? What buttons can it press in the human brain, or what behaviors can it display, to make you think of it as a conversation partner?”
Vine’s cuddly aspect raises the question of aesthetics in robotics. “Soft robotics is interesting because it looks cute,” Lamarche said. “I think most of the time people are scared of AI because of job replacement, but soft robotics is getting a lot of interest in the healthcare industry, where there aren’t enough people. One example is the PARO robot, which is a little seal. It can be used for dementia patients and has a soft, soothing light that allows people to interact with it physically and mentally.”
Artist Mimi Ọnụọha takes a behind-the-scenes look at AI, focusing on the human labor that powers it. While the end user may view AI as independent of humans, many systems require large amounts of manual data labeling. Ọnụọha’s The Future Is Here! investigates the workspaces of this outsourced workforce, which largely operates remotely from bedrooms, front rooms, and cafes in the Global South.
“It’s so tedious and intense,” Ọnụọha told Artnet News. “It’s important work, but they won’t be paid the same as AI specialists or researchers. AI saves time, but for whom?” She points to the similarities between this division of labor and the injustices of longer-standing industries, like fast fashion. “These are old patterns of labor architecture, but they’re now put to work for this new technology.”
Ọnụọha is not calling for a rejection of these technologies, but for a thoughtful approach to their use. “We need to insert some friction into how people approach these tools,” she said. “What is this ecosystem and how do we want it to be? What kinds of power differentials are we considering? If people can consider that while still retaining the potential of AI, I think that’s great. We’re past the point of being able to throw it away. The question becomes how to think strategically.”
While most projects focus on human relationships with AI, Blast Theory invites a third species into the conversation: house cats. For Cat Royale, the group, its collaborators, animal behavior experts, and welfare officers set up a controlled experiment. Over three-hour periods totaling 72 hours, cats were observed interacting with a robotic arm that offered a “game” every six minutes, such as pulling a feather or throwing a ball. The system gradually learned each cat’s responses, calculating happiness levels for each game and tailoring its offerings.
Of all the animals, cats added an interesting dimension because of their aloof nature. “Cats are notoriously imperious, opinionated, and unbiddable,” Adams said. “There was something interesting about a cat, out of all the animals we have a close relationship with. They’re not just going to be tricked into agreeing to something.”
The resulting video raises questions about the role of humans. A domestic care system of this kind could, after all, supplant the pet’s owner. “There were times when the robot was playing a game with a cat and it almost felt like the cat was enjoying it more than if it were playing with a human,” Adams said. “The human is kind of an interrupter, a disruptor. The cat wants to do prey behavior, but if a human is there, they make noise and carry emotional weight. They could be a figure of authority, potentially the animal’s owner. Of course, that’s threatening for us humans, who want to be special.”
The exhibit is a timely reminder of the extent to which AI is intertwined with humans, reflecting the good and evil that already exist within our structures. “Ultimately, robots are what we make of them,” Chong said. “I think the reason scary robots are so popular in the media is that they reflect a fear we have of other humans. It’s a reflection of the inherent danger in humanity.”
Khajuria agrees with this take, emphasizing the importance of challenging the underlying biases that drive AI systems. “There are so many emerging technologies that are deliberately presented to feel magical and elegant,” he said. “But at the end of the day, all AI is the result of humans in a room making decisions, and there’s usually a certain type of person in those meetings and a certain power dynamic. Those conversations embed value systems and biases into the products they make. I hope the show reminds people how human this stuff is.”
“AI: Who’s Looking After Me?” is on view at Science Gallery London, King’s College London, until 20 January 2024.