AI gives voice to dead animals in Cambridge exhibition
Creatures can converse and share their stories by voice or text through visitors’ mobile phones at Museum of Zoology

If the pickled bodies, partial skeletons and stuffed carcasses that fill museums seem a little, well, quiet, fear not. In the latest coup for artificial intelligence, dead animals are to receive a new lease of life to share their stories – and even their experiences of the afterlife.
More than a dozen exhibits, ranging from an American cockroach and the remnants of a dodo to a stuffed red panda and a fin whale skeleton, will be granted the gift of conversation on Tuesday for a month-long project at Cambridge University’s Museum of Zoology.
Equipped with personalities and accents, the dead creatures and models can converse by voice or text through visitors’ mobile phones. The technology allows the animals to describe their time on Earth and the challenges they faced, in the hope of reversing apathy towards the biodiversity crisis.
“Museums are using AI in a lot of different ways, but we think this is the first application where we’re speaking from the object’s point of view,” said Jack Ashby, the museum’s assistant director. “Part of the experiment is to see whether, by giving these animals their own voices, people think differently about them. Can we change the public perception of a cockroach by giving it a voice?”
The project was devised by Nature Perspectives, a company that is building AI models to help strengthen the connection between people and the natural world. For each exhibit, the AI is fed specific details on where the specimen lived, its natural environment, and how it arrived in the collection, alongside all the available information on the species it represents.
The exhibits change their tone and language to suit the age of the person they are talking to, and can converse in more than 20 languages, including Spanish and Japanese. The platypus has an Australian twang, the red panda is subtly Himalayan, and the mallard sounds like a Brit. Through live conversations with the exhibits, Ashby hopes visitors will learn more than can fit on the labels that accompany the specimens.
As part of the project, the conversations that visitors hold with the exhibits will be analysed to get a better picture of the information people want on specimens. While the AI suggests a number of questions, such as asking the fin whale “tell me about life in the open ocean”, visitors can ask whatever they like.
“When you talk to these animals, they really come across as personalities, it’s a very strange experience,” Ashby said. “I started by asking things like ‘where did you live?’ and ‘how did you die?’, but ended up with far more human questions.”
Asked what it used to eat, the museum’s dodo, one of the most complete specimens in the world, described its Mauritian diet of fruits, seeds and the occasional small invertebrate, explaining how its strong, curved beak was perfect for cracking open the tough fruits of the tambalacoque tree.
The AI-enhanced exhibit also shared its views on whether humans should attempt to bring the species back through cloning. “Even with advanced techniques, the dodo’s return would require not just our DNA but the delicate ecosystem of Mauritius that supported our kind,” it said. “It’s a poignant reminder that the true essence of any life goes beyond the genetic code – it’s intricately woven into its natural habitat.”
The fin whale skeleton, which hangs from the museum roof, was granted a similar level of apparent thoughtfulness. Asked about the most famous person it had met, it conceded that while alive it did not have the chance to meet “famous” individuals as humans see them. “However,” the AI-powered skeleton continued, “I like to think that anyone who stands below me and feels awe, reverence and love for the natural world is someone of significance.”