
Tom May, a mycologist at the Royal Botanic Gardens in Melbourne, Australia, examines the death cap mushroom. William West/AFP via Getty Images
Field guides have always varied in quality. But some are now being written with artificial intelligence chatbots while being presented as the work of human experts, increasing the likelihood that readers will receive dangerous advice.
The New York Mycological Society recently posted a warning on X about Amazon and other retailers offering mushroom foraging and identification books written with AI. "Please only buy books of known authors and foragers, it can literally mean life or death," it wrote.
It shared another post describing such guidebooks as "the deadliest AI scam I've ever heard of… the authors are invented, their credentials are invented, and their species ID will kill you."
Recently in Australia, three people died after a family meal, with authorities suspecting death cap mushrooms were to blame. The species is native to parts of the UK and Ireland but has spread invasively to Australia and North America, according to National Geographic. It is difficult to distinguish from edible mushrooms.
"There are hundreds of poisonous fungi in North America and several that are deadly," Sigrid Jacob, president of the New York Mycological Society, told 404 Media. "They can look similar to popular edible species. A poor description in a book can mislead someone to eat a poisonous mushroom."
Amazon did not immediately respond to a request for comment. The company told the Guardian, "We take matters like this seriously and are committed to providing a safe shopping and reading experience. We're looking into this."
The problem of AI-written books is likely to grow in the coming years as more scammers turn to chatbots to churn out content.
Last month, The New York Times reported on travel guidebooks written by chatbots. Of 35 passages submitted to an artificial intelligence detector from a firm called Originality.ai, all were given a score of 100, meaning they were almost certainly written by an AI.
Jonathan Gillham, founder of Originality.ai, warned that such books are “dangerous” if they encourage readers to travel to unsafe places.
And it's not just books, of course. A bizarre MSN travel article recently created with "algorithmic techniques" listed a food bank in Ottawa as a top destination, telling readers, "Consider going into it on an empty stomach."
Leon Frey, a field mycologist and foraging guide in the UK, told the Guardian he found serious errors in mushroom field guides suspected of being written by AI, including references to "smell and taste" as identifying features. "This seems to encourage tasting as a method of identification," he said. "This should absolutely not be the case."
The Guardian also submitted suspicious samples from such books to Originality.ai, which reported that each scored 100% on its AI-detection rating.