Few travel frustrations are greater than realizing that artificial intelligence simply made something up, and that the information you relied on has no connection to reality. What makes it worse is that there is no one to blame but yourself.
For hundreds of tourists in Tasmania, this became the bitter reality of a vacation gone wrong. A tourism website called Tasmania Tours used artificial intelligence to generate its written content and images, and in the process invented a natural attraction that does not exist: the ‘Weldborough Hot Springs.’
The misleading article was published in July 2025 and gradually gained traction online. Alongside an inviting image, it described the fictional springs as a “peaceful forest retreat” offering an “authentic connection to nature.” It promised visitors immersion in “mineral-rich therapeutic waters” and even ranked the site among “the seven best hot spring destinations in Tasmania for 2026.”
What made the deception so convincing was the AI’s ability to blend fact and fiction. The imaginary springs appeared in the same list as real and well-known attractions, such as the Hastings Caves, lending the article an air of credibility. Pastoral AI-generated images of steaming pools surrounded by forest sealed the illusion.
A rural town, yes. Hot springs, no.
The reality in Weldborough, a small rural town in northeastern Tasmania, could not be more different. There are no hot springs, and never have been. The area’s only attractions are forests, a local pub and a river with icy-cold water.
Kristy Probert, who owns the local pub, found herself dealing with waves of confused and disappointed visitors.
“At the height of it, I was getting about five calls a day, and two or three groups would come in daily asking where the hot springs were,” she said. “The Weld River here is freezing. Honestly, you have a better chance of finding a diamond in the river than warm water.”
According to Probert, the AI error caused local chaos. “Two days ago, a group of 24 travelers came over from the mainland and made a special detour just to reach the springs,” she said. “I told them, ‘If you find them, come back and tell me, and I’ll shout you beer all night.’ No one came back.”
Following the surge of complaints, Australian Tours and Cruises, which operates the website, removed the false content. Owner Scott Hensley acknowledged the serious failure and spoke publicly about the personal toll.
“The hate we received online was devastating,” Hensley told international media outlets. “We’re just a married couple trying to move on with our lives.”
Hensley said the company outsourced content creation due to staffing shortages in an effort to “compete with the big players on Google,” and that the material was published without sufficient human oversight while he was abroad.
“Sometimes it works brilliantly, and sometimes it fails spectacularly,” he said. “I’ve seen the software create animals I’ve never seen before, like a three-legged wombat, or creatures that looked like some strange crocodile hybrid.”
The company apologized, stressed that it is a legitimate business, and said it is now conducting a comprehensive manual review of all content on the site.
The Weldborough incident is an extreme example of a broader phenomenon known as AI hallucinations, in which text-generating systems fabricate facts with complete confidence. Tourism expert Prof. Anne Hardy warned that blind reliance on the technology can ruin vacations.
“We know that about 90% of AI-generated itineraries contain at least one error,” Hardy said. “Yet about 37% of travelers rely on AI to plan their trips.”
The Weldborough affair is not an isolated case. In late 2025, two tourists in Peru set out to find the ‘Sacred Canyon of Humantay’ on a chatbot’s recommendation, only to find themselves at 4,000 meters with no cellphone reception, discovering that the place did not exist and that they were in real danger.

Elsewhere, Amazon was flooded with AI-written travel guides sold under fictitious author names, featuring restaurants that had closed years earlier and meaningless advice. Even fast-food chain Taco Bell was affected, after its new AI voice-ordering system accepted a single customer’s order for 18,000 cups of water.

The Tasmania case serves as a painful reminder: before packing a swimsuit based on an online recommendation, it may be wise to make sure a human has verified that the destination actually exists.