For hundreds of tourists in Tasmania, the holiday they had set out on ended in disappointment.
And what is even more frustrating? There is no one to scold and no one to blame, except ourselves (you will understand why later in the article).
The travel website “Tasmania Tours” used artificial intelligence (AI) to generate content and images for its site, and in the process invented a natural attraction that simply does not exist: the “Weldborough Hot Springs.”
The problematic article went online in July 2025 and has since attracted significant web traffic.
Alongside an enticing image, it described the imaginary springs as a “peaceful retreat in the heart of the forest” and a place offering an “authentic connection to nature.” The seductive text promised visitors a soak in waters “rich in therapeutic minerals,” and even ranked the location as one of the “seven best hot spring sites in Tasmania for 2026.”
Tourists fell into the trap because the AI was sophisticated enough to blend truth with falsehood: the invented springs appeared on a list alongside entirely real and well-known attractions, such as the Hastings Caves, which lent the information a high degree of credibility.
The article was accompanied by pastoral AI-generated images of steaming pools in the wilderness, which finally convinced even the hesitant.
The reality on the ground in Weldborough, a rural town in the island’s northeast, was completely different.
There are no hot springs there, and there never have been.
The only attractions in the area are forests, a local pub, and a river whose water is freezing cold.
Christie Probert, the owner of the local pub, was left to deal with a wave of bewildered tourists.
“At the peak of it, I was receiving about five phone calls a day, and every day two or three groups would come into the hotel asking where the springs were,” Probert said.
“The Wild River that runs through here is absolutely freezing. Honestly, you have a better chance of finding a diamond in the river than hot water.”
According to her, the AI mistake created local chaos.
“Two days ago, a group of twenty-four drivers came from the mainland, making a special detour from their route just to reach the springs. I told them: ‘If you find the hot springs, come back and tell me, and I’ll take care of your beer all night on the house.’ No one came back.”
Following the many inquiries, the company “Australian Tours and Cruises,” which operates the site, removed the false content.
The owner, Scott Hensy, admitted the colossal failure and spoke of the heavy personal cost.
“The hatred we received online was soul-destroying,” Hensy said in interviews with international media.
“We are just a married couple trying to move forward with our lives.”
Hensy explained that the company outsourced content writing due to a “lack of manpower” to produce enough material independently, in an effort to “compete with the big players on Google.” He said the materials were published without sufficient human oversight while he was overseas.
“Sometimes it works wonderfully, and sometimes it fails massively,” Hensy added.
“I saw the software generate animals I had never seen before, like a three-legged wombat, or creatures that looked like some strange crocodile hybrid.”
The company apologised, clarified that it is a legitimate business, and said a comprehensive manual review of all website content is now underway.
The Weldborough case is an extreme example of a broader phenomenon known as “AI hallucinations,” in which text generators invent facts with complete confidence.
Professor Anne Hardy, a tourism expert, warns that blind reliance on the technology can ruin holidays.
“We know that today, about ninety percent of travel itineraries generated by artificial intelligence contain at least one mistake,” Hardy says.
“Despite that, about thirty-seven percent of travellers rely on AI to plan their trips.”
The Tasmania case serves as a painful reminder: before packing a swimsuit on the strength of an online recommendation, it is worth making sure a human has verified that the destination actually exists.
This is not the first case in the past year in which artificial intelligence has sent people on absurd or dangerous missions.
At the end of 2025, two tourists in Peru were reported to have gone searching for the “Sacred Canyon of Humantay” following a chatbot recommendation.
They found themselves climbing to an altitude of four thousand metres with no cellular reception, only to discover that the place did not exist and that they were in fact in serious danger.
Another phenomenon troubling travellers in 2025 was Amazon being flooded with fake travel guides written by AI under fictional author names.
The guides, sold in the thousands, contained recommendations for restaurants that had closed years earlier and meaningless tips.
Even the fast-food chain Taco Bell experienced the force of the technology, when its new voice-ordering system broke down after accepting a single customer’s order for no fewer than eighteen thousand cups of water.