Graphic Media Alliance


08/26/2025

AI Hallucinations Could Cause Nightmares for Your Business

Source: Fisher Phillips, July 22, 2025

Consider the following real-life scenarios:

  • An airline’s AI-powered chatbot promises a customer a steep discount for a bereavement flight – a promise that directly contradicts company policy. A court later ruled that the airline had to honor it.

  • A researcher gathering background information on a prominent professor discovers evidence that the professor had been accused of making sexually suggestive comments and attempting to inappropriately touch a student – but it turns out that ChatGPT invented both the story and the citations to it.

  • HR uses AI to draft a job description for an entry-level role but doesn’t read it closely before posting it. After no one applies, the HR reps discover that the posting required candidates to have five to seven years of experience.

  • The Chicago Sun-Times and Philadelphia Inquirer (and others) publish a syndicated summer reading list to guide readers about the next great book they should pick up for vacation – but it turns out that 10 of the 15 recommended books were made up out of thin air by GenAI.

(To prove to you these stories are all very real, you can find details about them here, here, here, and here.)

These are all examples of AI “hallucinations” – situations where generative AI produces incorrect or outright fabricated information that sounds all too real. And each of them caused some sort of damage to the businesses involved. What’s going on when you get one of these results, and what steps can you take so your business isn’t the next to fall victim to this very concerning trend?

What Are AI Hallucinations – and Why Should You Care?


AI hallucinations are confidently incorrect outputs generated by large language models (LLMs). They happen because GenAI is designed to predict the most likely next word, not to verify facts. Remember, “artificial” intelligence simulates knowledge but doesn’t embody it. And when GenAI fills in the blanks with fiction, it does so with the tone and confidence of truth – which is what makes hallucinations so dangerous.

Read full article




New Lawsuit Highlights Concerns About AI Notetakers

A new lawsuit just filed against Otter.ai underscores the legal and compliance risks companies face when using AI notetakers – and serves as a good reminder to deploy best practices to reduce your risks.

Read More

Avoid the Hidden Risks of Misclassifying Employees

When businesses need workers, the question of classification arises: Should these workers be treated as W-2 employees or independent contractors? Misclassifying employees as independent contractors can have significant financial and legal consequences.

Read More

CPI Change Foretells Small USPS Price Increase

Unless Postmaster General David Steiner reverses his predecessor's policy of semiannual postage price increases, the Postal Service's stated schedule calls for another increase to take effect in January 2026.

Read More

AI Hallucinations Could Cause Nightmares for Your Business

Remember, “artificial” intelligence simulates knowledge but doesn’t embody it. And when GenAI fills in the blanks with fiction, it does so with the tone and confidence of truth. Learn the 10 Steps You Can Take to Safeguard Your GenAI Use.

Read More

Hiring Right Now: How to Make the Most of an Employer’s Market

If your company is hiring for full-time, flex-to-hire, or flexible (temporary) workers in this climate, here’s what you need to know.

Read More

Your Digital Story

Everyone has a story to tell. But we also have stories told about us – stories that affect our sales. In this week's Short Attention Span Sales Tip, Bill Farquharson challenges you to take charge of your online persona so your customers see you in the best possible light.

Read More