
KIROI - Artificial Intelligence Return on Invest
The AI strategy for decision-makers and managers

Business excellence for decision-makers & managers by and with Sanjay Sauldie

25 June 2024

AI Hallucination (Glossary)


AI Hallucination belongs to the categories Artificial Intelligence and Digital Transformation.

AI hallucination describes a phenomenon that can occur when using artificial intelligence (AI): the AI "hallucinates" and outputs information that is not true, effectively inventing facts. This can happen especially when an AI works with sparse or contradictory data. Even modern language models such as ChatGPT are affected by this.

An example: you ask an AI for the date of birth of a famous person. Instead of answering correctly, the AI gives an incorrect date that it has "invented", because it has linked data incorrectly or found no precise information. To a layperson the answer seems credible, but it is completely false.

The risk of AI hallucination exists wherever AI is used – whether in business reports, when creating summaries, or in customer communication. Therefore, it is important to double-check AI responses and not trust them blindly.
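The double-checking advice above can be sketched in code. This is a minimal, hypothetical illustration: the `TRUSTED_DATES` lookup table and the sample answers are made up for the example, and a real system would query an authoritative source instead of a hard-coded dictionary.

```python
# Hypothetical sketch: cross-check an AI-generated fact against a trusted source.
# TRUSTED_DATES stands in for an authoritative reference (illustration data only).
TRUSTED_DATES = {
    "Albert Einstein": "1879-03-14",
    "Marie Curie": "1867-11-07",
}

def verify_birth_date(person: str, ai_answer: str) -> bool:
    """Return True only if the AI's answer matches the trusted record."""
    trusted = TRUSTED_DATES.get(person)
    if trusted is None:
        # No trusted record: treat the claim as unverified, not as correct.
        return False
    return ai_answer == trusted

# A hallucinated but plausible-looking date fails the check:
print(verify_birth_date("Albert Einstein", "1875-03-14"))  # False: flag for review
print(verify_birth_date("Albert Einstein", "1879-03-14"))  # True: matches record
```

The key design choice is that an unverifiable claim is flagged rather than accepted, which mirrors the advice not to trust AI responses blindly.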

Understanding the concept of AI hallucination helps decision-makers approach AI critically and responsibly, and to verify its output before acting on it.
