TechCrunch

Asking chatbots for short answers can increase hallucinations, study finds

What Happened

Turns out, telling an AI chatbot to be concise could make it hallucinate more than it otherwise would. That's according to a new study from Giskard, a Paris-based AI testing company developing a holistic benchmark for AI models, whose researchers detailed the findings in a blog post.

Our Take

We are tracking this story. Our take is coming soon.
