Understand AI Hallucinations. How to SUCCEED with Microsoft 365 Copilot even if it's WRONG!
We've all heard horror stories about generative AI tools like ChatGPT, Gemini, or Copilot getting things spectacularly wrong. But do you understand why these tools sometimes make such glaring mistakes when, most of the time, they seem to know everything?
In this video we'll explore a little about how the large language models powering AI tools like GPT-4 work, how they're trained, and why they get things wrong. Why is this a particular issue for businesses? What is grounding (also known as retrieval augmented generation)? And if you're adopting AI, what should you do about it?
Whether your tool of choice is Copilot, Copilot for Microsoft 365 (Microsoft 365 Copilot), ChatGPT, Gemini, Claude, or something else, this information is equally applicable and will help you learn a little more about how AI works for you.