Use Ollama to test Phi 3 on your local PC

Microsoft's new small language model (SLM), Phi 3, is compact enough to run locally on your own device. In this video, we'll learn how: we'll install Ollama, download the Phi 3 model, put it through its paces, and talk about why you might want a small language model running on your own hardware.
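
As a quick taste of what the video covers, here is a minimal sketch of querying Phi 3 locally with the Ollama Python client. It assumes Ollama is already installed and running, the model has been pulled with "ollama pull phi3", and the client library is available via "pip install ollama"; the prompt text is just an example.

# Minimal sketch: chat with a locally running Phi 3 model through Ollama.
# Assumes Ollama is running on this machine and "ollama pull phi3" has completed.
import ollama

response = ollama.chat(
    model="phi3",
    messages=[
        {"role": "user", "content": "In one sentence, what is a small language model?"},
    ],
)

# Print the model's reply text.
print(response["message"]["content"])

Because everything runs against the local Ollama service, no API key or internet connection is needed once the model has been downloaded.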

Nick DeCourcy

Nick DeCourcy is the owner and principal consultant at the Bright Ideas Agency. He has worked extensively in the education and non-profit sectors in areas including operations, facilities, and technology. He is passionate about getting technology implementation right the first time by fully understanding how it impacts the employee and customer experience.
