SaturdayMP Show #92: Local LLM w/ Ollama

In this episode, I show how to run a local LLM, in this case Gemma 4, using Ollama. Thanks to the Weekly Dev Chat folks for the help and inspiration.
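
If you want to poke at the model outside the video, here's a minimal sketch of querying a local Ollama server over its HTTP API from Python. It assumes Ollama is installed and serving on its default port (11434) and that you've already pulled a model; the "gemma3" tag below is just a placeholder, so substitute whatever "ollama list" reports on your machine.

import json
import urllib.request

def ask_local_llm(prompt, model="gemma3"):
    # Build the request body for Ollama's /api/generate endpoint.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The non-streaming reply is a single JSON object with a "response" field.
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("In one sentence, what is a local LLM?"))

Ollama also exposes an OpenAI-compatible endpoint under /v1, which is typically how editor tools like OpenCode are pointed at a local model instead of a hosted one.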

Ollama:
https://ollama.com/

OpenCode:
https://opencode.ai/

Gemma 4:
https://deepmind.google/models/gemma/gemma-4/

Weekly Dev Chat:
https://weeklydevchat.com/

Have a question or problem for a future video? Constructive feedback? Then comment, DM me, or send an email to ask@saturdaymp.com.

If you found this video useful, you can help others find it by liking, subscribing, sharing, and/or sponsoring:

https://github.com/sponsors/saturdaymp

Thanks for watching!
