run-ollama-colab
A lightweight setup to run Ollama (for local LLMs like LLaMA 3, Mistral, Gemma, etc.) directly in Google Colab with free GPU support. Perfect for experimenting with open-source language models without local hardware constraints.
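
A minimal sketch of what this setup might look like inside a single Colab cell, assuming Ollama's official install script and its default REST API port (11434); the model tag `llama3` is just an example:

```python
# Sketch of a Colab cell: install Ollama, start the server, pull a model,
# and query it over the local REST API. Assumes the official install script
# and the default port 11434.
import subprocess, time, requests

# Install Ollama using its official install script.
subprocess.run("curl -fsSL https://ollama.com/install.sh | sh",
               shell=True, check=True)

# Start the Ollama server in the background and give it a moment to come up.
server = subprocess.Popen(["ollama", "serve"])
time.sleep(5)

# Pull a model; llama3 is an example, any Ollama model tag works here.
subprocess.run(["ollama", "pull", "llama3"], check=True)

# Send a prompt via Ollama's generate endpoint and print the reply.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": False},
)
print(resp.json()["response"])
```

Running the server with `subprocess.Popen` keeps it alive in the background of the notebook session, so later cells can keep calling the API without re-installing anything.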