
run-ollama-colab

A lightweight setup to run Ollama (for local LLMs such as LLaMA 3, Mistral, and Gemma) directly in Google Colab with free GPU/TPU support. Ideal for experimenting with open-source language models without local hardware constraints.
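
The repository's exact notebook isn't reproduced here, but a Colab setup like this typically boils down to installing Ollama, starting its server, pulling a model, and querying the local HTTP API. Below is a minimal sketch of those steps in Python, assuming Ollama's documented Linux install script and its default API on port 11434; the model name "llama3" and the prompt are illustrative placeholders, not values taken from this project.

```python
# Minimal sketch: run Ollama inside a Colab runtime.
# Assumes Ollama's official install script and default API port (11434).
import subprocess
import time

import requests

# Install Ollama using its official Linux install script.
subprocess.run("curl -fsSL https://ollama.com/install.sh | sh",
               shell=True, check=True)

# Colab cells block, so start the Ollama server as a background
# process and give it a moment to come up.
server = subprocess.Popen(["ollama", "serve"])
time.sleep(5)

# Pull a model (downloads the weights on first run).
subprocess.run(["ollama", "pull", "llama3"], check=True)

# Query the local HTTP API with a single non-streaming request.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hello.", "stream": False},
    timeout=300,
)
print(resp.json()["response"])
```

On Colab's free tier, Ollama will pick up the attached GPU automatically if one is available; picking a model that fits the runtime's VRAM is the main constraint.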

Created: 2024-01-15
Updated: 2025-05-03
Stars: 0
