Developed by over 100 contributors from around the world, Local III introduces a user-friendly local model browser that integrates deeply with inference engines like Ollama. It provides tailored configurations for open-source models such as Llama 3, Moondream, and Codestral, making offline code interpretation more reliable. Local III also introduces a free, hosted, optional model served through the interpreter: model i. Conversations with model i will be used to train our own open-source computer-control language model.