
gpt-all-local

Public

A "chat with your data" example: using a large language models (LLM) to interact with our own (local) data. Everything is local: the embedding model, the LLM, the vector database. This is an example of retrieval-augmented generation (RAG): we find relevant sections from our documents and pass it to the LLM as part of the prompt (see pics).

Created: 2023-05-16T19:22:16
Updated: 2025-03-19T14:23:42
Stars: 27
Stars increase: 0