llm-vscode-inference-server
An endpoint server for efficiently serving quantized open-source LLMs for code.
Created: 2023-09-25
Updated: 2024-11-12
Stars: 55
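A server like this is typically queried by an editor plugin with a JSON POST request asking the model to continue a code snippet. The sketch below builds such a request body; the field names (`inputs`, `parameters`, `max_new_tokens`, `temperature`) are illustrative assumptions, not this server's documented API.

```python
import json

def build_completion_request(prompt, max_new_tokens=64, temperature=0.2):
    """Assemble a hypothetical JSON body for a code-completion endpoint.

    The schema here is an assumption for illustration only; consult the
    repository's documentation for the actual route and payload format.
    """
    return json.dumps({
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
        },
    })

# Example: ask the model to continue a partially written function.
body = build_completion_request("def fibonacci(n):")
print(body)
```

An editor extension would send this body to the server's completion route and splice the returned text into the buffer at the cursor.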