finGPT-llama2-runpod
Deploy FinGPT-MT-Llama-3-8B-LoRA on RunPod Serverless with llama.cpp + CUDA. Auto-scaling, OpenAI-compatible API, Q4_K_M quantization. Pay-per-use serverless inference.
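
Once the endpoint is live, it can be queried like any other OpenAI-compatible server. The sketch below assumes RunPod's serverless OpenAI-compatible route and the official `openai` Python client; the endpoint ID, API key, and model name are placeholders, not values taken from this repo.

```python
# Minimal sketch: calling the deployed FinGPT endpoint via its OpenAI-compatible API.
# ENDPOINT_ID, the API key, and the model name are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1",  # assumed RunPod serverless OpenAI-compatible route
    api_key="<RUNPOD_API_KEY>",
)

response = client.chat.completions.create(
    model="fingpt-mt-llama-3-8b-lora-q4_k_m",  # hypothetical model name served by llama.cpp
    messages=[
        {"role": "user", "content": "Classify the sentiment: 'Shares fell 4% after the earnings miss.'"}
    ],
    max_tokens=128,
    temperature=0.2,
)

print(response.choices[0].message.content)
```

Because the endpoint is serverless and pay-per-use, the first request after a period of inactivity may incur a cold-start delay while a worker spins up.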