The relationship between OpenAI and Microsoft has developed a rift. To reduce its reliance on Microsoft's computing power, OpenAI is accelerating its alliance with Amazon, which has invested 50 billion US dollars in OpenAI and committed 2 gigawatts of computing capacity, underscoring Amazon's strong positioning in the AI field.
Xpeng Motors will launch the GX technology on April 15. The new vehicle will redefine the standard for a technology flagship across four dimensions: safety, chassis, intelligent driving, and space. A core highlight is the integration of four self-developed Turing AI chips, whose computing power supports full-scenario high-level intelligent driving.
ZTE has launched the Co-Claw AI all-in-one machine to address the security and compliance issues of open-source agents in enterprise applications, offering on-premises deployment and enhanced privacy protection for a secure AI environment.
Xiaomi announced that the Redmi Book Pro 2026 AI flagship thin-and-light laptop will be released this month, focusing on high performance and on-device AI. Key highlights include a third-generation Intel Core Ultra X7 358H processor and a large-capacity battery, aiming to boost productivity through strong computing power and AI features.
Provides stable and efficient AI computing power and GPU rental services.
Intelligent computing power available on demand, significantly improving efficiency and competitiveness.
SandboxAQ uses AI and advanced computing power to change the world.
Upsonic AI provides powerful computing and management infrastructure that allows developers to seamlessly create AI agents.
LLM API pricing comparison:

| Provider | Input ($/M tokens) | Output ($/M tokens) | Context length |
|----------|--------------------|---------------------|----------------|
| Google   | $0.49              | $2.1                | 1k             |
| OpenAI   | $7.7               | $30.8               | 200k           |
| Alibaba  | –                  | –                   | –              |
| Tencent  | $1                 | $4                  | 32k            |
| Tencent  | $1.75              | $14                 | 400k           |
| Iflytek  | $2                 | $0.8                | –              |
| Baidu    | –                  | –                   | 64k            |
| Minimax  | $1.6               | $16                 | –              |
| Minimax  | $21                | $84                 | 128k           |
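Per-million-token prices like those listed above translate to per-request costs by simple proration. A minimal sketch (the `call_cost` helper is hypothetical, not part of any provider SDK):

```python
# Hypothetical helper: estimate the USD cost of one API call from
# per-million-token prices.
def call_cost(input_tokens: int, output_tokens: int,
              input_price_per_m: float, output_price_per_m: float) -> float:
    """Return the cost of a single request in USD."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Example: a 2,000-token prompt with a 500-token reply at the listed
# Google rates ($0.49 in / $2.1 out per million tokens).
cost = call_cost(2_000, 500, 0.49, 2.1)
print(f"${cost:.6f}")  # $0.002030
```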
cpatonn
Qwen3-VL-32B-Instruct-AWQ-INT4 is a 4-bit quantized version of the Qwen3-VL-32B-Instruct base model. It uses the AWQ quantization method, significantly reducing storage and compute requirements while largely preserving performance. Qwen3-VL is the most powerful vision-language model series from Qwen, with comprehensive upgrades in text understanding, visual perception, and context length.
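The storage savings come from representing each weight as a 4-bit integer plus a shared scale. The toy sketch below shows plain symmetric INT4 round-tripping, not the actual AWQ algorithm (which additionally rescales salient channels using activation statistics):

```python
# Toy illustration of 4-bit weight quantization (NOT the real AWQ method):
# floats are mapped to integers in [-8, 7] with one shared scale, cutting
# storage roughly 4x versus FP16 while keeping values approximately intact.
def quantize_int4(weights):
    """Map floats to signed 4-bit integers with a single shared scale."""
    scale = max(abs(w) for w in weights) / 7  # 7 = max positive int4 value
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Reconstruct approximate floats from the 4-bit codes."""
    return [v * scale for v in q]

w = [0.31, -0.12, 0.05, -0.27]
q, s = quantize_int4(w)
w_hat = dequantize(q, s)
err = max(abs(a - b) for a, b in zip(w, w_hat))
print(q)  # every code fits in 4 bits; reconstruction error stays below scale/2
```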
QCRI
Fanar-1-9B-Instruct is a powerful Arabic-English large language model developed by the Qatar Computing Research Institute (QCRI). It supports Modern Standard Arabic and multiple Arabic dialects, and is aligned with Islamic values and Arab culture.
modularStarEncoder
ModularStarEncoder-300M is an encoder model fine-tuned on the SynthCoNL dataset from the ModularStarEncoder-1B pre-trained model. It is designed specifically for code-to-code and text-to-code retrieval tasks. The model uses hierarchical self-distillation, allowing users to choose among different layer-depth variants according to their available computing power.
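The layer-selection idea can be sketched as a multi-exit encoder: every layer refines the embedding, and a caller on a tight compute budget stops early. This is a toy illustration of the concept, not the actual ModularStarEncoder API:

```python
# Toy multi-exit encoder (illustrative only): intermediate layers produce
# usable embeddings, so callers trade depth for compute.
class MultiExitEncoder:
    def __init__(self, num_layers: int):
        self.num_layers = num_layers

    def encode(self, tokens: list[str], exit_layer: int) -> list[float]:
        """Run only the first `exit_layer` layers and return that embedding."""
        assert 1 <= exit_layer <= self.num_layers
        # Stand-in for real token embeddings: token lengths as features.
        emb = [float(len(t)) for t in tokens]
        for _ in range(exit_layer):
            emb = [v * 0.5 + 1.0 for v in emb]  # each layer transforms the state
        return emb

enc = MultiExitEncoder(num_layers=36)
fast = enc.encode(["def", "add"], exit_layer=4)    # cheaper, coarser embedding
full = enc.encode(["def", "add"], exit_layer=36)   # full-depth embedding
```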
chavinlo
A replication of the Alpaca model from the Tatsu lab at Stanford University: a large language model instruction-fine-tuned from LLaMA-7B. The model was trained for 6 hours on 4 A100 GPUs, with computing power donated by redmond.ai. It does not use LoRA, instead adopting native full-parameter fine-tuning.
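Instruction fine-tuning in the Alpaca recipe formats each training example with a fixed prompt template. The sketch below reproduces that template as published in the Stanford Alpaca repository (the `alpaca_prompt` helper name is my own):

```python
# The Stanford Alpaca instruction template, used both for fine-tuning
# and for prompting the resulting model at inference time.
def alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input context) Alpaca-style."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(alpaca_prompt("Summarize the text.", "Large models are costly to train."))
```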
A mathematical computing service based on the MCP protocol and the SymPy library, providing symbolic computation capabilities including basic arithmetic, algebraic operations, calculus, equation solving, and matrix operations.
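The operations listed can be shown directly with SymPy itself (the service's MCP tool names and wrapper are not specified here, so this uses the underlying library):

```python
# Symbolic computation of the kind such a service exposes, via SymPy.
import sympy as sp

x = sp.symbols("x")

# Calculus: differentiation and integration
assert sp.diff(x**3, x) == 3 * x**2
assert sp.integrate(2 * x, x) == x**2

# Equation solving: roots of x^2 - 5x + 6 = 0
roots = sp.solve(sp.Eq(x**2 - 5 * x + 6, 0), x)
print(roots)  # [2, 3]

# Matrix operations: determinant
m = sp.Matrix([[1, 2], [3, 4]])
print(m.det())  # -2
```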