On October 16, Alibaba Cloud announced through its official account that the Qwen Chat Memory feature of Tongyi Qianwen has officially launched. The new feature marks another important step for Tongyi Qianwen in the field of AI assistants: users now get a "long-term memory" capability, with an assistant that not only understands a user's context but also retains important information and actively recalls earlier conversations.

With Qwen Chat Memory, the assistant can store relevant information about a user and draw on previous interactions to give more personalized responses during a conversation. Whether the topic is a hobby or a work matter, Qwen can promptly retrieve what was said before and keep the conversation coherent.
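To make the idea concrete, here is a minimal sketch of how such a memory layer might work, assuming a simple keyword-matching store. The class and method names (ChatMemory, remember, recall) are illustrative only and are not the actual Qwen Chat Memory API; a production system would typically match memories with embeddings rather than keywords.

```python
# Illustrative sketch of a chat-memory layer (not the real Qwen API).
from dataclasses import dataclass, field


@dataclass
class ChatMemory:
    """Stores short user facts and recalls those relevant to a new message."""
    facts: list[str] = field(default_factory=list)

    def remember(self, fact: str) -> None:
        # Persist a distilled fact from an earlier conversation.
        if fact not in self.facts:
            self.facts.append(fact)

    def recall(self, message: str) -> list[str]:
        # Return stored facts that share words with the new message.
        words = set(message.lower().split())
        return [f for f in self.facts if words & set(f.lower().split())]


memory = ChatMemory()
memory.remember("The user enjoys trail running on weekends.")
memory.remember("The user works as a data engineer.")

# Relevant facts are retrieved and can be fed back to the model.
print(memory.recall("Any running gear you would recommend?"))
```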


Notably, Alibaba Cloud had previously released Qwen3-Max, the largest and most capable language model from the Tongyi team to date. Qwen3-Max has over 1 trillion parameters and was pre-trained on 36 trillion tokens, using a global-batch load-balancing loss during training. This model lays a solid foundation for Qwen Chat Memory's "memory" capabilities.
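For context, a load-balancing loss is the auxiliary term commonly used when training mixture-of-experts models to keep all experts roughly equally utilized. The block below shows the general form of such a loss; the "global-batch" qualifier means the balancing statistics are computed over the whole global batch rather than each micro-batch. This is the general technique only, not a description of Qwen3-Max's exact formulation, which the announcement does not detail.

```latex
% General form of an auxiliary load-balancing loss for an N-expert
% mixture-of-experts layer (not Qwen3-Max's exact formulation):
%   f_i : fraction of tokens routed to expert i
%   P_i : mean router probability assigned to expert i
\mathcal{L}_{\text{aux}} = \alpha \cdot N \cdot \sum_{i=1}^{N} f_i \, P_i
% In the global-batch variant, f_i and P_i are computed over the entire
% global training batch rather than each per-device micro-batch, giving
% a less noisy estimate of expert utilization.
```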

Before this, other AI products such as ChatGPT, Gemini, and Grok had already introduced similar memory features. At its core, the feature lets the model remember relevant details from a user's past conversations and bring them into new responses, so that answers better match the user's preferences. This kind of improvement marks real progress in how well AI assistants understand their users.
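The response step usually amounts to injecting the recalled memories into the prompt before the model generates a reply. The sketch below shows one plausible way to assemble that prompt; the message structure mirrors common chat-completion APIs in general and is not the specific format of any of the products named above.

```python
# Hedged sketch of prompt assembly: recalled memories are placed in the
# system prompt so the model can personalize its answer.
def build_messages(memories: list[str], user_message: str) -> list[dict]:
    memory_block = "\n".join(f"- {m}" for m in memories)
    system_prompt = (
        "You are a helpful assistant. Known facts about the user:\n"
        f"{memory_block}\n"
        "Use these facts when they make the answer more personal."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]


messages = build_messages(
    ["The user enjoys trail running on weekends."],
    "Any running gear you would recommend?",
)
for m in messages:
    print(m["role"], ":", m["content"])
```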


This Tongyi Qianwen update not only improves the quality of interaction between users and AI, but also raises expectations for the future of intelligent assistants. As the technology continues to advance, AI will increasingly feel like a personalized assistant that understands and remembers you.