A heated debate over the origins of an AI architecture has recently been trending on social media. Arthur Mensch, CEO of Mistral, the company often called the "European OpenAI," claimed in an interview that DeepSeek-V3, the powerful open-source model developed in China, is actually built on an architecture that Mistral proposed. The claim immediately drew widespread attention and skepticism from developers and netizens around the world.

Core controversy: "homage" or independent innovation?

In the interview, Mensch noted that Mistral released its first sparse mixture-of-experts (MoE) model at the beginning of 2024, and he believes later DeepSeek models were built on that foundation, saying they "adopted the same architecture."

However, attentive netizens raised doubts after reviewing the original papers on arXiv:

Timeline conflict: The Mixtral paper and DeepSeek's MoE paper were published only three days apart, making it difficult to determine who truly influenced whom.

Different architectural approaches: Although both are sparse mixture-of-experts (SMoE) models, Mixtral focuses more on engineering optimization, while DeepSeek reworked the algorithm at a deeper level.

Different expert designs: DeepSeek introduced "fine-grained expert segmentation" and "shared experts" mechanisms, decoupling general knowledge from specialized knowledge, which fundamentally differs from Mixtral's flat expert design (a simplified sketch follows below).
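To make the contrast concrete, here is a hypothetical, heavily simplified sketch, not the actual DeepSeek or Mixtral code: all dimensions, names, and expert counts are illustrative. It compares a flat top-k routed MoE layer with a layer that combines always-active shared experts and a larger pool of smaller routed experts.

```python
# Illustrative sketch only; dimensions and expert counts are made up.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Expert(nn.Module):
    """A small feed-forward expert."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                                 nn.Linear(d_hidden, d_model))
    def forward(self, x):
        return self.net(x)

class FlatMoE(nn.Module):
    """Flat layout: one pool of experts, each token routed to the top-k of them."""
    def __init__(self, d_model=512, d_hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(Expert(d_model, d_hidden) for _ in range(n_experts))
        self.gate = nn.Linear(d_model, n_experts, bias=False)
        self.top_k = top_k
    def forward(self, x):                                # x: (tokens, d_model)
        scores = self.gate(x)                            # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # pick k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                    # tokens sent to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

class SharedPlusFineGrainedMoE(nn.Module):
    """Schematic DeepSeek-style layout: a few shared experts applied to every
    token (general knowledge) plus many smaller routed experts (specialized
    knowledge), so the two kinds of knowledge live in different parameters."""
    def __init__(self, d_model=512, d_hidden=512, n_shared=2, n_routed=32, top_k=4):
        super().__init__()
        self.shared = nn.ModuleList(Expert(d_model, d_hidden) for _ in range(n_shared))
        self.routed = FlatMoE(d_model, d_hidden, n_routed, top_k)
    def forward(self, x):
        out = sum(e(x) for e in self.shared)  # shared experts: always active
        return out + self.routed(x)           # fine-grained experts: sparsely routed

x = torch.randn(16, 512)
print(FlatMoE()(x).shape, SharedPlusFineGrainedMoE()(x).shape)
```

In this sketch, both layers activate only a subset of parameters per token; the second simply splits the expert pool into an always-on shared part and a finer-grained routed part, which is the design choice the list item above refers to.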

Technical reversal: Who is "rewriting history"?

Interestingly, the controversy soon turned around. Some technical experts pointed out that rather than DeepSeek drawing inspiration from Mistral, the influence may run in the opposite direction.

Architectural convergence: Netizens found that Mistral3Large, released at the end of 2025, has a core architecture highly similar to the innovations used in DeepSeek-V3, such as multi-head latent attention (MLA).

Shift in influence: Netizens joked that Mistral appears to be trying to "rewrite history" to reclaim its lost technological leadership, since DeepSeek has clearly gained greater industry influence through its MoE innovations.

"Collective progress" or "verbal battle" in the AI world?

Disputes aside, as Mensch said earlier in the interview, the essence of the open-source spirit lies in "continually building on each other's work."

Intensifying competition: DeepSeek is reportedly targeting the 2026 Spring Festival holiday for the release of a stronger new model.

Open-source rivalry: Mistral is also continuously updating its Devstral family, trying to reclaim the top position among open-source coding models.