Key checks
- Release Timing: Researcher Yifan Zhang predicted on April 19, 2026, that the model would be released 'as early as this week.' AI Times and Remio.ai both reported this timeline, noting that DeepSeek's web version had recently added a 'Professional Mode', seen as a precursor to the launch.
- Model Architecture and Parameters: The model is reported to be a 1.6 trillion parameter Mixture-of-Experts (MoE) model. It reportedly uses a new 'mHC' (modified Hyper-Connection) architecture and 'Engram' memory modules to cut inference costs to roughly 1/70th those of GPT-4.
- Benchmark Accuracy: The user's claim of 99.4% MMLU appears to be a misinterpretation. According to AI Times, the 99.4% score refers to AIME 2026 (math), while the MMLU score is reported as 92.8%. The SWE-Bench score of 83.7% matches the reports.
- Hardware Compatibility: Reports state the model was trained on Huawei Ascend 950PR chips. Due to its size, running the model on consumer hardware such as a 512GB Mac would likely require aggressive quantization, and possibly loading only a subset of experts.
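The quantization claim above can be checked with back-of-the-envelope arithmetic. The sketch below takes only the reported 1.6 trillion total parameter count as given; the precision levels are generic illustrations, not details of the actual release, and it counts weight storage only (no KV cache or activations).

```python
# Rough weight-memory arithmetic for a reported 1.6-trillion-parameter MoE model.
# Assumption: all experts are resident in memory; an MoE's *active* parameter
# count per token is smaller, but local inference still needs the full weights.

PARAMS = 1.6e12  # reported total parameter count


def weights_gb(bits_per_param: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return PARAMS * bits_per_param / 8 / 1e9


for label, bits in [("FP16", 16), ("FP8", 8), ("INT4", 4), ("INT2", 2)]:
    print(f"{label}: ~{weights_gb(bits):,.0f} GB")
# FP16: ~3,200 GB
# FP8:  ~1,600 GB
# INT4:   ~800 GB
# INT2:   ~400 GB
```

Even 4-bit weights (~800 GB) exceed a 512GB Mac's unified memory, which is consistent with the memo's conclusion: something near 2-bit quantization, offloading, or partial expert loading would be needed.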