What happened: MIT Technology Review reports that DeepSeek released a preview of V4, an open-source flagship model with two variants (V4‑Pro and V4‑Flash), both offering reasoning modes and a 1‑million‑token context window.
Why it matters: DeepSeek claims frontier-level capability at much lower API prices and says it has reduced long-context costs via a more selective attention approach, meaning "read the whole codebase" workloads become less financially suicidal.
Wider context: The article says V4 compresses older information while keeping nearby text in full, and reports DeepSeek's claims that V4‑Pro uses 27% of V3.2's compute and 10% of its memory at 1M tokens (with even larger reductions for V4‑Flash).
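The "compress older information, keep nearby text in full" idea can be pictured with a toy sketch. This is a hypothetical illustration of the general technique (mean-pooling distant tokens into summary vectors while recent tokens stay full resolution), not DeepSeek's actual mechanism, whose details the article does not specify:

```python
import numpy as np

def compress_context(token_vecs, recent_window=4, block_size=4):
    """Toy sketch: keep the most recent `recent_window` tokens at full
    resolution, and mean-pool older tokens block-by-block into summary
    vectors. Hypothetical illustration only, not DeepSeek's method."""
    older, recent = token_vecs[:-recent_window], token_vecs[-recent_window:]
    summaries = [older[i:i + block_size].mean(axis=0)
                 for i in range(0, len(older), block_size)]
    return np.array(summaries), recent

# 20 tokens of dimension 8 -> 4 pooled summaries + 4 full recent tokens,
# so attention now runs over 8 vectors instead of 20.
vecs = np.random.rand(20, 8)
summaries, recent = compress_context(vecs)
print(summaries.shape, recent.shape)  # (4, 8) (4, 8)
```

The payoff is that attention cost scales with the compressed length rather than the raw context length, which is the rough shape of the compute and memory savings the article reports.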
Background: Technology Review also frames V4 as a China-stack signal: the piece says it is the first DeepSeek model optimized for domestic chips like Huawei Ascend, as Beijing pushes AI projects to align with local hardware and software ecosystems.
Three reasons why DeepSeek's new model matters — MIT Technology Review
Singularity Soup Take: Open-weight models are now doing the one thing incumbents hate most: making "frontier" feel like a commodity. The clever bit isn't just the benchmarks; it's turning long-context from a luxury feature into a price point, and then wiring it into a domestic chip stack.
Key Takeaways:
- Two Versions: The article says V4‑Pro targets coding and complex agent tasks, while V4‑Flash is designed to be faster and cheaper; both include reasoning modes, and DeepSeek offers them via its site/app with API access for developers.
- Pricing Pressure: Technology Review reports DeepSeek's stated pricing: V4‑Pro at $1.74 per million input tokens and $3.48 per million output tokens, with V4‑Flash far cheaper, positioning cost as a competitive weapon as much as raw capability.
- China Chip Pivot: The piece describes V4 as DeepSeek's first model optimized for domestic Chinese chips like Huawei Ascend, and notes the practical challenge is not just hardware but the surrounding software ecosystem developers must adapt.
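To make the stated V4‑Pro prices concrete, here is a quick back-of-envelope cost check. The rates come from the article; the 5K-token answer size is an arbitrary assumption for illustration:

```python
# DeepSeek's stated V4-Pro prices (per the article):
# $1.74 per 1M input tokens, $3.48 per 1M output tokens.
INPUT_RATE = 1.74 / 1_000_000   # dollars per input token
OUTPUT_RATE = 3.48 / 1_000_000  # dollars per output token

def call_cost(input_tokens, output_tokens):
    """Dollar cost of one API call at the stated rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A full 1M-token "read the whole codebase" prompt with a
# 5K-token answer (answer size is an assumed figure):
cost = call_cost(1_000_000, 5_000)
print(f"${cost:.3f}")  # $1.757
```

Under two dollars for a maxed-out context call is the point: at those rates, long-context stops being a budget-line item and starts being a default.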