Zyphra Releases ZAYA1-8B: Open Reasoning MoE Model Trained on AMD Hardware

Zyphra

Models / LLM · official + media · 3 sources · ~1 min read

Zyphra released ZAYA1-8B, an Apache 2.0-licensed mixture-of-experts reasoning model with under 1 billion active parameters that matches or exceeds larger open-weight models on AIME, LiveCodeBench, and GPQA-Diamond. The model was pretrained on 1,024 AMD Instinct MI300X GPUs and introduces Markovian RSA, a new test-time compute method that enables unbounded reasoning at constant memory cost. Weights are available on Hugging Face, and a serverless endpoint is live on Zyphra Cloud.

Why it matters

Demonstrates competitive reasoning at under 1B active parameters on AMD hardware, providing a genuinely efficient open-source alternative to proprietary reasoning models for both local and cloud inference.

Importance: 3/5

First model to combine MoE routing with under 1B active parameters while matching larger open-weight models on AIME, LiveCodeBench, and GPQA-Diamond; Markovian RSA enables constant-memory unbounded reasoning; Apache 2.0 licensed.

Sources