Models · MarkTechPost
Zyphra Releases ZAYA1-8B-Diffusion-Preview: The First MoE Diffusion Model Converted From an Autoregressive LLM With Up to 7.7x Speedup
Zyphra released ZAYA1-8B-Diffusion-Preview, an MoE diffusion model converted from an autoregressive LLM. The company reports no systematic loss on evaluations and up to 7.7x faster inference: because diffusion decoding refines many tokens in each forward pass, generation shifts from memory-bound to compute-bound processing.
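The cost-profile shift behind that speedup claim can be illustrated with a toy counting argument. The sketch below is not Zyphra's implementation; the block size and denoising-step counts are hypothetical placeholders. It only compares how many model forward passes each decoding style needs: autoregressive decoding runs one weight-streaming pass per token (memory-bound at small batch sizes), while block-diffusion decoding amortizes a fixed number of denoising passes over a whole block of tokens, so each pass does more compute per byte of weights read.

```python
# Toy comparison (not Zyphra's code): forward passes needed to
# generate a sequence under autoregressive vs. diffusion decoding.

def autoregressive_passes(num_tokens: int) -> int:
    """Autoregressive decoding: one forward pass per generated token."""
    return num_tokens

def diffusion_passes(num_tokens: int, block_size: int, steps_per_block: int) -> int:
    """Block diffusion decoding: a fixed number of denoising passes
    per block of tokens, amortized across the whole block.
    block_size and steps_per_block here are illustrative, not
    values reported for ZAYA1-8B-Diffusion-Preview."""
    num_blocks = -(-num_tokens // block_size)  # ceiling division
    return num_blocks * steps_per_block

if __name__ == "__main__":
    n = 1024
    ar = autoregressive_passes(n)
    dif = diffusion_passes(n, block_size=64, steps_per_block=8)
    print(f"autoregressive: {ar} passes; diffusion: {dif} passes; "
          f"~{ar / dif:.1f}x fewer passes")
```

With these illustrative numbers the diffusion decoder uses 128 passes instead of 1024, an 8x reduction in passes; the real speedup depends on how much parallel work each denoising pass can keep the GPU busy with.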