ZAYA1-8B: Efficient Large Language Models with MoE
Introducing ZAYA1-8B, an 8B-parameter Mixture-of-Experts (MoE) model that achieves performance parity with larger models, pushing the boundaries of LLM efficiency.