Mixtral 8x7B
Released: 12/8/2023
Input: $0.70 / Output: $0.70
Mixtral 8x7B is a sparse mixture-of-experts LLM: each token activates only a fraction of the total parameters, which keeps inference cost low relative to the model's overall size (see the routing sketch after the spec table below). Combined with strong multilingual capabilities, this cost-performance efficiency makes it a strong contender against models like GPT-3.5.
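The listing does not state a unit for the rates above; assuming the common per-million-token convention (an assumption, not confirmed by the listing), a minimal sketch of estimating a request's cost:

```python
# Hypothetical cost estimate, assuming the listed rates are USD per
# one million tokens (the unit is not stated in the listing).
INPUT_RATE = 0.70   # $ per 1M input tokens
OUTPUT_RATE = 0.70  # $ per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return input_tokens / 1e6 * INPUT_RATE + output_tokens / 1e6 * OUTPUT_RATE

# e.g. a 2,000-token prompt producing a 500-token completion:
print(f"${estimate_cost(2_000, 500):.6f}")  # $0.001750
```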
| Metric | Value |
| --- | --- |
| Parameter Count | 47 billion |
| Mixture of Experts | Yes |
| Active Parameter Count | 13 billion |
| Context Length | Unknown |
| Multilingual | Yes |
| Quantized* | Unknown |
*Quantization is specific to the inference provider and may vary.
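The gap between the total (47B) and active (13B) parameter counts in the table comes from sparse routing: at each mixture-of-experts layer, a gating network sends every token to only its top 2 of 8 experts, so most expert weights sit idle for any given token. The sketch below illustrates this top-2 routing with toy dimensions and random weights; it is a hypothetical illustration of the general technique, not Mixtral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral 8x7B uses 8 experts per MoE layer
TOP_K = 2         # and routes each token to the top 2 of them
D_MODEL = 16      # toy hidden size, purely for illustration

# One toy "expert": a random linear map standing in for an expert FFN.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((D_MODEL, NUM_EXPERTS))  # gating weights

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector x through its top-k experts."""
    logits = x @ router                      # score every expert
    top_k = np.argsort(logits)[-TOP_K:]      # indices of the best TOP_K
    gates = np.exp(logits[top_k])
    gates /= gates.sum()                     # softmax over the chosen experts
    # Only TOP_K of NUM_EXPERTS expert matrices are touched per token,
    # so active parameters per token are a fraction of the total.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top_k))

token = rng.standard_normal(D_MODEL)
print(moe_forward(token).shape)  # (16,)
```

Because only the selected experts run, per-token compute scales with the active parameter count rather than the total, which is what makes the 47B-parameter model comparable in inference cost to a dense ~13B model.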