Mixtral 8x7B
Released: 12/8/2023
Input: text / Output: text
Input: $0.70 / Output: $0.70

Mixtral 8x7B is a sparse mixture-of-experts LLM from Mistral AI. Each token is routed through only 2 of its 8 expert networks, so roughly 13 billion of its 47 billion parameters are active per token. This gives it strong cost-performance efficiency and multilingual capability, making it a strong contender against models like GPT-3.5.
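The top-2 expert routing described above can be sketched in a few lines. This is a minimal, hypothetical illustration (plain NumPy, linear maps standing in for the real feed-forward experts), not Mistral's implementation: a router scores all experts, but only the two highest-scoring ones actually run.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x, W_gate, experts, k=2):
    # Router produces one score per expert for this token
    logits = x @ W_gate                       # shape: (num_experts,)
    topk = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = softmax(logits[topk])           # renormalize over the selected experts
    # Only the selected experts are evaluated: this is the sparse activation
    # that keeps ~13B of 47B parameters active per token in Mixtral.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, num_experts = 16, 8
W_gate = rng.normal(size=(d, num_experts))
# Each "expert" here is a simple linear map standing in for an FFN block
expert_mats = [rng.normal(size=(d, d)) for _ in range(num_experts)]
experts = [lambda x, M=M: x @ M for M in expert_mats]

x = rng.normal(size=d)
y = moe_layer(x, W_gate, experts)
print(y.shape)  # (16,)
```

With 8 experts and k=2, each token pays the compute cost of 2 experts plus the router, which is why the active parameter count is far below the total.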

| Metric | Value |
| --- | --- |
| Parameter Count | 47 billion |
| Mixture of Experts | Yes |
| Active Parameter Count | 13 billion |
| Context Length | Unknown |
| Multilingual | Yes |
| Quantized* | Unknown |

*Quantization is specific to the inference provider and may vary.