Mixtral 8x7B
text-to-text
Efficient sparse Mixture of Experts model (8 experts, 2 active per token) with roughly 13B active parameters per token, optimized for multilingual tasks and cost-performance balance.
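As a rough illustration of why only ~13B of the model's total parameters are active per token, the sketch below implements top-2 routing over 8 small expert feed-forward blocks in PyTorch. It is a minimal toy example, not Mixtral's real configuration: the dimensions, layer structure, and names here are illustrative assumptions.

```python
# Toy sketch of top-2 expert routing: each token is processed by only 2 of 8
# expert FFNs, so only a fraction of the layer's parameters are "active".
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=128, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts, bias=False)  # router
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.gate(x)                                   # (tokens, n_experts)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)   # keep the 2 best experts
        weights = F.softmax(weights, dim=-1)                    # renormalize over those 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                           # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
        return out

tokens = torch.randn(4, 64)
print(TinyMoELayer()(tokens).shape)  # torch.Size([4, 64])
```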
Model Parameters
Advanced settings to control the behavior of the model.
System prompt: prepended to all chat messages to guide the model's behavior. Saved automatically to your playground repository.
Temperature: controls randomness (0 = deterministic, 2 = very creative).
Max tokens: maximum length of the response (1-128000 tokens).
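The sketch below is a hypothetical illustration of how these playground settings (system prompt, temperature, maximum response length) typically map onto a chat-completion request using an OpenAI-compatible client. The base URL, API key placeholder, and model identifier are assumptions, not values taken from this page.

```python
# Illustrative only: mapping the playground parameters onto a chat request.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-inference-host/v1",  # hypothetical OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed model identifier
    messages=[
        # System prompt: prepended to guide the model's behavior.
        {"role": "system", "content": "You are a concise multilingual assistant."},
        {"role": "user", "content": "Summarize the benefits of sparse MoE models."},
    ],
    temperature=0.7,  # randomness: 0 = deterministic, up to 2 = very creative
    max_tokens=512,   # maximum length of the response
)

print(response.choices[0].message.content)
```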