AIME API Demo

Mixtral - Instruct Chat

This is Mixtral, the latest and largest mixture of experts (MoE) language model from Mistral AI. It comes in two flavors, 8x7B and 8x22B: each model combines 8 expert networks with 7 or 22 billion parameters per expert, respectively. During inference, two of these experts are selected per token to generate the output.
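To make the routing concrete, here is a minimal sketch of a Mixtral-style top-2 MoE layer in PyTorch. The dimensions, layer shapes, and names are illustrative assumptions, not the actual Mixtral configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoELayer(nn.Module):
    """Simplified Mixtral-style MoE feed-forward layer: 8 experts, 2 active per token."""

    def __init__(self, hidden_dim=64, ffn_dim=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router scores every expert for every token.
        self.router = nn.Linear(hidden_dim, n_experts, bias=False)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_dim, ffn_dim),
                nn.SiLU(),
                nn.Linear(ffn_dim, hidden_dim),
            )
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, hidden_dim)
        scores = self.router(x)                                   # (n_tokens, n_experts)
        weights, chosen = torch.topk(scores, self.top_k, dim=-1)  # keep the 2 best experts per token
        weights = F.softmax(weights, dim=-1)                       # normalize over the chosen experts
        out = torch.zeros_like(x)
        for rank in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, rank] == idx                      # tokens routed to this expert
                if mask.any():
                    out[mask] += weights[mask, rank].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 64)            # 4 token embeddings of width 64
print(Top2MoELayer()(tokens).shape)    # torch.Size([4, 64])
```

The sketch only shows the per-token top-2 selection described above; the full model applies such a layer inside every transformer block.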

INPUT

Input controls of the demo (a request sketch using these parameters follows at the end of this section):

- Chat templates
- System Prompt
- Top K
- Top P
- Temperature
- Maximum response length in tokens

OUTPUT

- Progress
- Console Output
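The input controls map directly onto the sampling parameters of a chat request. The following sketch shows how such a request might look; the endpoint URL, field names, and authentication are assumptions for illustration, not the documented AIME API:

```python
import requests

# Illustrative values for the controls listed above (hypothetical field names).
payload = {
    "prompt_input": "Explain mixture of experts models in two sentences.",
    "chat_context": "",          # chat template / previous turns, if any
    "system_prompt": "You are a helpful assistant.",
    "top_k": 40,                 # sample only from the 40 most likely tokens
    "top_p": 0.9,                # nucleus sampling: keep tokens covering 90% of the probability mass
    "temperature": 0.8,          # higher values give more random output
    "max_gen_tokens": 500,       # maximum response length in tokens
}

# Placeholder endpoint; replace with the real demo API URL and credentials.
response = requests.post("https://api.example.com/mixtral-chat", json=payload, timeout=120)
response.raise_for_status()
print(response.json())
```

Lower Temperature and Top P values make the output more deterministic, Top K limits sampling to the K most likely tokens, and the maximum response length caps the number of generated tokens.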