Mixtral 8x7B Instruct overview

Provider

The company that provides the model

Mistral

Context window

The number of tokens you can send in a prompt

32,000 tokens

Maximum output

The maximum number of tokens a model can generate in one request

4,096 tokens

Input token cost

The cost of prompt tokens sent to the model

$0.70 / 1M input tokens

Output token cost

The cost of output tokens generated by the model

$0.70 / 1M output tokens

Knowledge cut-off date

When the model's knowledge ends

Unknown

Release date

When the model was launched

December 27, 2023

Mixtral 8x7B Instruct functionality

Function (tool calling) support

Capability for the model to use external tools

No

Vision support

Ability to process and analyze visual inputs, like images

No

Multilingual

Support for multiple languages

Yes

Fine-tuning

Whether the model supports fine-tuning on custom datasets

No

Common questions about Mixtral 8x7B Instruct

How much does Mixtral 8x7B Instruct cost?

Mixtral 8x7B Instruct has a cost structure of $0.70 per million input tokens and $0.70 per million output tokens.

What is the API cost for Mixtral 8x7B Instruct?

The API cost for Mixtral 8x7B Instruct is $0.70 per million input tokens and $0.70 per million output tokens.

What is the price per token for Mixtral 8x7B Instruct?

For Mixtral 8x7B Instruct, the price is $0.0007 per 1,000 input tokens and $0.0007 per 1,000 output tokens.
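For a concrete sense of these rates, here is a minimal Python sketch that estimates the cost of a single request from its token counts. The prices are the ones listed above; the example token counts are arbitrary and chosen only for illustration.

```python
# Illustrative cost calculation using the rates listed above:
# $0.70 per million input tokens and $0.70 per million output tokens.
INPUT_COST_PER_MILLION = 0.70   # USD
OUTPUT_COST_PER_MILLION = 0.70  # USD

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of a single request at the listed rates."""
    return (
        input_tokens / 1_000_000 * INPUT_COST_PER_MILLION
        + output_tokens / 1_000_000 * OUTPUT_COST_PER_MILLION
    )

# Example: a 1,500-token prompt that produces a 500-token completion.
print(f"${request_cost(1_500, 500):.6f}")  # prints $0.001400
```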

What is the context window for Mixtral 8x7B Instruct?

Mixtral 8x7B Instruct supports a context window of up to 32,000 tokens.
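If you want a quick sanity check that a prompt will fit within that window before sending it, the sketch below uses a rough four-characters-per-token heuristic. That ratio is an assumption for English text, not the model's actual tokenizer, so treat the result as an estimate only.

```python
# Rough pre-flight check that a prompt is likely to fit in the 32,000-token
# context window. The 4-characters-per-token ratio is only a heuristic for
# English text; Mixtral's actual tokenizer may count differently.
CONTEXT_WINDOW = 32_000
MAX_OUTPUT_TOKENS = 4_096

def estimated_tokens(text: str) -> int:
    return len(text) // 4  # heuristic, not the model's real token count

def fits_in_context(prompt: str, reserve_for_output: int = MAX_OUTPUT_TOKENS) -> bool:
    return estimated_tokens(prompt) + reserve_for_output <= CONTEXT_WINDOW

print(fits_in_context("Summarize the following report: ..."))  # True
```

Reserving room for the completion (4,096 tokens here) keeps the prompt plus the generated output inside the window.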

What is the maximum output length for Mixtral 8x7B Instruct?

Mixtral 8x7B Instruct can generate up to 4,096 tokens in a single output.
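Many providers serve Mixtral 8x7B Instruct through an OpenAI-compatible chat completions API, where the output cap is set with the max_tokens parameter. The sketch below is illustrative only: the base URL, API key, and model identifier are placeholders (the Hugging Face-style name mistralai/Mixtral-8x7B-Instruct-v0.1 is commonly used, but the exact name varies by provider).

```python
from openai import OpenAI

# Placeholder endpoint and key; substitute those of whichever provider
# hosts Mixtral 8x7B Instruct for you.
client = OpenAI(base_url="https://example-provider.com/v1", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    # Hugging Face-style identifier; the exact model name varies by provider.
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "user", "content": "Explain mixture-of-experts models in two sentences."}
    ],
    max_tokens=4096,  # cap generation at the model's maximum output length
)
print(response.choices[0].message.content)
```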

When was Mixtral 8x7B Instruct released?

Mixtral 8x7B Instruct was released on December 27, 2023.

Does Mixtral 8x7B Instruct support vision capabilities?

No, Mixtral 8x7B Instruct is a text-only model and does not support vision capabilities.

Can Mixtral 8x7B Instruct perform tool calling or functions?

No, Mixtral 8x7B Instruct does not support tool calling or functions.

Is Mixtral 8x7B Instruct a multilingual model?

Yes, Mixtral 8x7B Instruct supports multiple languages, allowing it to handle input and output in several languages.

Does Mixtral 8x7B Instruct support fine-tuning?

No, Mixtral 8x7B Instruct does not support fine-tuning.

Where can I find the official documentation for Mixtral 8x7B Instruct?

You can find the official documentation for Mixtral 8x7B Instruct on Mistral’s website: Mixtral 8x7B Instruct Documentation

Better LLM outputs are a click away

PromptHub is a better way to test, manage, and deploy prompts for your AI products.