Mistral Nemo overview

Provider: Mistral (the company that provides the model)

Context window: 128k tokens (the number of tokens you can send in a prompt)

Maximum output: 4,096 tokens (the maximum number of tokens the model can generate in one request)

Input token cost: $0.15 per 1M input tokens (the cost of prompt tokens sent to the model)

Output token cost: $0.15 per 1M output tokens (the cost of output tokens generated by the model)

Knowledge cut-off date: Unknown (when the model's knowledge ends)

Release date: July 18, 2024 (when the model was launched)

Mistral Nemo functionality

Function (tool calling) support: No (capability for the model to use external tools)

Vision support: No (ability to process and analyze visual inputs, like images)

Multilingual: Yes (support for multiple languages)

Fine-tuning: Yes (support for fine-tuning on custom datasets)

Common questions about Mistral Nemo

How much does Mistral Nemo cost?

Mistral Nemo costs $0.15 per million input tokens and $0.15 per million output tokens.

What is the API cost for Mistral Nemo?

The API cost for Mistral Nemo is $0.15 per million input tokens and $0.15 per million output tokens.

What is the price per token for Mistral Nemo?

For Mistral Nemo, the price is $0.00015 per 1,000 input tokens and $0.00015 per 1,000 output tokens.
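To make the per-token arithmetic concrete, here is a minimal Python sketch of estimating the cost of a single request at these rates. The token counts in the example are illustrative placeholders, not measurements from the model.

```python
# Estimate the cost of one Mistral Nemo request at $0.15 per 1M tokens
# for both input and output (rates taken from the pricing above).

INPUT_COST_PER_M = 0.15   # USD per 1M input tokens
OUTPUT_COST_PER_M = 0.15  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens / 1_000_000) * INPUT_COST_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_COST_PER_M

# Example: a 2,000-token prompt with a 500-token completion (illustrative numbers)
print(f"${estimate_cost(2_000, 500):.6f}")  # -> $0.000375
```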

What is the context window for Mistral Nemo?

Mistral Nemo supports a context window of up to 128,000 tokens.

What is the maximum output length for Mistral Nemo?

Mistral Nemo can generate up to 4,096 tokens in a single output.
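As a rough sketch of how these two limits come into play, the snippet below checks that a prompt fits within the 128k-token context window and caps the completion at 4,096 tokens via `max_tokens`. The API endpoint and the `open-mistral-nemo` model identifier are assumptions to verify against Mistral's API documentation, and the character-based token estimate is only a heuristic.

```python
# Sketch: keep a request within Mistral Nemo's limits (128k context window,
# 4,096-token output cap). Endpoint and model id below are assumptions to
# confirm against Mistral's API docs.
import os
import requests

CONTEXT_WINDOW = 128_000
MAX_OUTPUT_TOKENS = 4_096

def rough_token_count(text: str) -> int:
    # Crude heuristic (~4 characters per token); use a real tokenizer for accuracy.
    return len(text) // 4

def complete(prompt: str) -> str:
    if rough_token_count(prompt) + MAX_OUTPUT_TOKENS > CONTEXT_WINDOW:
        raise ValueError("Prompt too long for the 128k context window")
    response = requests.post(
        "https://api.mistral.ai/v1/chat/completions",  # assumed endpoint
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={
            "model": "open-mistral-nemo",  # assumed model identifier
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": MAX_OUTPUT_TOKENS,  # output cap from the spec above
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```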

When was Mistral Nemo released?

Mistral Nemo was released on July 18, 2024.

Does Mistral Nemo support vision capabilities?

No, Mistral Nemo is a text-only model and does not support vision capabilities.

Can Mistral Nemo perform tool calling or functions?

No, Mistral Nemo does not support tool calling or functions.

Is Mistral Nemo a multilingual model?

Yes, Mistral Nemo supports multiple languages, allowing it to handle input and output in several languages.

Does Mistral Nemo support fine-tuning?

Yes, Mistral Nemo supports fine-tuning.
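As a sketch of what preparing fine-tuning data might look like, the snippet below writes chat-formatted training examples to a JSONL file. The `{"messages": [...]}` schema and the overall upload workflow are assumptions to confirm against Mistral's fine-tuning documentation before use.

```python
# Hypothetical example: write chat-formatted training examples to a JSONL file
# for fine-tuning. The {"messages": [...]} schema is an assumption to confirm
# against Mistral's fine-tuning docs before uploading.
import json

examples = [
    {
        "messages": [
            {"role": "user", "content": "Summarize: PromptHub helps teams manage prompts."},
            {"role": "assistant", "content": "PromptHub is a prompt management tool for teams."},
        ]
    },
    # ... more examples in the same shape
]

with open("train.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```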

Where can I find the official documentation for Mistral Nemo?

You can find the official documentation for Mistral Nemo on Mistral's website: Mistral Nemo Documentation.

Better LLM outputs are a click away

PromptHub is a better way to test, manage, and deploy prompts for your AI products.