Llama 3.3 70B overview

Provider (the company that provides the model): Meta

Context window (the number of tokens you can send in a prompt): 128,000 tokens

Maximum output (the maximum number of tokens the model can generate in one request): 2,048 tokens

Input token cost (the cost of prompt tokens sent to the model): $0.23 per 1M input tokens

Output token cost (the cost of output tokens generated by the model): $0.40 per 1M output tokens

Knowledge cut-off date (when the model's knowledge ends): December 1, 2023

Release date (when the model was launched): December 5, 2024

Llama 3.3 70B functionality

Function (tool calling) support (capability for the model to use external tools): Yes

Vision support (ability to process and analyze visual inputs, like images): No

Multilingual (support for multiple languages): Yes

Fine-tuning (whether the model supports fine-tuning on custom datasets): Yes

Common questions about Llama 3.3 70B

What is Llama 3.3 70B?

Llama 3.3 70B is a multilingual, text-only, instruction-tuned model designed for efficient language understanding and task execution across various languages.

What is the context window for Llama 3.3 70B?

Llama 3.3 70B supports a context window of up to 128,000 tokens, allowing it to process a large amount of input data.

What is the maximum output length for Llama 3.3 70B?

Llama 3.3 70B can generate up to 2,048 tokens in a single output, making it suitable for generating detailed responses.
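The two limits above can be enforced when building a request. The sketch below assumes an OpenAI-compatible chat-completions payload, which many Llama 3.3 hosts accept; the model identifier "llama-3.3-70b" is a placeholder, so check your provider's documentation for the exact name.

```python
# Sketch: building a chat-completions request that respects the model's limits.
# The payload shape and model name ("llama-3.3-70b") are assumptions; many
# Llama 3.3 hosts expose an OpenAI-compatible API, but providers vary.

CONTEXT_WINDOW = 128_000   # max tokens across prompt + completion
MAX_OUTPUT = 2_048         # max tokens the model can generate per request

def build_request(messages, max_tokens=MAX_OUTPUT):
    """Return a request payload, clamping max_tokens to the model's limit."""
    return {
        "model": "llama-3.3-70b",          # hypothetical model identifier
        "messages": messages,
        "max_tokens": min(max_tokens, MAX_OUTPUT),
    }

payload = build_request(
    [{"role": "user", "content": "Summarize Llama 3.3."}],
    max_tokens=4096,
)
print(payload["max_tokens"])  # clamped to 2048
```

Clamping client-side avoids provider errors when a caller asks for more output tokens than the model can produce.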

When was Llama 3.3 70B released?

Llama 3.3 70B was released on December 5, 2024.

What is the knowledge cut-off date for Llama 3.3 70B?

The knowledge cut-off date for Llama 3.3 70B is December 1, 2023.

What are the input and output costs for Llama 3.3 70B?

  • Input Cost: $0.23 per million tokens
  • Output Cost: $0.40 per million tokens
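The prices above make per-request cost estimation straightforward. A minimal sketch, using the listed rates:

```python
# Sketch: estimating a request's USD cost from the listed per-token prices.
INPUT_COST_PER_M = 0.23    # USD per 1M input tokens
OUTPUT_COST_PER_M = 0.40   # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens / 1_000_000) * INPUT_COST_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_COST_PER_M

# e.g. a 10,000-token prompt with a 2,048-token (maximum) completion:
print(round(estimate_cost(10_000, 2_048), 6))  # 0.003119
```

Note that output tokens cost nearly twice as much as input tokens, so long completions dominate the bill faster than long prompts.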

Does Llama 3.3 70B support vision capabilities?

No, Llama 3.3 70B is a text-only model and does not support vision capabilities.

Can Llama 3.3 70B perform tool calling or functions?

Yes, Llama 3.3 70B supports functions and tool calling.
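A tool is typically described to the model as a JSON Schema. The sketch below uses the OpenAI-style tool format that many Llama 3.3 hosts accept; the `get_weather` function, its parameters, and the model identifier are all hypothetical.

```python
# Sketch: an OpenAI-style tool (function) definition. The tool name,
# parameters, and model identifier below are hypothetical examples.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",                 # hypothetical tool name
        "description": "Look up current weather for a city.",
        "parameters": {                        # JSON Schema for the arguments
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# The tool list is passed alongside the messages; the model may then respond
# with a tool call (the function name plus JSON arguments) instead of text.
request = {
    "model": "llama-3.3-70b",                  # hypothetical model identifier
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": [get_weather_tool],
}
print(request["tools"][0]["function"]["name"])  # get_weather
```

Your application executes the requested function itself and returns the result to the model in a follow-up message; the model never runs tools directly.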

Is Llama 3.3 70B a multilingual model?

Yes, Llama 3.3 70B supports multiple languages, allowing it to handle input and output in a variety of languages.

Does Llama 3.3 70B support fine-tuning?

Yes, Llama 3.3 70B can be fine-tuned for specific tasks and applications.

Where can I find the official documentation for Llama 3.3 70B?

You can find the official documentation for Llama 3.3 70B on Meta's Llama website.

Better LLM outputs are a click away

PromptHub is a better way to test, manage, and deploy prompts for your AI products.