Llama 2 Chat 70B overview

Provider: Meta (the company that provides the model)
Context window: 4,096 tokens (the number of tokens you can send in a prompt)
Maximum output: 2,048 tokens (the maximum number of tokens the model can generate in one request)
Input token cost: $1.54 / 1M input tokens when hosted on Azure (the cost of prompt tokens sent to the model)
Output token cost: $1.77 / 1M output tokens when hosted on Azure (the cost of output tokens generated by the model)
Knowledge cut-off date: September 1, 2022 (when the model's knowledge ends)
Release date: July 18, 2023 (when the model was launched)

Llama 2 Chat 70B functionality

Function (tool calling) support: No (whether the model can use external tools)
Vision support: No (whether the model can process and analyze visual inputs, like images)
Multilingual: No (whether the model supports multiple languages)
Fine-tuning: Yes (whether the model supports fine-tuning on custom datasets)

Common questions about Llama 2 Chat 70B

How much does Llama 2 Chat 70B cost?

Llama 2 Chat 70B costs $1.54 per million input tokens and $1.77 per million output tokens when hosted on Azure.

What is the API cost for Llama 2 Chat 70B?

The API cost for Llama 2 Chat 70B is $1.54 per million input tokens and $1.77 per million output tokens when hosted on Azure.

What is the price per token for Llama 2 Chat 70B?

For Llama 2 Chat 70B, the price is $0.00154 per 1,000 input tokens and $0.00177 per 1,000 output tokens when hosted on Azure.
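
To make the per-token math concrete, here is a small Python sketch that estimates the cost of a single request at the Azure-hosted rates listed above. The function name and the example token counts are illustrative, not part of any official SDK.

```python
# Cost estimate for one request at the Azure-hosted rates on this page.
INPUT_COST_PER_MILLION = 1.54   # USD per 1M input (prompt) tokens
OUTPUT_COST_PER_MILLION = 1.77  # USD per 1M output (generated) tokens

def estimate_request_cost(input_tokens: int, output_tokens: int) -> float:
    """Approximate USD cost of a single request."""
    return (
        input_tokens / 1_000_000 * INPUT_COST_PER_MILLION
        + output_tokens / 1_000_000 * OUTPUT_COST_PER_MILLION
    )

# Example: a 3,000-token prompt with a 500-token completion.
print(f"${estimate_request_cost(3_000, 500):.6f}")  # ≈ $0.005505
```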

What is the context window for Llama 2 Chat 70B?

Llama 2 Chat 70B supports a context window of up to 4,096 tokens.

What is the maximum output length for Llama 2 Chat 70B?

Llama 2 Chat 70B can generate up to 2,048 tokens in a single output.
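
Because the 2,048-token output cap has to fit inside the same 4,096-token context window as the prompt, it can help to budget tokens before sending a request. The sketch below is a rough, assumed illustration: the two constants come from this page, and the prompt token count would come from whatever tokenizer your hosting provider exposes.

```python
CONTEXT_WINDOW = 4_096  # total tokens (prompt + completion) the model can handle
MAX_OUTPUT = 2_048      # hard cap on tokens generated in one request

def available_output_budget(prompt_tokens: int) -> int:
    """Tokens left for generation once the prompt occupies part of the context window."""
    remaining = CONTEXT_WINDOW - prompt_tokens
    return max(0, min(remaining, MAX_OUTPUT))

# Example: a 3,000-token prompt leaves room for at most 1,096 generated tokens.
print(available_output_budget(3_000))  # 1096
```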

When was Llama 2 Chat 70B released?

Llama 2 Chat 70B was released on July 18, 2023.

How recent is the training data for Llama 2 Chat 70B?

The knowledge cut-off date for Llama 2 Chat 70B is September 1, 2022.

Does Llama 2 Chat 70B support vision capabilities?

No, Llama 2 Chat 70B is a text-only model and does not support vision capabilities.

Can Llama 2 Chat 70B perform tool calling or functions?

No, Llama 2 Chat 70B does not support tool calling or functions.

Is Llama 2 Chat 70B a multilingual model?

No, Llama 2 Chat 70B does not support multiple languages.

Does Llama 2 Chat 70B support fine-tuning?

Yes, Llama 2 Chat 70B supports fine-tuning.
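
As a rough illustration of what fine-tuning can look like in practice, the sketch below attaches LoRA adapters to the Hugging Face checkpoint of the model using the transformers and peft libraries. This is one common parameter-efficient approach under assumed tooling, not Meta's or PromptHub's official recipe, and a 70B model realistically requires multi-GPU hardware plus approved access to the gated weights.

```python
# A minimal LoRA fine-tuning setup sketch (illustrative, not an official recipe).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-2-70b-chat-hf"  # gated checkpoint; access must be granted by Meta

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Train small low-rank adapters on the attention projections instead of all 70B weights.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are marked trainable
# From here, the adapted model can be passed to a standard training loop or Trainer.
```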

Where can I find the official documentation for Llama 2 Chat 70B?

You can find the official documentation for Llama 2 Chat 70B in Meta's Llama 2 repository on GitHub.

Better LLM outputs are a click away

PromptHub is a better way to test, manage, and deploy prompts for your AI products.