o1 overview

Provider (the company that provides the model): OpenAI

Context window (the number of tokens you can send in a prompt): 200,000 tokens

Maximum output (the maximum number of tokens the model can generate in one request): 100,000 tokens

Input token cost (the cost of prompt tokens sent to the model): $15.00 per 1M input tokens

Output token cost (the cost of output tokens generated by the model): $60.00 per 1M output tokens; reasoning tokens are priced identically to output tokens

Knowledge cut-off date (when the model's knowledge ends): October 1, 2023

Release date (when the model was launched): December 17, 2024

o1 functionality

Function (tool calling) support (capability for the model to use external tools): Yes

Vision support (ability to process and analyze visual inputs, like images): Yes

Multilingual (support for multiple languages): Yes

Fine-tuning (whether the model supports fine-tuning on custom datasets): No
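
These capabilities are exposed through OpenAI's standard Chat Completions API. Below is a minimal sketch of a basic o1 request using the official openai Python SDK; the model name "o1" and the max_completion_tokens parameter follow OpenAI's API reference, but the prompt and token cap are purely illustrative.

```python
# Minimal sketch of calling o1 through the official openai Python SDK (v1.x).
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o1",
    messages=[
        {"role": "user", "content": "Explain the difference between BFS and DFS."}
    ],
    # o1 models cap generation with max_completion_tokens (not max_tokens);
    # the cap covers both hidden reasoning tokens and visible output tokens.
    max_completion_tokens=4096,
)

print(response.choices[0].message.content)
```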

Common questions about o1

How much does o1 cost?

o1 has a cost structure of $15.00 per million input tokens and $60.00 per million output tokens. Reasoning tokens are priced identically to output tokens.

What is the API cost for o1?

The API cost for o1 is $15.00 per million input tokens and $60.00 per million output tokens.

What is the price per token for o1?

For o1, the price is $0.015 per 1,000 input tokens and $0.06 per 1,000 output tokens.
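
Because reasoning tokens are billed at the output rate, estimating the cost of a request only needs the input token count and the combined output-plus-reasoning token count. Here is a small illustrative helper with the rates above hard-coded; the function name and example numbers are made up for the sketch.

```python
# Illustrative cost estimate for an o1 request, using the rates listed above
# ($15.00 per 1M input tokens, $60.00 per 1M output tokens).
INPUT_COST_PER_TOKEN = 15.00 / 1_000_000
OUTPUT_COST_PER_TOKEN = 60.00 / 1_000_000

def estimate_o1_cost(input_tokens: int, output_tokens: int, reasoning_tokens: int = 0) -> float:
    """Return the estimated request cost in USD.

    Reasoning tokens are billed at the output rate, so they are folded into
    the output total. Note: if you read counts from the API's usage object,
    completion_tokens already includes reasoning tokens, so pass reasoning_tokens=0.
    """
    billable_output = output_tokens + reasoning_tokens
    return input_tokens * INPUT_COST_PER_TOKEN + billable_output * OUTPUT_COST_PER_TOKEN

# Example: 10,000 prompt tokens, 2,000 visible output tokens, 6,000 reasoning tokens
print(f"${estimate_o1_cost(10_000, 2_000, 6_000):.4f}")  # $0.6300
```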

What is the context window for o1?

o1 supports a context window of up to 200,000 tokens.

What is the maximum output length for o1?

o1 can generate up to 100,000 tokens in a single output.
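
A rough pre-flight check of both limits can be done client-side with the tiktoken library. The o200k_base encoding is assumed here as an approximation for o1, and the constants simply mirror the figures above.

```python
# Rough pre-flight check that a prompt fits o1's limits, using tiktoken.
# o200k_base matches OpenAI's recent models; treat the count as approximate,
# since exact tokenization can differ slightly per model.
import tiktoken

CONTEXT_WINDOW = 200_000   # tokens shared by the prompt and the completion
MAX_OUTPUT = 100_000       # hard cap on generated tokens per request

def fits_in_context(prompt: str, desired_output_tokens: int) -> bool:
    enc = tiktoken.get_encoding("o200k_base")
    prompt_tokens = len(enc.encode(prompt))
    # The prompt plus the requested completion must fit inside the context
    # window, and the completion alone cannot exceed the maximum output length.
    return (prompt_tokens + desired_output_tokens <= CONTEXT_WINDOW
            and desired_output_tokens <= MAX_OUTPUT)

print(fits_in_context("Summarize this document...", desired_output_tokens=100_000))
```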

When was o1 released?

o1 was released on December 17, 2024.

How recent is the training data for o1?

The knowledge cut-off date for o1 is October 1, 2023.

Does o1 support vision capabilities?

Yes, o1 supports vision capabilities and can process and analyze visual inputs, like images.
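
For illustration, here is a sketch of passing an image to o1 through the Chat Completions API; the image_url content part follows OpenAI's multimodal message format, and the URL is a placeholder.

```python
# Sketch of sending an image to o1 via the Chat Completions API.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o1",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this chart?"},
                # Placeholder URL; a publicly reachable image or a base64 data URL works here.
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```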

Can o1 perform tool calling or functions?

Yes, o1 supports tool calling (functions).
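
A minimal sketch of a tool-calling request is shown below; the tool schema follows OpenAI's function-calling format, and get_weather is a hypothetical function used only for illustration.

```python
# Sketch of a tool-calling request to o1. get_weather is a made-up example tool.
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="o1",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
)

# If the model decides to call the tool, the call appears here instead of text.
print(response.choices[0].message.tool_calls)
```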

Is o1 a multilingual model?

Yes, o1 supports multiple languages, allowing it to handle input and output in several languages.

Does o1 support fine-tuning?

No, o1 does not support fine-tuning on custom datasets.

Where can I find the official documentation for o1?

You can find the official documentation for o1 in OpenAI's API documentation.
