Context Window: 131,072 tokens
Model Cost Profile
Developer: sao10k
Pricing updated Mar 11, 2026
Sao10K's Llama 3.1 Euryale 70B v2.2 model features an extensive context window of 131,072 tokens, making it well suited for applications requiring in-depth understanding, such as document summarization and complex conversational agents. With an input price of $0.85 per million tokens and an output price of $0.85 per million tokens, teams can budget predictably for high-volume projects while leveraging the model's capabilities for nuanced content generation. The model is particularly useful for enterprises that need to process large datasets or maintain continuity in long-form interactions.
Input Price / 1M: $0.8500 (prompt tokens)
Output Price / 1M: $0.8500 (completion tokens)
Intelligence (MMLU): Benchmark Pending (Massive Multitask Language Understanding)
| Usage Type | Price / 1M Tokens |
|---|---|
| Input (Prompt) | $0.8500 |
| Output (Completion) | $0.8500 |
Price History
Current Input / 1M: $0.8500
Current Output / 1M: $0.8500
Estimate monthly spend for Sao10K: Llama 3.1 Euryale 70B v2.2 based on your workload.
Estimated Monthly Cost: $31 (25M input + 12M output tokens)
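The estimate above is simple arithmetic: token volume (in millions) times the per-million price for each direction, summed. A minimal sketch, using the $0.85 input/output prices from the table on this page and the example workload of 25M input + 12M output tokens (the function name and structure are illustrative, not part of any provider API):

```python
# Per-million-token prices from the pricing table above (USD).
INPUT_PRICE_PER_M = 0.85
OUTPUT_PRICE_PER_M = 0.85

def monthly_cost(input_tokens_m: float, output_tokens_m: float) -> float:
    """Estimated monthly spend in dollars; volumes are in millions of tokens."""
    return input_tokens_m * INPUT_PRICE_PER_M + output_tokens_m * OUTPUT_PRICE_PER_M

# Example workload from this page: 25M input + 12M output tokens.
print(f"${monthly_cost(25, 12):.2f}")  # prints "$31.45", shown above rounded to $31
```

Swap in your own monthly token volumes to project spend before rollout; re-check the prices against the synced provider data first, since they can change.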
Quick links for cost-reduction decisions before production rollout.
Common pricing and benchmark questions for Sao10K: Llama 3.1 Euryale 70B v2.2.
Sao10K: Llama 3.1 Euryale 70B v2.2 input pricing is $0.8500 per 1M tokens based on the latest synced provider data.
Sao10K: Llama 3.1 Euryale 70B v2.2 output pricing is $0.8500 per 1M tokens based on the latest synced provider data.
Sao10K: Llama 3.1 Euryale 70B v2.2 supports a context window of 131,072 tokens.
Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then evaluate monthly spend projections for your workload.