Context Window
131,072
Tokens
Model Cost Profile
Developer: deepseek
Pricing updated Mar 11, 2026
Live Pricing
Input: $0.7000
Output: $0.8000
Pricing via OpenRouter API · Last synced Mar 11, 2026 · MMLU score via public benchmark data
DeepSeek's R1 Distill Llama 70B offers a context window of 131,072 tokens, making it well suited to applications requiring in-depth analysis, such as legal document review or comprehensive research tasks. With an input price of $0.70 per million tokens and an output price of $0.80 per million tokens, teams can manage costs effectively while using the model for large-scale data processing. It is particularly useful for organizations that need to handle complex queries and generate detailed responses in real time.
Input Price / 1M
$0.7000
Prompt tokens
Output Price / 1M
$0.8000
Completion tokens
Intelligence (MMLU)
79.5
Massive Multitask Language Understanding
Standardized evaluation scores for DeepSeek: R1 Distill Llama 70B.
| Usage Type | Price / 1M Tokens |
|---|---|
| Input (Prompt) | $0.7000 |
| Output (Completion) | $0.8000 |
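The per-request cost implied by the table above can be sketched in Python. The function name and example token counts are illustrative; the rates are the $0.70 / $0.80 per-million figures listed on this page.

```python
# Per-1M-token rates from the pricing table on this page.
INPUT_PRICE_PER_M = 0.70   # USD per 1M prompt tokens
OUTPUT_PRICE_PER_M = 0.80  # USD per 1M completion tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the USD cost of a single request at the listed rates."""
    return (prompt_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (completion_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# e.g. a 2,000-token prompt with a 500-token completion:
print(f"${request_cost(2000, 500):.4f}")  # → $0.0018
```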
Price History
Current Input / 1M
$0.7000
Current Output / 1M
$0.8000
Estimate monthly spend for DeepSeek: R1 Distill Llama 70B based on your workload.
Estimated Monthly Cost
$27.10
25M input + 12M output tokens (25 × $0.70 + 12 × $0.80)
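The monthly estimate above can be reproduced with a small sketch. The token volumes (in millions) are the example workload from this page, and the rates are the listed per-1M prices.

```python
INPUT_PRICE_PER_M = 0.70   # USD per 1M input tokens (listed rate)
OUTPUT_PRICE_PER_M = 0.80  # USD per 1M output tokens (listed rate)

def monthly_cost(input_m_tokens: float, output_m_tokens: float) -> float:
    """Project monthly spend from token volumes given in millions of tokens."""
    return input_m_tokens * INPUT_PRICE_PER_M + output_m_tokens * OUTPUT_PRICE_PER_M

# 25M input + 12M output tokens, as in the estimate above:
print(f"${monthly_cost(25, 12):.2f}")  # → $27.10
```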
Quick links for cost-reduction decisions before production rollout.
Common pricing and benchmark questions for DeepSeek: R1 Distill Llama 70B.
DeepSeek: R1 Distill Llama 70B input pricing is $0.7000 per 1M tokens based on the latest synced provider data.
DeepSeek: R1 Distill Llama 70B output pricing is $0.8000 per 1M tokens based on the latest synced provider data.
DeepSeek: R1 Distill Llama 70B supports a context window of 131,072 tokens.
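A quick budget check against the 131,072-token window can be sketched as follows. The reserved-output figure is an illustrative choice, and real token counts depend on the model's tokenizer.

```python
CONTEXT_WINDOW = 131_072  # tokens, as listed for this model

def max_prompt_tokens(reserved_for_output: int = 4_096) -> int:
    """Tokens left for the prompt after reserving room for the completion.

    The 4,096-token default reservation is an assumption for illustration.
    """
    return CONTEXT_WINDOW - reserved_for_output

print(max_prompt_tokens())  # → 126976
```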
Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then evaluate monthly spend projections for your workload.