Model Cost Profile

DeepSeek: R1 Distill Llama 70B

Developer: deepseek

Pricing updated Mar 11, 2026

Input rank: #222 · Output rank: #157

Live Pricing

Input: $0.7000 / 1M tokens

Output: $0.8000 / 1M tokens

Pricing via OpenRouter API · Last synced Mar 11, 2026 · MMLU score via public benchmark data

DeepSeek's R1 Distill Llama 70B model offers an extensive context window of 131,072 tokens, making it well suited for applications requiring in-depth analysis, such as legal document review or comprehensive research tasks. With an input price of $0.70 per million tokens and an output price of $0.80 per million tokens, teams can manage costs effectively while leveraging the model for large-scale data processing. The model is particularly useful for organizations that need to handle complex queries and generate detailed responses in real time.
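At these rates, projecting spend is simple arithmetic. A minimal sketch using the published per-million-token prices (the token volumes below are illustrative assumptions, not measured usage):

```python
# Published rates for DeepSeek: R1 Distill Llama 70B (USD per 1M tokens).
INPUT_PRICE_PER_M = 0.70
OUTPUT_PRICE_PER_M = 0.80

def monthly_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate monthly USD spend from total token volumes."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Illustrative workload: 500M input tokens, 100M output tokens per month.
print(f"${monthly_cost(500_000_000, 100_000_000):.2f}")  # 500*0.70 + 100*0.80 = $430.00
```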

📋 Structured Output · 🧠 Reasoning

Context Window

131,072

Tokens

Input Price / 1M

$0.7000

Prompt tokens

Output Price / 1M

$0.8000

Completion tokens

Intelligence (MMLU)

79.5

Massive Multitask Language Understanding

Benchmark Scores

Standardized evaluation scores for DeepSeek: R1 Distill Llama 70B.

Benchmark   Score   Rank          Source
GPQA        40.2    #104 of 118   artificial_analysis
MMLU        79.5    #43 of 121    artificial_analysis

Price History

DeepSeek: R1 Distill Llama 70B Pricing Trend

Input / 1M tokens: 0.0% change · Output / 1M tokens: 0.0% change (Mar 7 — Mar 11)

[Chart: daily input/output pricing, Mar 7 to Mar 11, flat at $0.7000 input / $0.8000 output]

Current Input / 1M

$0.7000

Current Output / 1M

$0.8000

Cheaper Alternatives to Compare

Quick links for evaluating lower-cost options before a production rollout.

FAQ

Common pricing and benchmark questions for DeepSeek: R1 Distill Llama 70B.

How much does DeepSeek: R1 Distill Llama 70B cost per 1M input tokens?

DeepSeek: R1 Distill Llama 70B input pricing is $0.7000 per 1M tokens based on the latest synced provider data.

How much does DeepSeek: R1 Distill Llama 70B cost per 1M output tokens?

DeepSeek: R1 Distill Llama 70B output pricing is $0.8000 per 1M tokens based on the latest synced provider data.

What context window does DeepSeek: R1 Distill Llama 70B support?

DeepSeek: R1 Distill Llama 70B supports a context window of 131,072 tokens.

How can I compare DeepSeek: R1 Distill Llama 70B with cheaper alternatives?

Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then evaluate monthly spend projections for your workload.
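That comparison can be scripted as well. A sketch of a monthly-spend ranking, assuming hypothetical alternative prices (only the R1 Distill Llama 70B rates below come from this page; the alternative entry is a placeholder):

```python
# Rates in USD per 1M tokens. The alternative entry is a hypothetical placeholder.
MODELS = {
    "deepseek/r1-distill-llama-70b": (0.70, 0.80),  # rates from this page
    "hypothetical-alternative-a": (0.20, 0.40),     # assumed, for illustration only
}

def projected_spend(model: str, input_tokens: int, output_tokens: int) -> float:
    """Project monthly USD spend for one model and workload."""
    in_price, out_price = MODELS[model]
    return (input_tokens / 1_000_000) * in_price \
         + (output_tokens / 1_000_000) * out_price

# Rank candidates by projected cost for a 200M-input / 50M-output monthly workload.
workload = (200_000_000, 50_000_000)
for name in sorted(MODELS, key=lambda m: projected_spend(m, *workload)):
    print(f"{name}: ${projected_spend(name, *workload):.2f}")
```

For the illustrative workload above, R1 Distill Llama 70B projects to $180.00/month (200 × $0.70 + 50 × $0.80); swapping in real prices from the comparison pages makes the ranking meaningful.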