Context Window
200,000
Tokens
Model Cost Profile
Developer: Anthropic
Pricing updated Mar 10, 2026
Live Pricing
Input: $3.00
Output: $15.00
Pricing via OpenRouter API · Last synced Mar 10, 2026 · MMLU score via public benchmark data
Anthropic's Claude 3.7 Sonnet (thinking) model offers a substantial context window of 200,000 tokens, making it well suited to applications requiring in-depth analysis and complex dialogue, such as legal document review and advanced customer support systems. With an input price of $3.00 per million tokens and an output price of $15.00 per million tokens, output is five times as expensive as input, so verbose completions and extended thinking traces tend to dominate spend. This pricing structure lets teams scale usage to project needs while keeping budget allocation flexible for high-demand tasks.
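The per-request arithmetic behind these rates is straightforward. A minimal sketch, assuming the listed per-million-token prices and an illustrative request size (the 2,000/500 token counts are hypothetical, not from this page):

```python
# Listed rates for Claude 3.7 Sonnet (thinking), in USD per 1M tokens.
INPUT_PRICE_PER_M = 3.00
OUTPUT_PRICE_PER_M = 15.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of a single request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + (
        output_tokens / 1_000_000
    ) * OUTPUT_PRICE_PER_M

# Example: a 2,000-token prompt with a 500-token completion.
print(round(request_cost(2_000, 500), 4))  # → 0.0135
```

Note that for a "thinking" model, reasoning tokens are typically billed as output, which is why the $15.00 output rate drives most of the cost.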
Input Price / 1M
$3.00
Prompt tokens
Output Price / 1M
$15.00
Completion tokens
Intelligence (MMLU)
87.5
Massive Multitask Language Understanding
Standardized evaluation scores for Anthropic: Claude 3.7 Sonnet (thinking).
| Usage Type | Price / 1M Tokens |
|---|---|
| Input (Prompt) | $3.00 |
| Output (Completion) | $15.00 |
Price History
Current Input / 1M
$3.00
Current Output / 1M
$15.00
Estimate monthly spend for Anthropic: Claude 3.7 Sonnet (thinking) based on your workload.
Estimated Monthly Cost
$255
25M input + 12M output tokens
Quick links for cost-reduction decisions before production rollout.
Common pricing and benchmark questions for Anthropic: Claude 3.7 Sonnet (thinking).
Anthropic: Claude 3.7 Sonnet (thinking) input pricing is $3.00 per 1M tokens based on the latest synced provider data.
Anthropic: Claude 3.7 Sonnet (thinking) output pricing is $15.00 per 1M tokens based on the latest synced provider data.
Anthropic: Claude 3.7 Sonnet (thinking) supports a context window of 200,000 tokens.
Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then evaluate monthly spend projections for your workload.