Model Cost Profile
Developer: z-ai
Pricing updated Mar 11, 2026
Z.ai: GLM 4.7 offers a context window of 202,752 tokens, suited to workloads that require extensive text analysis, such as legal document review and long-form content generation. With an input price of $0.38 per million tokens and an output price of $1.98 per million tokens, teams can manage costs while using the model for tasks like conversational AI and data summarization, and scale usage up or down to match project demand when budgeting API integration.
- Context Window: 202,752 tokens
- Input Price / 1M: $0.38 (prompt tokens)
- Output Price / 1M: $1.98 (completion tokens)
- Intelligence (MMLU, Massive Multitask Language Understanding): benchmark pending
| Usage Type | Price / 1M Tokens |
|---|---|
| Input (Prompt) | $0.38 |
| Output (Completion) | $1.98 |
Price History
Current Input / 1M: $0.38
Current Output / 1M: $1.98
Estimate monthly spend for Z.ai: GLM 4.7 based on your workload. For example, a workload of 25M input + 12M output tokens per month comes to roughly $33 (25 × $0.38 + 12 × $1.98 = $33.26).
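The estimate above is simple linear per-token pricing; a minimal sketch of the same calculation, using the rates listed on this page ($0.38 input / $1.98 output per 1M tokens):

```python
# Per-1M-token rates from this page's pricing table.
INPUT_PRICE_PER_1M = 0.38
OUTPUT_PRICE_PER_1M = 1.98

def monthly_cost(input_tokens_m: float, output_tokens_m: float) -> float:
    """Estimated monthly spend in USD; token volumes are given in millions."""
    return input_tokens_m * INPUT_PRICE_PER_1M + output_tokens_m * OUTPUT_PRICE_PER_1M

# The example workload above: 25M input + 12M output tokens.
print(round(monthly_cost(25, 12)))  # → 33
```

Swap in your own monthly token volumes to project spend before rollout.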
Common pricing and benchmark questions for Z.ai: GLM 4.7.
Z.ai: GLM 4.7 input pricing is $0.38 per 1M tokens, based on the latest synced provider data.
Z.ai: GLM 4.7 output pricing is $1.98 per 1M tokens, based on the latest synced provider data.
Z.ai: GLM 4.7 supports a context window of 202,752 tokens.
Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then project monthly spend for your workload.