Model Cost Profile

LiquidAI: LFM2-2.6B

Developer: liquid

Pricing updated Mar 10, 2026

Input rank: #27 · Output rank: #27

Live Pricing

Input: $0.0100 / 1M tokens

Output: $0.0200 / 1M tokens

Pricing via OpenRouter API · Last synced Mar 10, 2026

LiquidAI: LFM2-2.6B offers a context window of 32,768 tokens, making it suitable for applications that require sustained context, such as long-form content generation and multi-turn conversational agents. With an input price of $0.01 per 1M tokens and an output price of $0.02 per 1M tokens, teams can manage costs effectively while using the model for data-intensive tasks. This pricing supports scalable deployment across industries such as customer support automation and data analysis.
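To make the per-token arithmetic concrete, here is a minimal cost-estimation sketch using the listed rates ($0.01 per 1M input tokens, $0.02 per 1M output tokens). The function name and example token counts are illustrative, not part of any provider SDK.

```python
# Estimate the USD cost of a single request to LiquidAI: LFM2-2.6B
# at the rates listed on this page.

INPUT_PRICE_PER_1M = 0.01   # USD per 1M prompt tokens
OUTPUT_PRICE_PER_1M = 0.02  # USD per 1M completion tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed per-1M rates."""
    return ((input_tokens / 1_000_000) * INPUT_PRICE_PER_1M
            + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_1M)

# Example: a 4,000-token prompt producing a 1,000-token completion
# 4,000/1M * $0.01 + 1,000/1M * $0.02 = $0.00004 + $0.00002 = $0.00006
cost = request_cost(4_000, 1_000)
```

At these rates even a full 32,768-token prompt costs well under a tenth of a cent, which is why the model suits high-volume workloads.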

Context Window

32,768

Tokens

Input Price / 1M

$0.0100

Prompt tokens

Output Price / 1M

$0.0200

Completion tokens

Intelligence (MMLU)

Benchmark Pending

Massive Multitask Language Understanding

Price History

LiquidAI: LFM2-2.6B Pricing Trend

[Chart: pricing trend, Mar 7 – Mar 10. Input and output prices unchanged (0.0%) at $0.0100 and $0.0200 per 1M tokens.]

Current Input / 1M

$0.0100

Current Output / 1M

$0.0200

Cheaper Alternatives to Compare

Quick links for evaluating lower-cost options before production rollout.

FAQ

Common pricing and benchmark questions for LiquidAI: LFM2-2.6B.

How much does LiquidAI: LFM2-2.6B cost per 1M input tokens?

LiquidAI: LFM2-2.6B input pricing is $0.0100 per 1M tokens based on the latest synced provider data.

How much does LiquidAI: LFM2-2.6B cost per 1M output tokens?

LiquidAI: LFM2-2.6B output pricing is $0.0200 per 1M tokens based on the latest synced provider data.

What context window does LiquidAI: LFM2-2.6B support?

LiquidAI: LFM2-2.6B supports a context window of 32,768 tokens.

How can I compare LiquidAI: LFM2-2.6B with cheaper alternatives?

Use the comparison links on this page to open direct model-vs-model pricing and benchmark pages, then evaluate monthly spend projections for your workload.
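The monthly spend projection mentioned above can be sketched as a simple calculation. The workload figures and the alternative model's prices below are hypothetical placeholders for illustration; only the LFM2-2.6B rates come from this page.

```python
# Project monthly spend for a workload and compare two models.
# LFM2-2.6B rates are from this page; the "alternative" rates are
# hypothetical placeholders, not synced provider data.

def monthly_spend(requests_per_day: int,
                  input_tokens: int,
                  output_tokens: int,
                  input_price_per_1m: float,
                  output_price_per_1m: float,
                  days: int = 30) -> float:
    """Return projected USD spend over `days` for a uniform workload."""
    per_request = ((input_tokens / 1_000_000) * input_price_per_1m
                   + (output_tokens / 1_000_000) * output_price_per_1m)
    return requests_per_day * per_request * days

# Hypothetical workload: 10,000 requests/day, 2,000 input + 500 output tokens each
lfm2 = monthly_spend(10_000, 2_000, 500, 0.01, 0.02)    # LFM2-2.6B rates
alt = monthly_spend(10_000, 2_000, 500, 0.005, 0.01)    # hypothetical cheaper model
savings = lfm2 - alt
```

Running the same workload through each candidate's rates turns the comparison pages into a direct dollar figure.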