Nscale models & pricing

Nscale hosts 16 models (14 with public pricing) across 2 modalities. Nscale is a Nordic GPU cloud offering serverless inference for Llama and DeepSeek models. Input pricing starts at $0.0100/M tokens and tops out at $0.600/M for the most expensive model. Use Future AGI's Agent Command Center to route any Nscale model with cost-optimized fallback and unified observability.

Homepage ↗ Docs ↗

Chat (14 models)

| Model | Input / 1M | Output / 1M | Context | Caps |
| --- | --- | --- | --- | --- |
| Qwen Qwen2.5 Coder 3B Instruct | $0.0100/M | $0.0300/M | | |
| Qwen Qwen2.5 Coder 7B Instruct | $0.0100/M | $0.0300/M | | |
| DeepSeek AI DeepSeek R1 Distill Llama 8B | $0.0250/M | $0.0250/M | | |
| Meta Llama Llama 3.1 8B Instruct | $0.0300/M | $0.0300/M | | |
| Qwen Qwen2.5 Coder 32B Instruct | $0.0600/M | $0.200/M | | |
| DeepSeek AI DeepSeek R1 Distill Qwen 14B | $0.0700/M | $0.0700/M | | |
| DeepSeek AI DeepSeek R1 Distill Qwen 1.5B | $0.0900/M | $0.0900/M | | |
| Meta Llama Llama 4 Scout 17B 16E Instruct | $0.0900/M | $0.290/M | | |
| DeepSeek AI DeepSeek R1 Distill Qwen 32B | $0.150/M | $0.150/M | | |
| Qwen QwQ 32B | $0.180/M | $0.200/M | | |
| DeepSeek AI DeepSeek R1 Distill Qwen 7B | $0.200/M | $0.200/M | | |
| Meta Llama Llama 3.3 70B Instruct | $0.200/M | $0.200/M | | |
| DeepSeek AI DeepSeek R1 Distill Llama 70B | $0.375/M | $0.375/M | | |
| Mistral AI Mixtral 8×22B Instruct v0.1 | $0.600/M | $0.600/M | | |
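As a sanity check on the rates above, per-request cost is simply tokens ÷ 1,000,000 × the per-million rate for each direction. A minimal sketch using the Llama 3.3 70B Instruct rates listed in the table ($0.200/M input and output); the token counts are illustrative:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 in_per_m: float, out_per_m: float) -> float:
    """Dollar cost of one request at per-1M-token rates."""
    return (input_tokens / 1_000_000) * in_per_m + \
           (output_tokens / 1_000_000) * out_per_m

# Llama 3.3 70B Instruct on Nscale: $0.200/M in, $0.200/M out.
cost = request_cost(4_000, 1_000, 0.200, 0.200)
print(f"${cost:.4f}")  # 4K in + 1K out -> $0.0010
```

The same function works for any row; for asymmetric rates (e.g. Qwen2.5 Coder 32B at $0.0600/M in, $0.200/M out) just pass the two prices separately.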

Image generation (2 models)

| Model | Input / 1M | Output / 1M | Context | Caps |
| --- | --- | --- | --- | --- |
| Black Forest Labs FLUX.1 Schnell | | | | |
| Stability AI Stable Diffusion XL Base 1.0 | | | | |

FAQ

How many Nscale models are there?

16 Nscale models are listed across 2 modalities on this page. 14 have public per-token pricing.

How is Nscale pricing verified?

Pricing is aggregated from BerriAI/litellm, models.dev, and OpenRouter and refreshed weekly. Each row shows a per-model "verified" date. If a price is wrong, click the row to open the model page and use the inline "suggest edit" link — submissions go into a public review queue.

Which Nscale model is cheapest?

Input pricing on Nscale starts at $0.0100 per 1M tokens. Sort the table by price (or use the in-page filter at the top) to find the cheapest model that matches your capability requirements.
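Programmatically, "cheapest model" is a one-line `min` over a price map. A sketch using a handful of input rates copied from the table above (the dict is an illustrative subset, not the full listing):

```python
# Input $/1M tokens for a few Nscale models (subset of the table above).
input_price = {
    "Qwen2.5 Coder 3B Instruct": 0.0100,
    "DeepSeek R1 Distill Llama 8B": 0.0250,
    "Llama 3.1 8B Instruct": 0.0300,
    "Llama 3.3 70B Instruct": 0.200,
    "Mixtral 8x22B Instruct v0.1": 0.600,
}

# Pick the model with the lowest input rate.
cheapest = min(input_price, key=input_price.get)
print(cheapest, input_price[cheapest])  # Qwen2.5 Coder 3B Instruct 0.01
```

Filtering first (e.g. keeping only models with the capabilities you need) and then taking the `min` reproduces the table-sort workflow described above.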

Can I route to Nscale via an OpenAI-compatible API?

Yes. Point your OpenAI client at Future AGI's Agent Command Center, configure an Nscale target, and call Nscale models through the standard /v1/chat/completions surface. The same gateway can route to other providers as fallback. Free for the first 100K requests/month.
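"OpenAI-compatible" means any client that can POST a standard chat-completions body works unchanged. A minimal sketch of that body; the gateway URL and model slug below are placeholders, not verified Nscale identifiers:

```python
import json

# Placeholder endpoint: substitute your Agent Command Center gateway URL.
url = "https://gateway.example.com/v1/chat/completions"

# Standard OpenAI-style chat-completions body. The model slug is
# illustrative; check the gateway's model list for exact identifiers.
payload = {
    "model": "meta-llama/Llama-3.3-70B-Instruct",
    "messages": [{"role": "user", "content": "Summarize Nscale pricing."}],
    "max_tokens": 128,
}
body = json.dumps(payload)
```

Any OpenAI SDK pointed at the gateway's base URL (e.g. `OpenAI(base_url=...)` in the Python SDK) sends this same request shape, so switching providers is a config change rather than a code change.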

Route any Nscale model via Agent Command Center →
OpenAI-compatible endpoint. Caching, fallback, guardrails, observability. Free for 100K requests/month.