Complete guide for using alternative AI models with Claude Code — including DeepSeek, Qwen, MiniMax, Kimi, GLM, MiMo, StepFun, and more. Pricing, configs, and coding plans.
Updated Apr 20, 2026
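Guides like this generally point Claude Code at an alternative provider through environment variables, assuming the provider exposes an Anthropic-compatible endpoint. A minimal sketch — the base URL, key, and model name below are placeholders, not a real provider's values:

```shell
# Point Claude Code at an alternative, Anthropic-compatible provider.
# Substitute your provider's actual endpoint, API key, and model name.
export ANTHROPIC_BASE_URL="https://api.example-provider.com/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-api-key"
export ANTHROPIC_MODEL="example-model-name"
claude
```

Each provider documents its own endpoint path and model identifiers; check the provider's docs rather than relying on the placeholders above.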
OpenRouter model information
Community-maintained registry of AI/LLM model configurations - pricing, features, and limits across 19 providers and 1000+ models
Easy-to-use LLM APIs from state-of-the-art providers, with side-by-side comparisons
Some cost estimates for using LLMs via API versus web UI services (like ChatGPT)
The most comprehensive LLM pricing comparison. 36+ models, 12 providers, CLI calculator. Community-maintained.
MCP server for live AI tool and LLMs status, API pricing, and rate limits — powered by tickerr.ai
119 AI models × 55 benchmarks with per-score freshness dates, auto-updated pricing, task routing. Every score has a date and source URL. Daily CI.
Rate card for common LLM APIs for use within TensorFoundry Products
Zero-dependency Python CLI + MCP server for comparing LLM API costs across 144 models and 22 providers (OpenAI, Anthropic, Google, Mistral, xAI, DeepSeek, Groq...)
LLM model registry with metadata, pricing, and live industry feed — the data layer powering mddb.dev
ATOM: The Global Price Benchmark for AI Inference. MCP server access to the AIPI index family.
LLM cost calculator and usage tracking. Monitor API spend across providers.
Directory of AI and LLM pricing calculators. 8 calculators live, more in development. Fast cost estimates for Claude, OpenAI, Gemini, and other AI services.
LLM Provider Price and Latency Monitor for Apify Store. Normalized pricing and capability feed for 200+ LLMs across OpenRouter, OpenAI, Anthropic, Together, Groq. API for AI builders and FinOps teams.
Claude Code Skill — estimate and compare LLM API costs across OpenAI, Anthropic, Google, DeepSeek, Mistral. Per-token pricing, batch/caching discounts, workload templates, cross-provider comparison.
UsageWall — Live LLM pricing as an MCP server for Claude and any MCP client. Free, read-only. Six tools: list_models, get_model, compare_models, estimate_cost, cheapest_for, list_providers.
Latest pricing and feature overview for large language models from major AI companies
Compare and track up-to-date LLM pricing to help you find the most cost-effective AI models and avoid overpaying.
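The calculators and comparison tools listed above share the same core arithmetic: cost = input_tokens × input_price + output_tokens × output_price, with prices quoted per million tokens. A minimal sketch of that math — the model names and rates here are illustrative placeholders, not real provider pricing:

```python
# Per-token cost estimation, the core of the pricing calculators above.
# Rates are USD per 1M tokens: (input, output). Placeholder values only.
PRICES_PER_MTOK = {
    "example-small": (0.25, 1.25),
    "example-large": (3.00, 15.00),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request for the given model."""
    in_price, out_price = PRICES_PER_MTOK[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

def cheapest_for(input_tokens: int, output_tokens: int) -> str:
    """Return the model with the lowest estimated cost for this workload."""
    return min(PRICES_PER_MTOK,
               key=lambda m: estimate_cost(m, input_tokens, output_tokens))
```

Real calculators layer batch and prompt-caching discounts on top of this, which typically apply a multiplier to the input-token term.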