
Querying the context lengths of Copilot models

Run the following command:

# Fetch the models.dev catalog and print name, id, context and output limits
# for every GitHub Copilot model, sorted by name
curl -s https://models.dev/api.json \
| jq -r '."github-copilot".models
    | to_entries[]
    | "\(.value.name) [\(.key)] — context: \(.value.limit.context), output: \(.value.limit.output)"' \
| sort

Sample output:

Claude Haiku 4.5 [claude-haiku-4.5] — context: 144000, output: 16000
Claude Opus 4.1 [claude-opus-41] — context: 80000, output: 16000
Claude Opus 4 [claude-opus-4] — context: 80000, output: 16000
Claude Sonnet 3.5 [claude-3.5-sonnet] — context: 90000, output: 8192
Claude Sonnet 3.7 [claude-3.7-sonnet] — context: 200000, output: 16384
Claude Sonnet 3.7 Thinking [claude-3.7-sonnet-thought] — context: 200000, output: 16384
Claude Sonnet 4.5 [claude-sonnet-4.5] — context: 128000, output: 16000
Claude Sonnet 4 [claude-sonnet-4] — context: 128000, output: 16000
Gemini 2.0 Flash [gemini-2.0-flash-001] — context: 1000000, output: 8192
Gemini 2.5 Pro [gemini-2.5-pro] — context: 128000, output: 64000
GPT-4.1 [gpt-4.1] — context: 128000, output: 16384
GPT-4o [gpt-4o] — context: 128000, output: 16384
GPT-5-Codex [gpt-5-codex] — context: 128000, output: 64000
GPT-5 [gpt-5] — context: 128000, output: 64000
GPT-5-mini [gpt-5-mini] — context: 128000, output: 64000
Grok Code Fast 1 [grok-code-fast-1] — context: 256000, output: 10000
o3-mini [o3-mini] — context: 128000, output: 65536
o3 (Preview) [o3] — context: 128000, output: 16384
o4-mini (Preview) [o4-mini] — context: 128000, output: 65536
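
If only a single model is of interest, its limits can be read directly from the same catalog. A minimal sketch, assuming the api.json layout shown above; the model id gpt-4.1 and the limit object are taken from the sample output:

# Print the limit object (context/output) of one Copilot model;
# the model id comes from the list above.
curl -s https://models.dev/api.json \
| jq '."github-copilot".models."gpt-4.1".limit'

This should print an object containing the context and output values listed above for gpt-4.1.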

