feat(cli): suggest closest provider/model on not found ("Did you mean…")

Summary
- Add fuzzy suggestions to ProviderModelNotFoundError with up to 3 candidates
- Normalize punctuation (e.g., 4.5 vs 4-5) and case to better match common typos
- Support model-only input (no provider) by searching across all providers
- Enhance the CLI error formatter to display suggestions when present

Implementation
- provider.ts: use fuzzysort; add normalization by stripping non-alphanumerics; search by key for robust matches
- provider.ts: when the provider is unknown and the model is empty, treat the token as an unqualified model and search across all providers' models; otherwise suggest provider matches
- error.ts: print "Did you mean: <provider/model>, …" when suggestions exist

Examples
1) Typo in model ID
   $ bun run ./src/index.ts run --model anthropic/claude-haiu-4-5 "hi"
   Error: Model not found: anthropic/claude-haiu-4-5
   Did you mean: anthropic/claude-haiku-4-5, anthropic/claude-haiku-4-5-20251001
   Try: `opencode models` to list available models
   Or check your config (opencode.json) provider/model names

2) Dot vs dash (punctuation normalization)
   $ bun run ./src/index.ts run --model anthropic/claude-haiku-4.5 "hi"
   Error: Model not found: anthropic/claude-haiku-4.5
   Did you mean: anthropic/claude-haiku-4-5, anthropic/claude-haiku-4-5-20251001
   Try: `opencode models` to list available models
   Or check your config (opencode.json) provider/model names

3) Missing provider (model-only input)
   $ bun run ./src/index.ts run --model big-pickle "hi"
   Error: Model not found: big-pickle/
   Did you mean: opencode/big-pickle

4) Correct model after suggestion
   $ bun run ./src/index.ts run --model opencode/big-pickle "hi"
   Hi! How can I help you with your opencode project today?

Notes
- Suggestions are hints only; behavior is unchanged (no auto-selection).
- This runs locally as part of the CLI error path; performance impact is negligible (small in-memory scans).
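The normalization-plus-fuzzy-match idea above can be approximated without the fuzzysort dependency. The sketch below is illustrative only: it swaps in a plain Levenshtein distance for fuzzysort's scoring, and `suggest`, `levenshtein`, and the catalog entries are made-up names, not part of the commit.

```typescript
// Compute edit distance between two strings (stand-in for fuzzysort scoring).
function levenshtein(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) => [i, ...Array(b.length).fill(0)])
  for (let j = 0; j <= b.length; j++) dp[0][j] = j
  for (let i = 1; i <= a.length; i++)
    for (let j = 1; j <= b.length; j++)
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,
        dp[i][j - 1] + 1,
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1),
      )
  return dp[a.length][b.length]
}

// Same normalization the commit describes: lowercase, strip non-alphanumerics,
// so "claude-haiku-4.5" and "claude-haiku-4-5" compare equal.
const normalize = (s: string) => s.toLowerCase().replace(/[^a-z0-9]/g, "")

// Rank provider/model combos by distance between normalized model IDs
// and keep the closest few, mirroring the "up to 3 candidates" cap.
function suggest(input: string, catalog: string[], limit = 3): string[] {
  const q = normalize(input)
  return catalog
    .map((combo) => ({ combo, d: levenshtein(q, normalize(combo.split("/")[1] ?? combo)) }))
    .sort((a, b) => a.d - b.d)
    .slice(0, limit)
    .map((e) => e.combo)
}
```

Under this approximation, `suggest("claude-haiku-4.5", [...])` ranks `anthropic/claude-haiku-4-5` first because the normalized forms are identical (distance 0), which is exactly the dot-vs-dash case in example 2.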
fix/cli-clean-exit-on-model-errors
Ian Maurer 2025-11-12 10:41:38 -05:00
parent c1fa257a92
commit 2be8b2269f
2 changed files with 43 additions and 3 deletions

error.ts

@@ -8,9 +8,12 @@ export function FormatError(input: unknown) {
   if (MCP.Failed.isInstance(input))
     return `MCP server "${input.data.name}" failed. Note, opencode does not support MCP authentication yet.`
   if (Provider.ModelNotFoundError.isInstance(input)) {
-    const { providerID, modelID } = input.data
+    const { providerID, modelID, suggestions } = input.data
     return [
       `Model not found: ${providerID}/${modelID}`,
+      ...(Array.isArray(suggestions) && suggestions.length
+        ? ["Did you mean: " + suggestions.join(", ")]
+        : []),
       `Try: \`opencode models\` to list available models`,
       `Or check your config (opencode.json) provider/model names`,
     ].join("\n")
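The conditional splice in the hunk above can be shown in isolation. This is a hedged stand-in, not the real `FormatError`: `formatModelNotFound` is an illustrative name, and only the message-assembly logic from the diff is reproduced.

```typescript
// Assemble the error text; the "Did you mean" line appears only when
// suggestions exist, via a conditionally spread single-element array.
function formatModelNotFound(providerID: string, modelID: string, suggestions?: string[]): string {
  return [
    `Model not found: ${providerID}/${modelID}`,
    ...(suggestions && suggestions.length ? ["Did you mean: " + suggestions.join(", ")] : []),
    "Try: `opencode models` to list available models",
    "Or check your config (opencode.json) provider/model names",
  ].join("\n")
}
```

With no suggestions the output is the same three lines as before the change, which is how the commit keeps behavior unchanged on the common path.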

provider.ts

@@ -1,4 +1,5 @@
 import z from "zod"
+import fuzzysort from "fuzzysort"
 import path from "path"
 import { Config } from "../config/config"
 import { mergeDeep, sortBy } from "remeda"
@@ -522,9 +523,44 @@ export namespace Provider {
     })
     const provider = s.providers[providerID]
-    if (!provider) throw new ModelNotFoundError({ providerID, modelID })
+    if (!provider) {
+      let suggestions: string[] = []
+      const normalize = (str: string) => str.toLowerCase().replace(/[^a-z0-9]/g, "")
+      if (!modelID || modelID.trim() === "") {
+        // Treat single-token input as an unqualified model; search across all providers' models.
+        const q = normalize(providerID)
+        const entries: { combo: string; norm: string }[] = []
+        for (const [pid, prov] of Object.entries(s.providers)) {
+          for (const mid of Object.keys(prov.info.models)) {
+            entries.push({ combo: pid + "/" + mid, norm: normalize(mid) })
+          }
+        }
+        const byNorm = fuzzysort.go(q, entries as any, { limit: 5, key: "norm" }).map((r: any) => r.obj.combo)
+        const combos = entries.map((e) => e.combo)
+        const byRaw = fuzzysort.go(providerID, combos, { limit: 5 }).map((r) => r.target)
+        suggestions = Array.from(new Set([...byNorm, ...byRaw])).slice(0, 3)
+      } else {
+        const providerSuggestions = fuzzysort
+          .go(providerID, Object.keys(s.providers), { limit: 3 })
+          .map((r) => r.target + "/" + modelID)
+        suggestions = providerSuggestions
+      }
+      throw new ModelNotFoundError({ providerID, modelID, suggestions })
+    }
     const info = provider.info.models[modelID]
-    if (!info) throw new ModelNotFoundError({ providerID, modelID })
+    if (!info) {
+      const candidates = Object.keys(provider.info.models)
+      // Normalize punctuation differences like '-' vs '.' by stripping non-alphanumerics
+      const normalize = (s: string) => s.toLowerCase().replace(/[^a-z0-9]/g, "")
+      const corpus = candidates.map((raw) => ({ raw, norm: normalize(raw) }))
+      const query = normalize(modelID)
+      const results = fuzzysort.go(query, corpus as any, { limit: 5, key: "norm" })
+      const ranked = results.map((r) => ("obj" in r ? (r as any).obj.raw : (r as any).target)) as string[]
+      const fallback = fuzzysort.go(modelID, candidates, { limit: 5 }).map((r) => r.target)
+      const merged = Array.from(new Set([...ranked, ...fallback]))
+      const suggestions = merged.slice(0, 3).map((m) => providerID + "/" + m)
+      throw new ModelNotFoundError({ providerID, modelID, suggestions })
+    }
     const sdk = await getSDK(provider.info, info)
     try {
@@ -658,6 +694,7 @@ export namespace Provider {
     z.object({
       providerID: z.string(),
       modelID: z.string(),
+      suggestions: z.array(z.string()).optional(),
     }),
   )
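Both not-found paths in the diff end with the same merge step: normalized-key matches rank first, raw-string matches fill in behind them, duplicates collapse via a `Set`, and at most three suggestions survive. A sketch of just that step (`mergeSuggestions` is an illustrative name, not part of the commit):

```typescript
// Merge two ranked candidate lists, preferring the first, dropping
// duplicates (Set preserves insertion order), and capping the result.
function mergeSuggestions(byNorm: string[], byRaw: string[], cap = 3): string[] {
  return Array.from(new Set([...byNorm, ...byRaw])).slice(0, cap)
}
```

Ordering matters here: because the normalized-key results are spread first, a model that matches after punctuation stripping always outranks one that only matches on the raw string.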