fix(opencode): use openai-compatible SDK for e2e LLM mock routing

The e2eURL() path used createOpenAI().responses(), which speaks the
Responses API format; the mock server's SSE stream can't convey tool
calls correctly in that format. Switch to
createOpenAICompatible().chatModel(), which uses the chat completions
format, matching what the mock was designed for and what the
integration tests use.

This only affects the OPENCODE_E2E_LLM_URL code path (e2e tests only).
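For context, the chat completions streaming format delivers tool calls as
delta chunks over SSE. A minimal sketch of parsing one such chunk follows;
the sample payload and the toolCallName helper are illustrative, not taken
from the codebase, though the field names follow the OpenAI Chat
Completions API:

```typescript
// Illustrative sketch: a chat-completions SSE data line carrying a
// streamed tool call, and a helper that pulls out the tool name.
// The chunk shape (choices[].delta.tool_calls[]) is what the mock
// server is designed to emit.
const sseLine =
  'data: {"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_1","type":"function","function":{"name":"read_file","arguments":"{\\"path\\":\\"a.txt\\"}"}}]},"finish_reason":null}]}'

function toolCallName(line: string): string | undefined {
  // Strip the SSE "data: " framing, then read the delta payload.
  const payload = line.replace(/^data: /, "")
  const chunk = JSON.parse(payload)
  return chunk.choices?.[0]?.delta?.tool_calls?.[0]?.function?.name
}

console.log(toolCallName(sseLine)) // → read_file
```

The Responses API streams a different event vocabulary (typed events such
as response.output_item.added rather than choice deltas), which is why the
mock's SSE stream could not carry tool calls for the .responses() model.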
pull/20593/head
Kit Langton 2026-04-02 12:49:28 -04:00
parent 9a87b785e6
commit d603d9da65
1 changed file with 2 additions and 2 deletions


@@ -1458,11 +1458,11 @@ export namespace Provider {
     return yield* Effect.promise(async () => {
       const url = e2eURL()
       if (url) {
-        const language = createOpenAI({
+        const language = createOpenAICompatible({
           name: model.providerID,
           apiKey: "test-key",
           baseURL: url,
-        }).responses(model.api.id)
+        }).chatModel(model.api.id)
         s.models.set(key, language)
         return language
       }