
fix(ai-adapters): surface provider code 1305 in OpenAI stream errors#557

Draft
cooleryu wants to merge 1 commit into GCWing:main from cooleryu:fix-openai-stream-error-details

Conversation

@cooleryu

Summary

Preserve provider error metadata from OpenAI-compatible SSE error payloads.

When a provider sends an SSE data payload like:

{"error":{"code":"1305","message":"provider temporarily overloaded"},"request_id":"req_1305"}

BitFun already recognizes it as an API error before deserializing it as a normal chat chunk. This follow-up keeps the provider code and request_id in the surfaced error message instead of reducing the error to message-only text.
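A minimal sketch of how the surfaced message might be composed once the error payload has been parsed. The function name and signature here are illustrative, not BitFun's actual API; only the payload field names (`code`, `message`, `request_id`) come from the example above.

```rust
/// Illustrative helper: build the surfaced error text from fields
/// extracted out of an OpenAI-compatible SSE error payload.
/// `code` and `request_id` are optional and only appended when present.
fn format_stream_error(
    message: &str,
    code: Option<&str>,
    request_id: Option<&str>,
) -> String {
    let mut out = String::from(message);
    if let Some(code) = code {
        out.push_str(&format!(" (code: {code})"));
    }
    if let Some(id) = request_id {
        out.push_str(&format!(" (request_id: {id})"));
    }
    out
}

fn main() {
    // Fields from the payload above: code "1305", request_id "req_1305".
    let msg = format_stream_error(
        "provider temporarily overloaded",
        Some("1305"),
        Some("req_1305"),
    );
    println!("{msg}");
    // -> provider temporarily overloaded (code: 1305) (request_id: req_1305)
}
```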

Motivation

Issue #380 shows a provider overload response where the useful debugging fields were:

  • code
  • message
  • request_id

Keeping those fields visible makes model overload, quota, gateway, and provider-side failures easier to diagnose from BitFun's runtime errors and raw SSE logs.

Changes

  • Include error.code in OpenAI-compatible SSE API error messages when present.
  • Include top-level request_id / requestId when present.
  • Keep existing behavior unchanged for message-only and string-shaped error payloads.
  • Add an adapter unit test for the code + message + request_id shape.
  • Add a bitfun-core stream fixture test so existing CI covers the end-to-end StreamProcessor error path.
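The backward-compatibility bullets above can be sketched as a dispatch over the two payload shapes. The enum and function below are hypothetical stand-ins for BitFun's actual types, shown only to make the "unchanged for message-only and string-shaped payloads" behavior concrete.

```rust
/// Illustrative sketch (not BitFun's real types) of the two SSE error
/// payload shapes the change distinguishes.
enum ErrorPayload {
    /// Object shape: {"error":{"code":...,"message":...},"request_id":...}
    Object {
        message: String,
        code: Option<String>,
        request_id: Option<String>,
    },
    /// String shape: {"error":"some message"} -- behavior unchanged.
    Text(String),
}

fn surface(payload: ErrorPayload) -> String {
    match payload {
        ErrorPayload::Object { message, code, request_id } => {
            let mut out = message;
            // New behavior: append provider code and request_id when present.
            if let Some(c) = code {
                out.push_str(&format!(" (code: {c})"));
            }
            if let Some(id) = request_id {
                out.push_str(&format!(" (request_id: {id})"));
            }
            out
        }
        // Unchanged: message-only / string-shaped payloads pass through.
        ErrorPayload::Text(message) => message,
    }
}

fn main() {
    let obj = ErrorPayload::Object {
        message: "provider temporarily overloaded".into(),
        code: Some("1305".into()),
        request_id: Some("req_1305".into()),
    };
    println!("{}", surface(obj));
    println!("{}", surface(ErrorPayload::Text("plain message".into())));
}
```

A message-only object (both options `None`) takes the same `Object` arm but appends nothing, so its output is identical to today's.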

Test Plan

git diff --check

Result: passed.

Attempted focused Rust tests:

cargo test -p bitfun-ai-adapters stream::stream_handler::openai::tests::extracts_api_error_code_message_and_request_id_from_object_shape
cargo test -p bitfun-core --test stream_processor_openai openai_fixture_preserves_provider_error_code_and_request_id

Result: not run locally because this machine does not currently have cargo / rustc installed. The new core integration test is under src/crates/core/tests, which is covered by the existing Linux CI job:

cargo test --locked -p bitfun-core

Risk

Low. This only changes formatting for OpenAI-compatible stream error payloads that already contain an error object. Normal chat completion chunks, tool-call chunks, and message-only error payloads keep their current behavior.

Related Issue

Refs #380.

@GCWing GCWing requested a review from wsp1911 on April 27, 2026 at 06:47
Collaborator


The content of provider_error_with_code.sse needs to be changed to:

data: {"error":{"code":"1305","message":"provider temporarily overloaded"},"request_id":"req_1305"}


