
fix: accept fenced JSON output in raw JSON mode for OpenAI-compatible backends#444

Open
Samay10 wants to merge 3 commits into google:main from Samay10:fix-vllm-openai-json-compatibility

Conversation


@Samay10 Samay10 commented Apr 15, 2026

Description

Fixes issue #414, where vLLM/lmdeploy OpenAI-compatible backends may return fenced JSON output even when LangExtract is configured for raw JSON mode. This change updates langextract/core/format_handler.py so that FormatHandler._extract_content() detects and strips a single fenced JSON block when use_fences=False, and adds a regression test covering fenced JSON output in raw JSON mode.
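For context, the behavior described above can be sketched as follows. This is an illustrative standalone helper, not the actual code in langextract/core/format_handler.py; the function name and regex are assumptions:

```python
import re

# Matches text that consists entirely of one fenced block, optionally
# tagged "json". Illustrative only; the real FormatHandler logic may differ.
_SINGLE_FENCE_RE = re.compile(
    r"\A\s*```(?:json)?\s*\n(.*?)\n?```\s*\Z", re.DOTALL
)


def strip_single_fence(text: str) -> str:
    """Return the fence body if the whole text is one fenced block.

    Otherwise return the text unchanged, so well-behaved raw JSON
    output passes through untouched.
    """
    match = _SINGLE_FENCE_RE.match(text)
    return match.group(1) if match else text
```

With this shape, a backend that wraps its answer in a fence (e.g. `` ```json\n{"a": 1}\n``` ``) yields the inner `{"a": 1}`, while plain raw JSON is returned as-is.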

Fixes #414

Bug fix

How Has This Been Tested?

  • Added regression test in tests/format_handler_test.py
  • Verified modified files compile with python3 -m py_compile
  • The full unit test suite was not run in the current environment due to a missing absl dependency

Checklist:

  • I have read and acknowledged Google's Open Source Code of Conduct.
  • I have read the Contributing page, and I either signed the Google Individual CLA or am covered by my company's Corporate CLA.
  • I have discussed my proposed solution with code owners in the linked issue(s) and we have agreed upon the general approach.
  • I have made any needed documentation changes, or noted in the linked issue(s) that documentation elsewhere needs updating.
  • I have added tests, or I have ensured existing tests cover the changes.
  • I have followed Google's Python Style Guide and ran pylint over the affected code.

@github-actions github-actions bot added the size/XS Pull request with less than 50 lines changed label Apr 15, 2026
Samay10 added 2 commits April 15, 2026 23:19
- Precompute strip_text to consolidate duplicate returns (7->6)
- Fixes lint-src CI failure for google#414
@github-actions

⚠️ Branch Update Required

Your branch is 1 commit behind main. Please update your branch to ensure CI checks run with the latest code:

git fetch origin main
git merge origin/main
git push

Note: Enable "Allow edits by maintainers" to allow automatic updates.


Labels

size/XS Pull request with less than 50 lines changed


Development

Successfully merging this pull request may close these issues.

ResolverParsingError when using vLLM/lmdeploy as backend (Qwen3-32B-GPTQ)

1 participant