
Conversation


@krrishdholakia krrishdholakia commented Nov 27, 2025

Title

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement, see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

Changes


Note

Adds a Generic Guardrail API hook enabling external providers via a simple HTTP endpoint, plus a mock server, docs, and example configs.

  • Guardrails (Proxy Integration):
    • Introduces generic_guardrail_api hook with request/response models and async HTTP client (generic_guardrail_api.py, __init__.py).
    • Registers new integration in SupportedGuardrailIntegrations and extends LitellmParams with additional_provider_specific_params.
    • Adds config models/types for the new guardrail.
  • Mock Server:
    • Adds FastAPI mock Bedrock Guardrail server with Bedrock-compatible apply endpoint and POST /beta/litellm_basic_guardrail_api for the Generic Guardrail API, plus config and health endpoints.
  • Configuration:
    • Provides example config (example_config.yaml) and updates _new_secret_config.yaml to demonstrate using generic_guardrail_api with headers and api_base.
  • Docs:
    • New guide adding_provider/generic_guardrail_api.md and sidebar entry explaining the API contract, usage, and integration steps.

Written by Cursor Bugbot for commit f076e98.

Allows guardrail providers to integrate with LiteLLM without needing to make a PR to LiteLLM.
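To make that contract concrete, here is a hedged sketch of the kind of HTTP handler an external provider could expose. The payload and response field names (texts, action) are illustrative assumptions, not the canonical schema; the real contract is defined in the new adding_provider/generic_guardrail_api.md guide.

```python
# Toy external guardrail handler, written as a plain function so it can be
# mounted behind any HTTP framework. Field names ("texts", "action") are
# illustrative assumptions, not the canonical Generic Guardrail API schema.

def handle_guardrail_request(payload: dict) -> dict:
    """Block any text containing 'secret'; otherwise pass texts through unchanged."""
    texts = payload.get("texts", [])
    if any("secret" in t.lower() for t in texts):
        # Signal the proxy to reject the request.
        return {"action": "BLOCKED", "texts": texts}
    # No violation: return the texts unchanged.
    return {"action": "NONE", "texts": texts}
```

Mounted behind a route such as POST /beta/litellm_basic_guardrail_api (the path the mock server in this PR uses), a handler like this is enough for the proxy to call out to an external guardrail without any code change inside LiteLLM itself.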

vercel bot commented Nov 27, 2025

The latest updates on your projects.

Project: litellm
Deployment: Ready (Preview)
Updated (UTC): Nov 27, 2025 3:34am


@cursor cursor bot left a comment


guardrail_name=guardrail.get("guardrail_name", ""),
event_hook=litellm_params.mode,
default_on=litellm_params.default_on,
)

Bug: Headers from config not passed to GenericGuardrailAPI

The initialize_guardrail function creates a GenericGuardrailAPI instance without passing the headers parameter from the configuration. The GenericGuardrailAPI.__init__ expects headers as a parameter (used for authentication), but only api_base, api_key, and additional_provider_specific_params are passed. Since headers isn't forwarded, any authentication headers (like Authorization: Bearer ...) configured in the YAML will be ignored, and API calls to the external guardrail will fail authentication. The api_key being passed is also unused since GenericGuardrailAPI doesn't have an api_key parameter—it expects authentication via headers.
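A minimal sketch of the fix the comment describes, with GenericGuardrailAPI stubbed out for illustration only (the real class lives in generic_guardrail_api.py; the parameter names here follow the review comment, not verified source):

```python
# Illustration of the suggested fix: forward `headers` from the parsed config
# to the guardrail client. GenericGuardrailAPI is a stub standing in for the
# real class, and litellm_params is modeled as a dict for simplicity.

class GenericGuardrailAPI:
    def __init__(self, api_base, headers=None, additional_provider_specific_params=None):
        self.api_base = api_base
        # Auth headers (e.g. {"Authorization": "Bearer ..."}) must be stored;
        # if they are dropped, calls to the external guardrail fail authentication.
        self.headers = headers or {}
        self.additional_provider_specific_params = additional_provider_specific_params or {}

def initialize_guardrail(litellm_params: dict) -> GenericGuardrailAPI:
    # Before the fix, `headers` was never passed through; pass it explicitly.
    return GenericGuardrailAPI(
        api_base=litellm_params["api_base"],
        headers=litellm_params.get("headers"),
        additional_provider_specific_params=litellm_params.get(
            "additional_provider_specific_params"
        ),
    )
```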


text=text,
request_body={},
additional_provider_specific_params=additional_params,
)

Bug: Request body always empty despite documented context

The apply_guardrail method always passes an empty dict {} for request_body when creating GenericGuardrailAPIRequest, even though request_data is available as a parameter and is already being used to extract dynamic parameters. The documentation states that request_body should contain the "full original request for context", but the external guardrail API will always receive an empty object, preventing it from making context-aware decisions based on the original request data.
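A hedged sketch of the fix: forward request_data instead of a literal {}. GenericGuardrailAPIRequest is modeled here as a plain dict purely for illustration:

```python
# Illustration only: the real code builds a GenericGuardrailAPIRequest model;
# a dict stands in for it here.

def build_guardrail_request(text: str, request_data: dict, additional_params: dict) -> dict:
    return {
        "text": text,
        # Was request_body={} - the full original request is now forwarded so
        # the external guardrail can make context-aware decisions.
        "request_body": request_data,
        "additional_provider_specific_params": additional_params,
    }
```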


action = "ANONYMIZED" if GUARDRAIL_CONFIG.anonymize_pii else "BLOCKED"

for pii_type, pattern in GUARDRAIL_CONFIG.pii_patterns.items():
matches = re.finditer(pattern, text)

Check failure

Code scanning / CodeQL

Regular expression injection (High)

This regular expression depends on a user-provided value and is executed by re.finditer.

Copilot Autofix

To fix this vulnerability, sanitize all user-supplied regular expression patterns before using them in re.finditer. The recommended approach is to escape the patterns with re.escape, which converts user input into literal strings safe for use in regular expressions (preventing regex metacharacters from having special meaning). The sanitization should occur right before pattern usage, in the loop over GUARDRAIL_CONFIG.pii_patterns.items() in the check_pii function. The change should only affect the usage of pattern so that the functionality (detecting substrings as PII) remains the same, but only literal matches are performed.

What to change:

  • In check_pii, update the assignment of pattern so that re.finditer uses re.escape(pattern) instead of the raw user-supplied pattern.

Methods/Imports/Definitions needed:

  • Since re is already imported, no additional imports are needed.

Suggested changeset 1
cookbook/mock_guardrail_server/mock_bedrock_guardrail_server.py

Autofix patch
Run the following command in your local git repository to apply this patch:
cat << 'EOF' | git apply
diff --git a/cookbook/mock_guardrail_server/mock_bedrock_guardrail_server.py b/cookbook/mock_guardrail_server/mock_bedrock_guardrail_server.py
--- a/cookbook/mock_guardrail_server/mock_bedrock_guardrail_server.py
+++ b/cookbook/mock_guardrail_server/mock_bedrock_guardrail_server.py
@@ -249,7 +249,8 @@
     action = "ANONYMIZED" if GUARDRAIL_CONFIG.anonymize_pii else "BLOCKED"
 
     for pii_type, pattern in GUARDRAIL_CONFIG.pii_patterns.items():
-        matches = re.finditer(pattern, text)
+        safe_pattern = re.escape(pattern)
+        matches = re.finditer(safe_pattern, text)
         for match in matches:
             matched_text = match.group()
             pii_entities.append(
EOF
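Why the escape matters can be seen in a few lines: when a config value meant as a literal substring is fed to re.finditer unescaped, regex metacharacters change what gets matched.

```python
import re

# A config value intended as a literal substring to detect, not a regex.
user_value = "a+b"
text = "the key a+b appears here, and aab too"

# Unescaped: '+' acts as a one-or-more quantifier, so the literal "a+b"
# is missed and "aab" is matched instead.
raw_matches = [m.group() for m in re.finditer(user_value, text)]

# Escaped: metacharacters are neutralized and only the literal string matches.
safe_matches = [m.group() for m in re.finditer(re.escape(user_value), text)]
```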
