Describe the bug
`guardrails configure` crashes in `llm_providers.py` because it references the missing `openai.error` attribute?!
To Reproduce
Steps to reproduce the behavior:
- `uv add guardrails-ai`
- `guardrails configure`
Expected behavior
The configuration prompt for the hub / API key.
Library version:
Version 0.1.8
`uv add` installs version 0.1.8 instead of 0.6.4 🤷
Additional context
There seems to be an error with your openai import.
Error:
```
  File ".venv/lib/python3.12/site-packages/guardrails/llm_providers.py", line 17, in <module>
    openai.error.APIConnectionError,
    ^^^^^^^^^^^^
AttributeError: module 'openai' has no attribute 'error'
```
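For context, this looks like the openai >= 1.0 breaking change: the `error` submodule was removed and the exception classes now live at the package top level. A quick check (assuming openai >= 1.0 is what ends up installed, which the traceback above suggests):

```python
import openai

print(openai.__version__)   # prints the installed version (1.x here, given the error above)
hasattr(openai, "error")    # False on openai >= 1.0: the error submodule was removed
openai.APIConnectionError   # the exception classes are now top-level attributes
openai.APITimeoutError
openai.RateLimitError
```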
This one should work instead:

`guardrails/llm_providers.py`

```python
import openai

OPENAI_RETRYABLE_ERRORS = [
    openai.APIConnectionError,
    openai.APIError,
    # openai.TryAgain,               # removed in openai >= 1.0
    openai.APITimeoutError,          # openai >= 1.0 name; was openai.error.Timeout before 1.0
    openai.RateLimitError,
    # openai.ServiceUnavailableError,  # removed in openai >= 1.0
]
```
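If backwards compatibility with openai < 1.0 matters, a version-tolerant sketch could collect whichever exception classes the installed package actually provides (the `_collect_retryable` helper below is hypothetical, not part of guardrails):

```python
import openai

# Sketch: pick whichever retryable exception classes exist in the installed
# openai package. Pre-1.0 they lived under openai.error (Timeout,
# ServiceUnavailableError, ...); from 1.0 on they are top-level attributes
# (APITimeoutError, APIConnectionError, ...).
def _collect_retryable(module):
    candidate_names = [
        "APIConnectionError",
        "APIError",
        "APITimeoutError",          # openai >= 1.0
        "Timeout",                  # openai < 1.0
        "RateLimitError",
        "ServiceUnavailableError",  # openai < 1.0
    ]
    return [getattr(module, name) for name in candidate_names if hasattr(module, name)]

# openai.error exists only on < 1.0; fall back to the top-level module on >= 1.0.
OPENAI_RETRYABLE_ERRORS = _collect_retryable(getattr(openai, "error", openai))
```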
Once that is fixed, I am greeted with this error:

```
    from griffe.dataclasses import Docstring
ModuleNotFoundError: No module named 'griffe.dataclasses'
```
An upgrade fixed it; it seems like the default version resolution is somehow messed up?

```
pip install --upgrade guardrails-ai
```
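For anyone hitting the griffe error before they can upgrade, a compatibility-import sketch (assuming griffe >= 1.0 re-exports `Docstring` from the top-level package, while older releases keep it in `griffe.dataclasses`):

```python
# Not the guardrails code, just a workaround sketch for the griffe import change.
try:
    from griffe import Docstring              # griffe >= 1.0
except ImportError:
    from griffe.dataclasses import Docstring  # older griffe releases
```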