
Conversation

chuanli11
Collaborator

This pull request fixes #52.

  • Added Lambda API support end-to-end:
    • Introduced LAMBDA_API_KEY in .env.template and loaded it in config.py.
  • Implemented llm/lambda_client.py using an OpenAI-compatible client pointed at https://api.lambda.ai/v1 with the model set to deepseek-llama3.3-70b. The client exposes generate_response with proper message formatting and retries (a sketch appears after this list).
  • Integrated the new client into the app:
    • main.py registers "deepseek" as a panelist LLM, adds a user-facing name, and initializes the client with the Lambda API key.
    • TurnManager includes "deepseek" in panelist_ids so it participates in the panel discussion.
    • The UI gains a color mapping for "deepseek".
  • Updated tests to cover client initialization and (optionally) real API integration, and to assert that the new panelist is present.
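A minimal sketch of what llm/lambda_client.py could look like based on the description above. The base URL and model name come from this PR; the class name, constructor parameters, retry count, and the exact generate_response signature are illustrative assumptions, not the merged code.

```python
# llm/lambda_client.py -- minimal sketch; names other than the base URL and
# model are assumptions for illustration.
import time
from openai import OpenAI

LAMBDA_BASE_URL = "https://api.lambda.ai/v1"
MODEL_NAME = "deepseek-llama3.3-70b"


class LambdaClient:
    def __init__(self, api_key: str, max_retries: int = 3):
        # The Lambda inference API is OpenAI-compatible, so the standard
        # OpenAI client works once pointed at the Lambda base URL.
        self.client = OpenAI(api_key=api_key, base_url=LAMBDA_BASE_URL)
        self.max_retries = max_retries

    def generate_response(self, system_prompt: str, history: list[dict]) -> str:
        # Prepend the system prompt, then pass the prior turns through unchanged.
        messages = [{"role": "system", "content": system_prompt}] + history
        for attempt in range(self.max_retries):
            try:
                completion = self.client.chat.completions.create(
                    model=MODEL_NAME,
                    messages=messages,
                )
                return completion.choices[0].message.content
            except Exception:
                # Simple exponential backoff; re-raise on the final attempt.
                if attempt == self.max_retries - 1:
                    raise
                time.sleep(2 ** attempt)
```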

These changes make the deepseek-llama3.3-70b model from the Lambda inference API available and have it actively participate as a panelist in the discussion. A sketch of how the pieces are wired together follows.
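The sketch below shows one way the registration could look, assuming the app keeps simple dicts and lists for panelist clients, display names, turn order, and colors. The "deepseek" key and the module paths come from the PR description; every other identifier, the existing panelist ids, the display name, and the color value are illustrative assumptions.

```python
# Wiring sketch, assuming main.py keeps a registry of panelist clients keyed by id.
from config import LAMBDA_API_KEY          # loaded from .env (LAMBDA_API_KEY)
from llm.lambda_client import LambdaClient  # class name as in the sketch above

# main.py: register "deepseek" as a panelist and give it a user-facing name.
panelist_clients = {
    # ... existing panelists ...
    "deepseek": LambdaClient(api_key=LAMBDA_API_KEY),
}
panelist_names = {
    # ... existing names ...
    "deepseek": "DeepSeek (Llama 3.3 70B)",  # display name is illustrative
}

# TurnManager: the new id joins the list of panelists taking turns.
panelist_ids = ["gpt", "claude", "deepseek"]  # existing ids are illustrative

# UI: color mapping so the new panelist is visually distinct.
PANELIST_COLORS = {
    # ... existing colors ...
    "deepseek": "#4B6BFB",  # any unused color works
}
```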

Automatic fix generated by OpenHands 🙌
