Merge latest changes in prep for PyPI publish (#14)

* Remove launch.json
* chore: Refactor SlackReceiver to handle channel events and join new channels
* Add the ability for a component to send a message directly to a named flow
* Add the ability to stream partial LLM responses to a named flow
* Add a configurable message that is sent to the Slack channel that sent a message to slack_input
* Add the ability to clear the ack message and stream partial results to that ack message
* feat: Update the default stream_batch_size to 15 in LangChainChatModelWithHistory; also started converting absolute import paths to relative ones
* Update the import statement in main.py to use an absolute import for `SolaceAiConnector` from `solace_ai_connector.solace_ai_connector`, improving code organization and ensuring proper module resolution
* Another major reorganization of the directory structure to make it more sensible, as part of a cleanup before releasing to PyPI
* Fix documentation generation after the package reorganization
* chore: Remove the unused slack.yaml configuration file
* Make dynamic loading in utils.py explicit: modules are no longer discovered via a directory search. Also promote the gen_docs tool to an installed script so that plugins can use it
* Move the Slack components into their own plugin, solace-ai-connector-slack. Adjust module importing from config files to be more plugin-friendly, and add a 'component_package' property that is auto-installed when specified and the package is not present
* chore: Add flow_lock_manager and flow_kv_store attributes to component_base.py, so components can use the flow-level lock manager to synchronize access to shared resources and the key-value store to share data across components within the flow
* Change the default trust_store location for the Solace API to the one provided by the certifi module, and add a configuration item to the ChatModel with History component to limit the size of entries added to the chat history
* chore: Update trust_store_path for the Solace API in solace_messaging.py to the certifi-provided default, improving the security and reliability of the integration
* Bump to the latest Solace API and fix a small issue in a debug log
* DATAGO-79372: Add Publish workflow (#3)
* DATAGO-78654: Add CI (#4)

Co-authored-by: Aman Riat <[email protected]>
Co-authored-by: Art Morozov <[email protected]>
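The flow-level lock manager and key-value store described above can be pictured with a hypothetical sketch; the class and method names (`FlowLockManager`, `FlowKVStore`, `get_lock`) are illustrative, not the connector's actual API:

```python
# Hypothetical sketch of a flow-level lock manager and key-value store.
# Names here are illustrative and do not reflect the real component_base.py API.
import threading

class FlowLockManager:
    """Hands out named locks so components in one flow can synchronize."""
    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def get_lock(self, name: str) -> threading.Lock:
        # setdefault under a guard so two components asking for the same
        # name always receive the same lock object
        with self._guard:
            return self._locks.setdefault(name, threading.Lock())

class FlowKVStore:
    """Shared key-value store for components within a single flow."""
    def __init__(self):
        self._data = {}

    def set(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)

# Two components in the same flow sharing state under a named lock:
locks, kv = FlowLockManager(), FlowKVStore()
with locks.get_lock("chat_history"):
    kv.set("history_size", 10)
print(kv.get("history_size"))  # → 10
```

The key point is that components never create locks themselves; they ask the flow for a lock by name, which guarantees that all components coordinating on the same resource contend on the same lock object.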
1 parent 1fe3952 · commit 1c39679 · 26 changed files with 824 additions and 881 deletions
New file (+156 lines): CI workflow

```yaml
name: CI
on:
  push:
    branches:
      - main
  pull_request:
    types: [opened, synchronize]

permissions:
  id-token: write
  checks: write
  issues: read
  pull-requests: write

jobs:
  test:
    runs-on: ubuntu-latest
    env:
      HATCH_CACHE_DIR: ${{ github.workspace }}/.hatch_cache
      HATCH_DATA_DIR: ${{ github.workspace }}/.hatch_data

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Install Hatch
        uses: pypa/hatch@install

      - name: Restore Hatch Directory
        uses: actions/cache/restore@v4
        id: cache-restore
        with:
          path: |
            ${{ env.HATCH_CACHE_DIR }}
            ${{ env.HATCH_DATA_DIR }}
          key: ${{ runner.os }}-hatch-${{ hashFiles('pyproject.toml','requirements.txt') }}

      - name: Install Dependencies
        if: steps.cache-restore.outputs.cache-hit != 'true'
        run: |
          hatch python install 3.8 3.12

      - name: Install Dependencies
        if: steps.cache-restore.outputs.cache-hit != 'true'
        run: |
          hatch env create test

      - name: Cache Hatch Directory
        uses: actions/cache/save@v4
        if: steps.cache-restore.outputs.cache-hit != 'true'
        id: cache-hatch
        with:
          path: |
            ${{ env.HATCH_CACHE_DIR }}
            ${{ env.HATCH_DATA_DIR }}
          key: ${{ runner.os }}-hatch-${{ hashFiles('pyproject.toml','requirements.txt') }}

      - name: Set up Docker Buildx
        id: builder
        uses: docker/setup-buildx-action@v3

      - name: Prepare env file
        run: |
          cp .env_template .env
        shell: bash

      - name: Build Docker Image
        uses: docker/build-push-action@v6
        with:
          push: false
          tags: solace/solace-ai-connector:local
          platforms: linux/amd64
          builder: ${{ steps.builder.outputs.name }}
          load: true

      - name: Run Lint
        continue-on-error: true
        run: |
          hatch run +py=312 lint:ruff check -o lint.json --output-format json ./src ./tests
        shell: bash

      - name: Run Structured Tests
        run: |
          hatch run +py=312 test:make structure-test
        shell: bash

      - name: Run Unit Tests
        shell: bash
        run: |
          hatch test --cover --all --parallel --junitxml=junit.xml

      - name: Combine Coverage Reports
        continue-on-error: true
        run: |
          hatch run +py=312 test:coverage combine
        shell: bash

      - name: Report coverage
        run: |
          hatch run +py=312 test:coverage xml
        shell: bash

      - name: SonarQube Scan
        if: always()
        uses: sonarsource/[email protected]
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
          SONAR_HOST_URL: ${{ vars.SONAR_HOST_URL }}
        with:
          args: >
            -Dsonar.tests=tests/
            -Dsonar.verbose=true
            -Dsonar.sources=src/
            -Dsonar.projectKey=${{ github.repository_owner }}_${{ github.event.repository.name }}
            -Dsonar.python.coverage.reportPaths=coverage.xml
            -Dsonar.python.ruff.reportPaths=lint.json

      - name: SonarQube Quality Gate check
        id: sonarqube-quality-gate-check
        uses: sonarsource/sonarqube-quality-gate-action@master
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
          SONAR_HOST_URL: ${{ vars.SONAR_HOST_URL }}

      # Build and verify packages
      - name: Build
        run: hatch build

      - name: Verify Packages
        run: |
          ls dist/*.tar.gz | hatch run +py=312 test:xargs -n1 twine check
          ls dist/*.whl | hatch run +py=312 test:xargs -n1 twine check
        shell: bash

      - name: Surface failing tests
        if: always()
        uses: pmeier/pytest-results-action@main
        with:
          # A list of JUnit XML files, directories containing the former, and wildcard
          # patterns to process.
          # See @actions/glob for supported patterns.
          path: junit.xml

          # (Optional) Add a summary of the results at the top of the report
          summary: true

          # (Optional) Select which results should be included in the report.
          # Follows the same syntax as `pytest -r`
          display-options: fEX

          # (Optional) Fail the workflow if no JUnit XML was found.
          fail-on-empty: true

          # (Optional) Title of the test results section in the workflow summary
          title: Unit Test results
```
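The Hatch cache key above fingerprints `pyproject.toml` and `requirements.txt` with `hashFiles(...)`, so any dependency change invalidates the cache. A rough Python analogue of that fingerprint (illustrative only; not the exact algorithm GitHub Actions uses):

```python
# Sketch of a hashFiles()-style cache key: one digest over the files' bytes.
# Any edit to either file produces a new key, busting the cache.
import hashlib
import tempfile
from pathlib import Path

def hash_files(*paths: Path) -> str:
    """Combine several files into a single SHA-256 hex digest."""
    digest = hashlib.sha256()
    for p in paths:
        digest.update(p.read_bytes())
    return digest.hexdigest()

with tempfile.TemporaryDirectory() as tmp:
    py = Path(tmp, "pyproject.toml")
    req = Path(tmp, "requirements.txt")
    py.write_text("[project]\nname = 'demo'\n")
    req.write_text("certifi\n")
    key1 = hash_files(py, req)

    req.write_text("certifi\nrequests\n")  # a dependency change...
    key2 = hash_files(py, req)             # ...yields a different key

print(key1 != key2)  # → True
```

Hashing only the dependency manifests (rather than, say, the source tree) is what lets the restore/save pair skip `hatch python install` and `hatch env create` entirely on unchanged dependencies.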
New file (+74 lines): Release workflow

```yaml
name: Release
on:
  workflow_dispatch:
    inputs:
      version:
        type: choice
        required: true
        description: "Version bump type"
        options:
          - patch
          - minor
          - major

jobs:
  release:
    name: Release
    timeout-minutes: 20
    runs-on: ubuntu-latest
    environment:
      name: pypi
      url: https://pypi.org/p/solace_ai_connector
    permissions:
      id-token: write
      contents: write

    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          ssh-key: ${{ secrets.COMMIT_KEY }}

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.x'

      - name: Install hatch
        run: |
          pip install --upgrade pip
          pip install hatch

      - name: Bump Version
        run: |
          CURRENT_VERSION=$(hatch version)
          echo "CURRENT_VERSION=${CURRENT_VERSION}" >> $GITHUB_ENV
          hatch version "${{ github.event.inputs.version }}"
          NEW_VERSION=$(hatch version)
          echo "NEW_VERSION=${NEW_VERSION}" >> $GITHUB_ENV

      - name: Fail if the current version doesn't exist
        if: env.CURRENT_VERSION == ''
        run: exit 1

      - name: Build project for distribution
        run: hatch build

      - name: Publish package distributions to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1

      - name: Create Release
        uses: ncipollo/release-action@v1
        with:
          artifacts: "dist/*.whl"
          makeLatest: true
          generateReleaseNotes: true
          tag: ${{ env.CURRENT_VERSION }}

      - name: Commit new version
        run: |
          git config --local user.email "[email protected]"
          git config --local user.name "GitHub Action"
          git commit -a -m "[ci skip] Bump version to $NEW_VERSION"
          git push
```
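The `hatch version patch|minor|major` call in the Bump Version step applies the standard semantic-versioning rule for the chosen part. A minimal sketch of that rule (illustrative; not hatch's actual implementation):

```python
# Semantic-versioning bump for a MAJOR.MINOR.PATCH string:
# the chosen part increments and everything to its right resets to zero.
def bump(version: str, part: str) -> str:
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown version part: {part}")

print(bump("1.2.3", "minor"))  # → 1.3.0
```

Reading `CURRENT_VERSION` before the bump and `NEW_VERSION` after it lets the workflow tag the release with the old version while committing the new one back to the repository.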