If we add another, optional LLM call - for example, one that returns tags and a summary of the file processed by the main LLM call - we could use this data as a naming strategy for the storage adapters - related to #10
If anyone is interested in making this feature happen, let me know and I can specify the details.
Related to CatchTheTornado#16
Add an optional LLM call for generating tags and a summary of the file. Rough sketches of the proposed changes are included after the list below.
* **app/main.py**
- Add a new endpoint `/llm_tags_summary` to generate tags and summary using the LLM.
- Update the `OllamaGenerateRequest` class to include a new field `generate_tags_summary`.
- Update the `generate_llama` function to handle the new `generate_tags_summary` field.
* **app/tasks.py**
- Add a new function `generate_tags_summary` to generate tags and summary using the LLM.
- Update the `ocr_task` function to include an optional call to `generate_tags_summary` after extracting text.
* **client/cli.py**
- Add a new command `llm_tags_summary` for generating tags and summary.
- Update the `main` function to handle the new `llm_tags_summary` command.
* **.env.example**
- Add a new environment variable `LLM_TAGS_SUMMARY_API_URL`.
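For the `app/main.py` part, here is a minimal sketch of how the request model and the new endpoint could look. Only the `/llm_tags_summary` path and the `generate_tags_summary` field come from the plan above; the `TagsSummaryRequest` body and the way it delegates to the task module are assumptions, not the final implementation.

```python
# Sketch only: assumes the existing FastAPI app and pydantic models in app/main.py.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class OllamaGenerateRequest(BaseModel):
    model: str
    prompt: str
    # New optional flag: also produce tags and a summary for the extracted text.
    generate_tags_summary: bool = False

class TagsSummaryRequest(BaseModel):
    # Hypothetical request body for the new endpoint.
    model: str
    text: str

@app.post("/llm_tags_summary")
async def llm_tags_summary(request: TagsSummaryRequest):
    # Delegate to the helper in app/tasks.py (sketched below).
    from app.tasks import generate_tags_summary
    return generate_tags_summary(request.text, model=request.model)
```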
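For `app/tasks.py` and the new environment variable, a sketch of `generate_tags_summary` using the `ollama` Python client. The prompt, the default model name, and the JSON contract are illustrative; how it hooks into `ocr_task` depends on the existing task signature.

```python
# Sketch only: assumes the ollama Python client already used by the project.
import json
import os

import ollama

# Proposed .env entry; falls back to a default local Ollama instance.
LLM_TAGS_SUMMARY_API_URL = os.getenv("LLM_TAGS_SUMMARY_API_URL", "http://localhost:11434")

def generate_tags_summary(text: str, model: str = "llama3.1") -> dict:
    """Ask the LLM for tags and a short summary of already-extracted text."""
    client = ollama.Client(host=LLM_TAGS_SUMMARY_API_URL)
    prompt = (
        "Return JSON with two keys: 'tags' (a list of short keywords) and "
        "'summary' (2-3 sentences) for the following document:\n\n" + text
    )
    response = client.generate(model=model, prompt=prompt)
    try:
        return json.loads(response["response"])
    except (KeyError, ValueError):
        # Fall back gracefully if the model does not return valid JSON.
        return {"tags": [], "summary": response.get("response", "")}

# Inside the existing ocr_task, after text extraction (pseudocode):
#   if request.generate_tags_summary:
#       meta = generate_tags_summary(extracted_text)
#       # meta["tags"] / meta["summary"] can then feed the storage adapters'
#       # file-naming strategy discussed in #10.
```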
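For `client/cli.py`, a rough sketch of the new subcommand, assuming the CLI talks to the FastAPI server over HTTP with `requests`; the option names and the default API URL are illustrative.

```python
# Sketch only: wires a new `llm_tags_summary` subcommand into an argparse-based CLI.
import argparse
import requests

API_URL = "http://localhost:8000"  # assumed default; the real CLI reads its own config

def main():
    parser = argparse.ArgumentParser()
    subparsers = parser.add_subparsers(dest="command")

    p = subparsers.add_parser("llm_tags_summary", help="Generate tags and a summary for extracted text")
    p.add_argument("--text", required=True, help="Text to tag and summarize")
    p.add_argument("--model", default="llama3.1", help="Ollama model to use")

    args = parser.parse_args()
    if args.command == "llm_tags_summary":
        resp = requests.post(
            f"{API_URL}/llm_tags_summary",
            json={"text": args.text, "model": args.model},
        )
        resp.raise_for_status()
        print(resp.json())

if __name__ == "__main__":
    main()
```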