[Feature][Customize] Customize Plugin: Non-incremental Import of qa_test_cases.csv Should Cascade Deletes to qa_apis and qa_test_case_executions #8444
Labels
component/plugins
improvement
type/feature-request
Use case
As a DevLake user leveraging the customize plugin to manage Quality Assurance (QA) data, I need to perform full updates of our test case inventory using qa_test_cases.csv. When doing so, I expect all associated QA data, including related API information (qa_apis) and test case execution history (qa_test_case_executions), to be refreshed consistently. This is crucial for maintaining data integrity and ensuring that our QA dashboards and metrics reflect only the current and relevant test case landscape, preventing analysis based on stale or orphaned data.
Description
Currently, when importing qa_test_cases.csv via the customize plugin's API in "non-incremental" mode, only the qa_test_cases domain-layer table is cleared before the new data is ingested.
The associated tables, qa_apis and qa_test_case_executions, are not cleared during this process. As a result, "dirty data" accumulates in qa_apis (and potentially qa_test_case_executions): records remain that refer to test cases no longer present in the current dataset, which undermines data integrity across the QA domain.
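To make the gap concrete, a quick orphan check after a non-incremental refresh could look like the minimal sketch below. This is not part of DevLake; the qa_test_case_id join column is an assumption about the domain-layer schema, and qa_apis would need an analogous check against whatever key links it to test cases.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// The DSN is a placeholder; point it at your DevLake database.
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/lake")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Count execution rows whose parent test case no longer exists after a
	// non-incremental re-import of qa_test_cases.csv. The join column
	// qa_test_case_id is a hypothetical name used for illustration only.
	var orphans int
	err = db.QueryRow(`
		SELECT COUNT(*)
		FROM qa_test_case_executions e
		LEFT JOIN qa_test_cases t ON t.id = e.qa_test_case_id
		WHERE t.id IS NULL`).Scan(&orphans)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("orphaned qa_test_case_executions rows: %d\n", orphans)
}
```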
Requested Change:
We request that the "non-incremental" mode for qa_test_cases.csv import via the customize plugin be enhanced. When qa_test_cases.csv is imported in this mode, the qa_apis and qa_test_case_executions tables should also be cleared (truncated) before new data related to the imported test cases is populated.
This will ensure that a "non-incremental" import truly refreshes the entire relevant QA data model, preventing the accumulation of stale or orphaned records in these dependent tables and maintaining data consistency across the QA domain.
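A rough sketch of the requested cleanup step is shown below, assuming plain SQL access rather than DevLake's internal dal abstraction and using the table names from this issue; the actual plugin would perform the equivalent deletes through its own data-access layer.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

// clearQaTables sketches the requested behavior: when qa_test_cases.csv is
// imported in non-incremental mode, wipe the dependent QA tables in the same
// transaction before loading the new rows, so no orphaned records survive.
func clearQaTables(tx *sql.Tx, incremental bool) error {
	if incremental {
		return nil // incremental imports keep existing rows untouched
	}
	for _, table := range []string{
		"qa_test_case_executions", // execution history of the test cases
		"qa_apis",                 // API records linked to the test cases
		"qa_test_cases",           // the test cases themselves
	} {
		if _, err := tx.Exec("DELETE FROM " + table); err != nil {
			return fmt.Errorf("clearing %s: %w", table, err)
		}
	}
	return nil
}

func main() {
	// The DSN is a placeholder; point it at your DevLake database.
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/lake")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	tx, err := db.Begin()
	if err != nil {
		log.Fatal(err)
	}
	if err := clearQaTables(tx, false); err != nil {
		tx.Rollback()
		log.Fatal(err)
	}
	// ... rows parsed from qa_test_cases.csv would be inserted here ...
	if err := tx.Commit(); err != nil {
		log.Fatal(err)
	}
}
```

Whether the deletes should be full truncations (as requested here) or scoped to the connection/project whose CSV is being re-imported is a design choice for the implementer; the sketch mirrors the request literally.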
Related issues
No response