
Imperva: Fix try-except blocks#2058

Open
squioc wants to merge 4 commits into develop from fix/ImpervaCode

Conversation

@squioc
Collaborator

@squioc squioc commented Feb 24, 2026

  • Fix try-except blocks to avoid unassigned variable errors
  • Track the number of forwarded events in the logs

Summary by Sourcery

Improve Imperva log fetching by safely handling file processing errors and reporting forwarded event counts per batch.

New Features:

  • Track and log the number of events successfully forwarded for each fetch batch.

Bug Fixes:

  • Prevent unassigned variable errors when file decryption or handling fails by returning consistent HandlingFileResult objects.

Enhancements:

  • Propagate the number of forwarded events from decrypted log handling up to the batch runner for aggregate logging.

Documentation:

  • Update the changelog with the new version and a description of the added tracking and error handling fixes.

Tests:

  • Adjust existing fetch logs tests to validate the new nb_forwarded_events field in HandlingFileResult.

Chores:

  • Bump the Imperva integration version in the manifest to 1.21.4.
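The try/except fix described above can be illustrated with a minimal sketch. The helpers and field names below are simplified stand-ins based on this summary, not the connector's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class HandlingFileResult:
    log_name: str
    successful: bool
    last_timestamp: Optional[int] = None
    nb_forwarded_events: int = 0


# Placeholder helpers standing in for the connector's real logic.
def decrypt_file(content: bytes) -> bytes:
    if not content:
        raise ValueError("empty or undecryptable file")
    return content


def forward_events(decrypted: bytes) -> int:
    return len(decrypted.splitlines())


def handle_file(log_name: str, content: bytes) -> HandlingFileResult:
    try:
        decrypted = decrypt_file(content)        # may raise
        nb_forwarded = forward_events(decrypted)  # may raise
        # The success result is returned inside the try block, so the
        # locals above are guaranteed to be assigned when they are read.
        return HandlingFileResult(
            log_name=log_name,
            successful=True,
            last_timestamp=1,
            nb_forwarded_events=nb_forwarded,
        )
    except Exception:
        # The failure path never reads the possibly-unassigned locals,
        # which is what the "unassigned variable" fix is about.
        return HandlingFileResult(log_name=log_name, successful=False, nb_forwarded_events=0)
```

The key point is that no variable assigned inside the try block is referenced after it; each path builds its own consistent result.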

@squioc squioc requested review from a team and Copilot February 24, 2026 10:50
@squioc squioc added the bug Something isn't working label Feb 24, 2026
@sourcery-ai
Contributor

sourcery-ai bot commented Feb 24, 2026

Reviewer's Guide

Refactors error handling around log file processing to ensure safe variable usage and adds tracking/aggregation of the number of forwarded events per batch, updating the public interface, logging, metrics, and tests accordingly.

Sequence diagram for updated batch log processing and event forwarding

sequenceDiagram
    participant Runner as Runner
    participant Fetcher as ImpervaLogsFetcher
    participant Executor as ThreadPoolExecutor
    participant Worker as process_file
    participant Handler as handle_file
    participant Decrypt as decrypt_file
    participant Content as handle_log_decrypted_content
    participant Intake as push_events_to_intakes

    Runner->>Fetcher: run()
    activate Fetcher
    Fetcher->>Fetcher: start_batch_timer()
    Fetcher->>Executor: create_pool(max_workers=NUM_WORKERS)
    activate Executor
    loop for each addition in additions
        Executor->>Worker: process_file(log_file)
        activate Worker
        Worker->>Handler: handle_file(log_name)
        activate Handler
        Handler->>Fetcher: get_file_response(log_name)
        Handler->>Decrypt: decrypt_file(response.content, filename)
        activate Decrypt
        Decrypt-->>Handler: decrypted_file
        deactivate Decrypt
        Handler->>Content: handle_log_decrypted_content(decrypted_file)
        activate Content
        Content->>Content: split_to_events(decrypted_file_text)
        Content->>Content: OUTCOMING_EVENTS.inc(len(events_list))
        Content->>Intake: push_events_to_intakes(events_list)
        activate Intake
        Intake-->>Content: events_ids
        deactivate Intake
        Content-->>Handler: nb_events_forwarded
        deactivate Content
        Handler-->>Worker: HandlingFileResult(successful=True, last_timestamp, nb_forwarded_events)
        deactivate Handler
        Worker-->>Executor: HandlingFileResult
        deactivate Worker
    end
    Executor-->>Fetcher: iterator_of_HandlingFileResult
    deactivate Executor

    Fetcher->>Fetcher: collect_last_timestamp(results)
    Fetcher->>Fetcher: collect_nb_forwarded_events(results)
    Fetcher->>Fetcher: total_forwarded_events = sum(nb_forwarded_events)

    Fetcher->>Fetcher: compute_batch_duration()
    Fetcher->>Fetcher: log("Fetched and forwarded total_forwarded_events events in batch_duration seconds")
    Fetcher->>Fetcher: FORWARD_EVENTS_DURATION.observe(batch_duration)
    Fetcher-->>Runner: batch_completed
    deactivate Fetcher
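The batch flow in this diagram can be sketched roughly as follows; NUM_WORKERS, the result shape, and the log message are assumptions drawn from the diagram, not the connector's actual code:

```python
import time
from concurrent.futures import ThreadPoolExecutor

NUM_WORKERS = 4  # assumed pool size for this sketch


def run_batch(log_files, process_file):
    """Process files concurrently and aggregate forwarded-event counts."""
    batch_start = time.time()
    # Initialized before any processing so the list always exists,
    # even if a worker raises and the loop is interrupted.
    nb_forwarded_events: list = []
    with ThreadPoolExecutor(max_workers=NUM_WORKERS) as executor:
        for result in executor.map(process_file, log_files):
            nb_forwarded_events.append(result.nb_forwarded_events)
    total_forwarded_events = sum(nb_forwarded_events)
    batch_duration = time.time() - batch_start
    print(
        f"Fetched and forwarded {total_forwarded_events} events "
        f"in {batch_duration:.2f} seconds"
    )
    return total_forwarded_events
```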

Class diagram for updated HandlingFileResult and log fetcher methods

classDiagram
    class HandlingFileResult {
        LogFileId log_name
        bool successful
        int last_timestamp
        int nb_forwarded_events
    }

    class ImpervaLogsFetcher {
        +handle_file(log_name: LogFileId) HandlingFileResult
        +handle_log_decrypted_content(decrypted_file: bytes) int
        +decrypt_file(file_content: bytes, filename: str) bytes
        +push_events_to_intakes(events_list: list~str~) list~str~
        +run() None
    }

    ImpervaLogsFetcher --> HandlingFileResult : returns
    ImpervaLogsFetcher --> HandlingFileResult : aggregates nb_forwarded_events
    ImpervaLogsFetcher ..> LogFileId : uses as parameter
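A minimal sketch of the handle_log_decrypted_content signature shown in the class diagram, with split_to_events and push_events_to_intakes replaced by simplified stand-ins:

```python
def split_to_events(decrypted_text: str) -> list:
    # Stand-in: the real connector parses the Imperva log format.
    return [line for line in decrypted_text.splitlines() if line]


def push_events_to_intakes(events_list: list) -> list:
    # Stand-in: the real method returns the ids of the forwarded events.
    return [f"id-{i}" for i, _ in enumerate(events_list)]


def handle_log_decrypted_content(decrypted_file: bytes) -> int:
    """Forward events and return how many were actually forwarded."""
    events_list = split_to_events(decrypted_file.decode("utf-8"))
    events_ids = push_events_to_intakes(events_list)
    # Returning len(events_ids), not len(events_list), reports the count
    # of events acknowledged by the intake rather than merely attempted.
    return len(events_ids)
```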

File-Level Changes

Change Details Files
Track and propagate the number of forwarded events from file handling through to batch logging.
  • Extend HandlingFileResult model with an optional nb_forwarded_events field used to report per-file forwarded event counts.
  • Update handle_file to return early on success with both last_timestamp and nb_forwarded_events, and to set nb_forwarded_events=0 on failure.
  • Change handle_log_decrypted_content to return the number of forwarded events based on push_events_to_intakes results.
  • In run, accumulate nb_forwarded_events from each processed file, compute the batch total, and log it alongside the batch duration while still emitting the FORWARD_EVENTS_DURATION metric.
Imperva/imperva/fetch_logs_v2.py
Fix try/except control flow to avoid unassigned variable usage and adjust tests and metadata for the new behavior.
  • Move the successful HandlingFileResult return into the try block so last_timestamp is never read if extraction/decryption fails.
  • Initialize nb_forwarded_events list before the try in run to ensure it always exists even if exceptions occur.
  • Update unit test expectations to account for the new nb_forwarded_events field in HandlingFileResult.
  • Bump integration version and document the changes in the changelog.
Imperva/imperva/fetch_logs_v2.py
Imperva/tests/test_fetch_logs_v2.py
Imperva/manifest.json
Imperva/CHANGELOG.md



@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 1 issue

Prompt for AI Agents
Please address the comments from this code review:

## Individual Comments

### Comment 1
<location path="Imperva/imperva/fetch_logs_v2.py" line_range="133" />
<code_context>
-            return HandlingFileResult(log_name=log_name, successful=False)
-
-        return HandlingFileResult(log_name=log_name, successful=True, last_timestamp=last_timestamp)
+            return HandlingFileResult(log_name=log_name, successful=False, nb_forwarded_events=0)

-    def handle_log_decrypted_content(self, decrypted_file: bytes) -> None:
</code_context>
<issue_to_address>
**question:** Clarify whether a failed file handling should report 0 forwarded events or an unknown/None value.

Using `nb_forwarded_events=0` on failure makes it impossible to distinguish “no events forwarded” from “processing failed and event count is unknown.” If metrics should only aggregate successfully processed files, consider keeping `nb_forwarded_events` as `None` on failure and excluding those from the batch sum. If the current semantics (treating failures as 0) are intentional, please confirm that this matches how downstream metrics are interpreted.
</issue_to_address>
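One illustrative way to implement the None-on-failure alternative the comment proposes (whether to adopt it, or keep failures as 0, is exactly the open question raised above):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class HandlingFileResult:
    log_name: str
    successful: bool
    # None means "count unknown" (processing failed), which is
    # distinguishable from "zero events forwarded".
    nb_forwarded_events: Optional[int] = None


def total_forwarded(results: list) -> int:
    # Aggregate only files whose count is known, so failed files do not
    # silently contribute 0 to the batch total.
    return sum(
        r.nb_forwarded_events
        for r in results
        if r.nb_forwarded_events is not None
    )
```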



Copilot AI left a comment


Pull request overview

This PR updates the Imperva connector to avoid unassigned-variable issues in try/except paths and adds batch-level logging for the number of forwarded events.

Changes:

  • Extend HandlingFileResult to include nb_forwarded_events, and return it from per-file processing.
  • Track and log total forwarded events per batch along with batch duration.
  • Bump connector version to 1.21.4 and add corresponding changelog entry.

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.

File Description
Imperva/imperva/fetch_logs_v2.py Return forwarded event counts per file, aggregate them per batch, and log totals + duration.
Imperva/tests/test_fetch_logs_v2.py Update expectations to include nb_forwarded_events in HandlingFileResult.
Imperva/manifest.json Version bump to 1.21.4.
Imperva/CHANGELOG.md Document fixes/additions for 1.21.4.

Comment on lines 123 to +126
 res = trigger.handle_file(logs[0])
-assert res == HandlingFileResult(successful=True, log_name=logs[0], last_timestamp=1759413775485)
+assert res == HandlingFileResult(
+    successful=True, log_name=logs[0], last_timestamp=1759413775485, nb_forwarded_events=0
+)

Copilot AI Feb 24, 2026


test_handle_file now asserts nb_forwarded_events=0, but the fixture file contains 3 events and handle_log_decrypted_content() returns len(push_events_to_intakes(...)). Because push_events_to_intakes is a bare MagicMock, this test will always see len(...) == 0 and won’t validate the new behavior. Set an explicit return_value for push_events_to_intakes (e.g., a list of 3 ids) and assert the corresponding forwarded count.
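The suggested test adjustment could be sketched as follows; Trigger here is a hypothetical stand-in for the fixture in the real test file, kept only to show the mocking pattern:

```python
from unittest.mock import MagicMock


# Hypothetical stand-in for the connector under test.
class Trigger:
    def __init__(self):
        self.push_events_to_intakes = MagicMock()

    def handle_log_decrypted_content(self, decrypted_file: bytes) -> int:
        events = [l for l in decrypted_file.decode().splitlines() if l]
        return len(self.push_events_to_intakes(events))


trigger = Trigger()
# A bare MagicMock's return value has len(...) == 0, which is the pitfall
# described above. Give it an explicit return value so the forwarded
# count actually reflects three acknowledged ids.
trigger.push_events_to_intakes.return_value = ["id1", "id2", "id3"]
assert trigger.handle_log_decrypted_content(b"e1\ne2\ne3") == 3
```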

Comment on lines 142 to +145
 OUTCOMING_EVENTS.labels(intake_key=self.configuration.intake_key).inc(len(events_list))

-self.push_events_to_intakes(events_list)
+events_ids = self.push_events_to_intakes(events_list)
+return len(events_ids)

Copilot AI Feb 24, 2026


OUTCOMING_EVENTS is documented as "Number of events forwarded", but it is incremented using len(events_list) (events attempted) while nb_forwarded_events is computed from the ids returned by push_events_to_intakes (events actually forwarded). To keep metrics consistent with the new forwarded-events logging, increment OUTCOMING_EVENTS based on the number of successfully forwarded events (i.e., the length of the returned ids).

