Conversation


@ohrite ohrite commented Nov 24, 2025

Description

This PR adds test coverage for the Littlepay sync and parse workflow.

Resolves #4383
Resolves #4384
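
As a point of reference for reviewers, here is a minimal sketch of the shape a parse-path unit test could take, assuming a helper that turns a raw Littlepay CSV export into row dictionaries. The function name parse_littlepay_csv, its signature, and the sample columns are illustrative assumptions, not the actual code added in this PR.

# Hypothetical sketch: the helper name, signature, and columns are assumptions.
import csv
import io


def parse_littlepay_csv(raw_bytes: bytes) -> list[dict]:
    """Decode a raw Littlepay CSV export and return one dict per row."""
    text = raw_bytes.decode("utf-8-sig")
    return list(csv.DictReader(io.StringIO(text)))


def test_parse_littlepay_csv_returns_one_dict_per_row():
    raw = b"participant_id,transaction_id\nmst,abc-123\nmst,def-456\n"
    rows = parse_littlepay_csv(raw)
    assert len(rows) == 2
    assert rows[0] == {"participant_id": "mst", "transaction_id": "abc-123"}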

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation

How has this been tested?

pytest
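
For the sync side, a similarly hedged sketch of a pytest-style unit test, written against a stand-in class rather than the real operator: the real code lives in plugins/operators/littlepay_s3_to_gcs_operator.py, but its class name, constructor, and hook interface are not shown here, so everything below (including the bucket and key names) is an assumption for illustration only.

# Hypothetical sketch: the operator shape, bucket, and key are assumptions;
# see plugins/operators/littlepay_s3_to_gcs_operator.py for the real code.
from unittest import mock


class LittlepayS3ToGCSOperatorStub:
    """Stand-in mirroring the assumed shape of an S3-to-GCS sync operator."""

    def __init__(self, s3_client, gcs_client, bucket: str):
        self.s3_client = s3_client
        self.gcs_client = gcs_client
        self.bucket = bucket

    def execute(self, key: str) -> None:
        # Read the object from S3 and write the same bytes to GCS.
        body = self.s3_client.get(key)
        self.gcs_client.upload(self.bucket, key, body)


def test_sync_copies_object_from_s3_to_gcs():
    s3 = mock.Mock()
    s3.get.return_value = b"raw littlepay export"
    gcs = mock.Mock()
    op = LittlepayS3ToGCSOperatorStub(s3, gcs, bucket="hypothetical-raw-bucket")
    op.execute("mst/device_transactions.csv")
    gcs.upload.assert_called_once_with(
        "hypothetical-raw-bucket", "mst/device_transactions.csv", b"raw littlepay export"
    )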

Post-merge follow-ups

  • No action required
  • Actions required (specified below)

Monitor execution on staging

@ohrite ohrite self-assigned this Nov 24, 2025

github-actions bot commented Nov 24, 2025

Terraform plan in iac/cal-itp-data-infra-staging/composer/us

No changes. Your infrastructure matches the configuration.

Terraform has compared your real infrastructure against your configuration
and found no differences, so no changes are needed.

📝 Plan generated in Plan Terraform for Warehouse and DAG changes #1170


github-actions bot commented Nov 24, 2025

Terraform plan in iac/cal-itp-data-infra/airflow/us

Plan: 6 to add, 0 to change, 0 to destroy.
Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
+   create

Terraform will perform the following actions:

  # google_storage_bucket_object.calitp-composer["dags/download_and_parse_littlepay.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "dags/download_and_parse_littlepay.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/dags/download_and_parse_littlepay.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-composer["dags/download_and_parse_ntd_xlsx.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "dags/download_and_parse_ntd_xlsx.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/dags/download_and_parse_ntd_xlsx.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-composer["plugins/hooks/ntd_xlsx_hook.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/hooks/ntd_xlsx_hook.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/hooks/ntd_xlsx_hook.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-composer["plugins/operators/littlepay_s3_to_gcs_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/littlepay_s3_to_gcs_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/littlepay_s3_to_gcs_operator.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-composer["plugins/operators/ntd_xlsx_to_gcs_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/ntd_xlsx_to_gcs_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/ntd_xlsx_to_gcs_operator.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-composer["plugins/operators/ntd_xlsx_to_jsonl_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-composer" {
+       bucket         = "calitp-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/ntd_xlsx_to_jsonl_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/ntd_xlsx_to_jsonl_operator.py"
+       storage_class  = (known after apply)
    }

Plan: 6 to add, 0 to change, 0 to destroy.

📝 Plan generated in Plan Terraform for Warehouse and DAG changes #1170


github-actions bot commented Nov 24, 2025

Terraform plan in iac/cal-itp-data-infra-staging/airflow/us

Plan: 6 to add, 2 to change, 0 to destroy.
Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
+   create
!~  update in-place

Terraform will perform the following actions:

  # google_storage_bucket_object.calitp-staging-composer["dags/download_and_parse_littlepay.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "dags/download_and_parse_littlepay.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/dags/download_and_parse_littlepay.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer["dags/download_and_parse_ntd_xlsx.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "dags/download_and_parse_ntd_xlsx.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/dags/download_and_parse_ntd_xlsx.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer["plugins/hooks/ntd_xlsx_hook.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/hooks/ntd_xlsx_hook.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/hooks/ntd_xlsx_hook.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer["plugins/operators/littlepay_s3_to_gcs_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/littlepay_s3_to_gcs_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/littlepay_s3_to_gcs_operator.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer["plugins/operators/ntd_xlsx_to_gcs_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/ntd_xlsx_to_gcs_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/ntd_xlsx_to_gcs_operator.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer["plugins/operators/ntd_xlsx_to_jsonl_operator.py"] will be created
+   resource "google_storage_bucket_object" "calitp-staging-composer" {
+       bucket         = "calitp-staging-composer"
+       content        = (sensitive value)
+       content_type   = (known after apply)
+       crc32c         = (known after apply)
+       detect_md5hash = "different hash"
+       generation     = (known after apply)
+       id             = (known after apply)
+       kms_key_name   = (known after apply)
+       md5hash        = (known after apply)
+       md5hexhash     = (known after apply)
+       media_link     = (known after apply)
+       name           = "plugins/operators/ntd_xlsx_to_jsonl_operator.py"
+       output_name    = (known after apply)
+       self_link      = (known after apply)
+       source         = "../../../../airflow/plugins/operators/ntd_xlsx_to_jsonl_operator.py"
+       storage_class  = (known after apply)
    }

  # google_storage_bucket_object.calitp-staging-composer-catalog will be updated in-place
!~  resource "google_storage_bucket_object" "calitp-staging-composer-catalog" {
!~      content             = (sensitive value)
!~      crc32c              = "KFEMgw==" -> (known after apply)
!~      detect_md5hash      = "FcddRn1vcGiT6Fc+seoihQ==" -> "different hash"
!~      generation          = 1765318154134732 -> (known after apply)
        id                  = "calitp-staging-composer-data/warehouse/target/catalog.json"
!~      md5hash             = "FcddRn1vcGiT6Fc+seoihQ==" -> (known after apply)
        name                = "data/warehouse/target/catalog.json"
#        (16 unchanged attributes hidden)
    }

  # google_storage_bucket_object.calitp-staging-composer-manifest will be updated in-place
!~  resource "google_storage_bucket_object" "calitp-staging-composer-manifest" {
!~      content             = (sensitive value)
!~      crc32c              = "4BO/Pw==" -> (known after apply)
!~      detect_md5hash      = "DVr5N9SKcJwP6GHtb+dYEg==" -> "different hash"
!~      generation          = 1765318155436013 -> (known after apply)
        id                  = "calitp-staging-composer-data/warehouse/target/manifest.json"
!~      md5hash             = "DVr5N9SKcJwP6GHtb+dYEg==" -> (known after apply)
        name                = "data/warehouse/target/manifest.json"
#        (16 unchanged attributes hidden)
    }

Plan: 6 to add, 2 to change, 0 to destroy.

📝 Plan generated in Plan Terraform for Warehouse and DAG changes #1170

@ohrite ohrite force-pushed the mov/4383-4384-littlepay-dag branch from e9dcce8 to a92e4dd on November 24, 2025, 18:20
@ohrite ohrite force-pushed the mov/4383-4384-littlepay-dag branch from a92e4dd to 9b4c8d0 on December 3, 2025, 04:28
@ohrite ohrite force-pushed the mov/4383-4384-littlepay-dag branch from 9b4c8d0 to b5355bf on December 10, 2025, 16:57

Development

Successfully merging this pull request may close these issues.

Developer sees unit test coverage for Littlepay v3 parsing
Developer sees unit test coverage for Littlepay V3 syncing

2 participants