Conversation

@devin-ai-integration
Contributor

Track timestamps in lazer_exporter to avoid duplicate messages

Summary

Previously, the lazer_exporter would periodically copy all price infos from the local state via state.get_all_price_infos() and send FeedUpdate messages for every price feed, even if the data hadn't changed. This led to many duplicate messages being sent to the relayer.

This PR adds a HashMap to track the most recent PriceInfo::timestamp for each pyth_sdk::Identifier. Before sending a FeedUpdate, we now check if the price info has a newer timestamp than what was last sent. If not, we skip sending that update.

Changes:

  • Added last_sent_timestamps HashMap to track the last sent timestamp per identifier
  • Added timestamp comparison logic to skip price infos that haven't been updated
  • Added logic to update the HashMap after each FeedUpdate is successfully added to the batch
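The filtering described above can be sketched as follows. This is a minimal illustration, not the actual lazer_exporter code: `Identifier` and `PriceInfo` are simplified stand-ins for the `pyth_sdk` types, and `filter_new_updates` is a hypothetical helper name.

```rust
use std::collections::HashMap;

// Simplified stand-ins for pyth_sdk::Identifier and the agent's PriceInfo;
// the field layout here is an assumption for illustration only.
type Identifier = [u8; 32];

#[derive(Clone, Copy)]
struct PriceInfo {
    timestamp: i64, // publish time for this price feed
    price: i64,
}

/// Returns only the price infos that are newer than what was last sent,
/// recording their timestamps so repeats are skipped on the next cycle.
fn filter_new_updates(
    last_sent_timestamps: &mut HashMap<Identifier, i64>,
    price_infos: &HashMap<Identifier, PriceInfo>,
) -> Vec<(Identifier, PriceInfo)> {
    let mut batch = Vec::new();
    for (id, info) in price_infos {
        // Skip if we already sent this timestamp (or a newer one).
        if let Some(&last) = last_sent_timestamps.get(id) {
            if info.timestamp <= last {
                continue;
            }
        }
        last_sent_timestamps.insert(*id, info.timestamp);
        batch.push((*id, *info));
    }
    batch
}

fn main() {
    let mut last_sent = HashMap::new();
    let id = [1u8; 32];
    let mut infos = HashMap::new();
    infos.insert(id, PriceInfo { timestamp: 100, price: 42 });

    // First cycle: the update is new, so it is included in the batch.
    assert_eq!(filter_new_updates(&mut last_sent, &infos).len(), 1);
    // Second cycle with unchanged state: the duplicate is filtered out.
    assert_eq!(filter_new_updates(&mut last_sent, &infos).len(), 0);
    // A newer timestamp for the same identifier is sent again.
    infos.insert(id, PriceInfo { timestamp: 101, price: 43 });
    assert_eq!(filter_new_updates(&mut last_sent, &infos).len(), 1);
    println!("ok");
}
```

In the real code the timestamp map would update only after the FeedUpdate is successfully added to the batch, as noted above, so a failed send does not suppress the next attempt.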

Review & Testing Checklist for Human

Risk level: Yellow - Logic change that affects message deduplication behavior

  • Memory growth: Verify that the last_sent_timestamps HashMap size is acceptable in production. It will grow to the number of unique identifiers ever seen during the agent's lifetime. Consider whether a bounded cache or TTL-based cleanup is needed for long-running processes.
  • Timestamp comparison correctness: Validate that using <= is correct (skip if timestamp is less than OR equal to last sent). Should we ever send updates with the same timestamp twice?
  • End-to-end testing: Run the agent with this change and verify that duplicate messages are actually being filtered. Monitor relayer logs to confirm reduced message volume when prices don't change.
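If the memory-growth concern above turns out to matter in practice, a TTL-based cleanup could look roughly like this. This is a sketch of one possible mitigation, not part of the PR; the function name, the 32-byte key type, and the choice of TTL are all assumptions.

```rust
use std::collections::HashMap;

/// Hypothetical TTL-based cleanup for the last_sent_timestamps map:
/// drop identifiers whose last sent timestamp is older than `ttl`
/// relative to `now` (both in the same time unit as the timestamps).
fn prune_stale(last_sent_timestamps: &mut HashMap<[u8; 32], i64>, now: i64, ttl: i64) {
    // Keep only identifiers whose last sent timestamp is within the TTL window.
    last_sent_timestamps.retain(|_, ts| now - *ts <= ttl);
}

fn main() {
    let mut last_sent: HashMap<[u8; 32], i64> = HashMap::new();
    last_sent.insert([1u8; 32], 100);
    last_sent.insert([2u8; 32], 900);
    // With now = 1000 and ttl = 500, the entry last sent at t = 100 is pruned.
    prune_stale(&mut last_sent, 1000, 500);
    assert_eq!(last_sent.len(), 1);
    assert!(last_sent.contains_key(&[2u8; 32]));
    println!("ok");
}
```

A pruned identifier that later reappears would simply be re-sent once, which is the same behavior as a fresh process start, so the cleanup is safe with respect to correctness and only trades a little duplicate traffic for bounded memory.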

Test Plan

  1. Deploy agent with these changes to a test environment
  2. Monitor the FeedUpdate messages being sent to the relayer
  3. Verify that when price data doesn't change between publish intervals, no duplicate messages are sent
  4. Verify that when price data DOES change, updates are sent correctly
  5. Monitor memory usage over an extended period to ensure the HashMap size is reasonable

Notes

Keep track of the most recent PriceInfo::timestamp for each
pyth_sdk::Identifier and skip objects that do not have a newer timestamp.
This prevents sending duplicate FeedUpdate messages when the local state
has not been updated.

Co-Authored-By: Mike Rolish <[email protected]>
@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

Collaborator

@ali-behjati ali-behjati left a comment


Please bump the version @devin

@devin-ai-integration
Contributor Author

Version bumped to 3.0.7 in commit 7e19cf2.
