Replies: 1 comment 3 replies
-
Event Hubs is heavily optimized for processing large numbers of events with minimal overhead. Durable Functions with the Azure Storage backend, by contrast, requires each message to be created, dequeued, and deleted individually, which incurs substantially more I/O. This is why ordinary event-processing scenarios generally favor Event Hubs over orchestrations or entities. That said, the Netherite storage provider for Durable Functions uses Event Hubs in its implementation to get the same I/O-efficiency benefits: it batches internally to avoid many small I/Os, while the Functions programming model stays the same. Have you tried testing your ~100K workload with Durable Functions and Netherite?
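For reference, switching the backend to Netherite is largely a host.json change (plus installing the Netherite backend package for your language worker). A minimal sketch; the connection names below are assumptions and must match the settings in your own function app:

```json
{
  "extensions": {
    "durableTask": {
      "storageProvider": {
        "type": "Netherite",
        "StorageConnectionName": "AzureWebJobsStorage",
        "EventHubsConnectionName": "EventHubsConnection"
      }
    }
  }
}
```

The orchestrator and activity code doesn't change at all; only the storage provider behind the Durable Task framework does.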
-
Hi Team,
There was a bit of initial conversation above, but I want to post my observations after testing. Durable Functions seems to lack the batching functionality you get with an Event Hubs trigger when batching is enabled. My scenario: parse a CSV file with ~100K+ records, process them with an Azure Function (possibly in batches/chunks), and confirm via an orchestrator state change that all rows were processed. Using Event Hubs alone this takes up to a minute; with Durable Functions it can take up to an hour. Any suggestions on batching techniques for Durable Functions?
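One technique that helps regardless of the storage backend is to batch on the application side: fan out one activity call per *chunk* of rows rather than one per row, so ~100K rows become ~100 queue messages instead of ~100K. A minimal sketch in Python; the orchestrator shape and activity names (`process_batch`, etc.) are hypothetical, not a prescribed Durable Functions API:

```python
def chunk(rows, batch_size):
    """Split a list of rows into consecutive batches of at most batch_size."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

# Inside a Durable Functions orchestrator this would drive a fan-out/fan-in,
# roughly (names hypothetical, sketched against the Python durable API):
#
# def orchestrator(context):
#     rows = yield context.call_activity("load_csv", input_blob)
#     tasks = [context.call_activity("process_batch", batch)
#              for batch in chunk(rows, 1000)]
#     yield context.task_all(tasks)   # all rows processed when this completes
```

With batches of 1,000, a 100K-row file produces only 100 activity messages, which cuts the per-message create/dequeue/delete overhead by three orders of magnitude while still letting the orchestrator confirm completion of every row.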
Thanks