While debugging a Python Azure Functions app locally that processes events from an Azure Event Hub with 32 partitions, I would like to know whether it is possible to handle the events one by one. Whatever setting I use to limit the number of incoming events, the number of events that arrive always equals the number of partitions, in this case 32.
My host.json:
{
    "version": "2.0",
    "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[4.*, 5.0.0)"
    },
    "extensions": {
        "eventHubs": {
            "batchCheckpointFrequency": 1,
            "eventProcessorOptions": {
                "maxBatchSize": 1,
                "prefetchCount": 1,
                "maxConcurrentCalls": 1
            },
            "initialOffsetOptions": {
                "type": "fromEnd"
            }
        }
    }
}
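For reference, extension bundle [4.*, 5.0.0) resolves to the v5.x Event Hubs extension, whose host.json schema puts the batch size in maxEventBatchSize directly under eventHubs rather than under eventProcessorOptions (and maxConcurrentCalls is not an Event Hubs setting). A sketch of the same limits under that schema, assuming it applies here:

{
    "version": "2.0",
    "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[4.*, 5.0.0)"
    },
    "extensions": {
        "eventHubs": {
            "maxEventBatchSize": 1,
            "batchCheckpointFrequency": 1,
            "prefetchCount": 1,
            "initialOffsetOptions": {
                "type": "fromEnd"
            }
        }
    }
}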
The function app code:
import json
import logging
import time

import azure.functions as func

app = func.FunctionApp()

@app.event_hub_message_trigger(arg_name="azeventhub",
                               event_hub_name="eventhubname",
                               connection="EventHubConnectionString",
                               cardinality=func.Cardinality.ONE,
                               consumer_group="consumer_group_x")
async def event_processor(azeventhub: func.EventHubEvent):
    event = azeventhub.get_body().decode('utf-8')
    logging.info("Received from EventHub: %s", event)
    try:
        json_event = json.loads(event)
        await process_event(json_event)
    except Exception as e:
        logging.error("Error while processing event: %s", e)
    time.sleep(10)  # it does not wait after each event
In the code above, the sleep is only executed after 32 events have been processed, so I assume it sleeps 32 times 10 seconds concurrently. The same behaviour occurs when I make the code synchronous.
Is this impossible by design, or are there settings that can actually force one-by-one processing?
To process events one by one from a batch of events, set cardinality to ONE, as mentioned in this MS Doc. I used a Service Bus client to forward each event to Service Bus as a single message after the Event Hub trigger has processed it. This approach worked for me.
With this setup, the input events were processed one by one and sent to Service Bus one by one, each as a single message; function_app.py and local.settings.json are sketched below.
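A minimal sketch of function_app.py for that setup, assuming a hypothetical queue named sbqueue and app settings named EventHubConnectionString and ServiceBusConnectionString:

import logging
import os

import azure.functions as func
from azure.servicebus import ServiceBusClient, ServiceBusMessage

app = func.FunctionApp()

QUEUE_NAME = "sbqueue"  # hypothetical queue name

@app.event_hub_message_trigger(arg_name="azeventhub",
                               event_hub_name="eventhubname",
                               connection="EventHubConnectionString",
                               cardinality=func.Cardinality.ONE)
def event_processor(azeventhub: func.EventHubEvent):
    # With cardinality ONE the function is invoked once per event rather than once per batch.
    event = azeventhub.get_body().decode('utf-8')
    logging.info("Received from Event Hub: %s", event)

    # Forward each event to Service Bus as a single message.
    connection_string = os.environ["ServiceBusConnectionString"]
    with ServiceBusClient.from_connection_string(connection_string) as client:
        with client.get_queue_sender(QUEUE_NAME) as sender:
            sender.send_messages(ServiceBusMessage(event))

local.settings.json for local debugging would then hold the two connection strings (placeholder values):

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "EventHubConnectionString": "<event-hub-connection-string>",
        "ServiceBusConnectionString": "<service-bus-connection-string>"
    }
}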