Throughput testing of an event store in an event-sourced application


I would like to test the performance and throughput of the event store in an event-sourced application. The application is a simple bank account (source: https://eventsourcing.readthedocs.io/en/stable/topics/examples/bank-accounts.html) where "Account" is an aggregate that emits an "Opened" event whenever a new account aggregate is created. My objective is to plug different event stores into the application and compare their performance and throughput. I would like to produce a graph similar to this:

[Graph: event rate over number of events]

I have an RDS Postgres instance running, which I will test first. The other event store candidates I plan to test will also run on AWS. In the graph, the event rate is the aggregate creation rate: one event is emitted per newly created aggregate, and each new aggregate creation command is issued only after the previous one has been acknowledged.
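For what it's worth, here is a minimal sketch of the kind of measurement loop I have in mind, assuming the BankAccounts application and its open_account() method from the linked documentation example; the bank_accounts module name, the Postgres connection values, and the batch size are placeholders for illustration:

    # Throughput sketch: open accounts in batches and record the acknowledged
    # creation rate per batch. The bank_accounts module, the BankAccounts class,
    # its open_account() signature, and the connection values are assumptions.
    import os
    import time

    # Assumed Postgres settings as documented for the eventsourcing library;
    # verify the variable names against the library version you are using.
    os.environ["PERSISTENCE_MODULE"] = "eventsourcing.postgres"
    os.environ["POSTGRES_DBNAME"] = "eventsourcing"
    os.environ["POSTGRES_HOST"] = "my-rds-endpoint.rds.amazonaws.com"  # placeholder
    os.environ["POSTGRES_PORT"] = "5432"
    os.environ["POSTGRES_USER"] = "postgres"
    os.environ["POSTGRES_PASSWORD"] = "secret"  # placeholder

    from bank_accounts import BankAccounts  # assumed module holding the docs example


    def measure_open_rate(total_accounts: int, batch_size: int = 1000):
        """Open accounts sequentially and return (cumulative events, events/s) pairs."""
        app = BankAccounts()
        results = []
        for batch_start in range(0, total_accounts, batch_size):
            t0 = time.perf_counter()
            for i in range(batch_start, batch_start + batch_size):
                # The command returns only after the "Opened" event is persisted,
                # so this loop measures acknowledged writes.
                app.open_account(
                    full_name=f"Account Holder {i}",
                    email_address=f"holder{i}@example.com",
                )
            elapsed = time.perf_counter() - t0
            results.append((batch_start + batch_size, batch_size / elapsed))
        return results


    if __name__ == "__main__":
        for n_events, rate in measure_open_rate(10_000):
            print(f"{n_events} events: {rate:.1f} events/s")

The (cumulative events, events per second) pairs could then be plotted to get the graph above, with the persistence settings swapped out for each candidate event store.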

P.S.: In my search I have come across tools such as JMeter and Gatling, but I cannot see how to perform such tests with them.


1 Answer

Answer by Dmitri T:

What are you trying to test here: the performance of RDS, or the performance of the sample code from the eventsourcing package documentation? A well-behaved load test must simulate real-life usage of the system under test. If your "application" will have a frontend such as a web page or an API, or will be driven by an upstream system, implement that contract first and then choose a load testing tool that can simulate realistic usage of that contract.


If your "application" is just a piece of Python code and it will always be like this I don't think you will be able to use JMeter or Gatling for calling the application, take a look at Locust which is a Python-based load testing framework providing the possibility to call the functions of your "application"

See the What Is Locust Load Testing? article for more details.
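As a rough illustration of that approach, a non-HTTP Locust User can wrap the application call and report timings through Locust's request event; the bank_accounts module, the BankAccounts class, and the open_account() parameters below are assumptions carried over from the question's example, not part of Locust itself:

    # Minimal Locust sketch for load testing a plain Python function instead of an
    # HTTP endpoint. The bank_accounts module, the BankAccounts class, and the
    # open_account() parameters are assumptions carried over from the question.
    import time

    from locust import User, between, task

    from bank_accounts import BankAccounts  # assumed module holding the docs example


    class BankAccountUser(User):
        wait_time = between(0.001, 0.01)

        def on_start(self):
            # One application instance (and event store connection) per simulated user.
            self.app = BankAccounts()

        @task
        def open_account(self):
            start = time.perf_counter()
            exception = None
            try:
                self.app.open_account(
                    full_name="Load Test User",
                    email_address="load@test.example",
                )
            except Exception as exc:  # surface failures in Locust's statistics
                exception = exc
            # Report the call through Locust's request event so it shows up in the
            # usual throughput and response time reports.
            self.environment.events.request.fire(
                request_type="eventstore",
                name="open_account",
                response_time=(time.perf_counter() - start) * 1000,  # ms
                response_length=0,
                exception=exception,
            )

Running it with something like locust -f locustfile.py --headless -u 10 -r 2 would then ramp up simulated users against the event store and report events per second alongside response times.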