I really don't understand what I am doing wrong, because inserting 30,000 rows takes about 20 minutes. I also tried inserting the same 30,000 rows with DataGrip and it takes about 1 second. I am now considering composing the SQL as one big string and executing that instead.
public Mono<Boolean> insertOrders(List<ClientOrder> orders) {
    if (orders.isEmpty()) {
        return Mono.just(true);
    } else {
        return databaseClient.inConnectionMany(connection -> {
            Statement statement = connection.createStatement(OrderQueries.INSERT_ORDERS);
            for (int i = 0; i < orders.size(); i++) {
                var order = orders.get(i);
                statement.bind("$1", order.getUserId())
                        .bind("$2", order.getUserName())
                        .bind("$3", order.getCustomerEmail())
                        .bind("$4", order.getCustomerMobile())
                        .bind("$5", order.getAmount())
                        .bind("$6", order.getCreateDate())
                        .bind("$7", order.getStatusChangeDate());
                // add() finishes the current binding and starts a new one;
                // it must not be called after the last set of bindings
                if (i < orders.size() - 1) {
                    statement.add();
                }
            }
            return Flux.from(statement.execute())
                    .flatMap(result -> result.map((row, rowMetadata) -> row.get("id", Long.class)));
        })
        .doOnError(error -> System.out.println(error.getMessage()))
        .collectList()
        .map(list -> list.size() == orders.size());
    }
}
I would like some advice on doing multiple inserts with Spring Boot R2DBC on Postgres.
I based my solution on a similar example for inserting multiple entities: https://github.com/hantsy/spring-r2dbc-sample/blob/master/database-client/src/main/java/com/example/demo/PostRepository.java#L90, but it is just an example.
I have encountered a similar performance issue before when migrating data from an old database to a new one.
For your case, based on that experience, I think it is better to break the long list into smaller batches.
For example, use 50 items per batch: when the insert counter reaches 50, commit the transaction to flush the data to the database, then reset the counter and continue with the next batch until all the data is inserted.
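The splitting step can be sketched in plain Java: partition the list into consecutive chunks of at most 50 elements, then bind, execute, and commit each chunk as its own statement/transaction (the actual insert call depends on your repository; in Reactor you could also get the same chunking with `Flux.fromIterable(orders).buffer(50)`). This is only a sketch of the partitioning logic, not of your full pipeline.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {

    // Split a list into consecutive sub-lists of at most batchSize elements.
    // Each returned sub-list would then be bound and committed as one batch.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> orders = new ArrayList<>();
        for (int i = 0; i < 125; i++) {
            orders.add(i);
        }
        List<List<Integer>> batches = partition(orders, 50);
        // 125 items split into batches of 50, 50, and 25
        System.out.println(batches.size());        // 3
        System.out.println(batches.get(2).size()); // 25
    }
}
```

With 30,000 rows this yields 600 batches, each small enough that a single statement's bind/execute cycle and commit stay cheap.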