BulkProcessor writes duplicate (double or triple) data to Elasticsearch

I am using BulkProcessor to batch-write data into Elasticsearch, but when the cluster load is high and responses are slow, the same data gets written two or three times, even though I have turned off the retry mechanism. Here is my code.

import java.util.function.BiConsumer;

import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.bulk.BackoffPolicy;
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;
import org.elasticsearch.core.TimeValue; // org.elasticsearch.common.unit.TimeValue on clients before 7.13
import org.springframework.context.annotation.Bean;

@Bean
public BulkProcessor bulkProcessor() {
    RestHighLevelClient client = restHighLevelClient();
    BiConsumer<BulkRequest, ActionListener<BulkResponse>> bulkConsumer =
            (request, bulkListener) -> client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener);

    return BulkProcessor.builder(bulkConsumer, new BulkProcessor.Listener() {
                @Override
                public void beforeBulk(long executionId, BulkRequest bulkRequest) {
                    // intentionally empty
                }

                @Override
                public void afterBulk(long executionId, BulkRequest bulkRequest, BulkResponse bulkResponse) {
                    // intentionally empty: per-item failures in bulkResponse are not inspected
                }

                @Override
                public void afterBulk(long executionId, BulkRequest bulkRequest, Throwable throwable) {
                    // intentionally empty: whole-request failures are not handled
                }
            }).setBulkActions(200)                               // flush after 200 actions ...
            .setBulkSize(new ByteSizeValue(2, ByteSizeUnit.MB))  // ... or 2 MB of data ...
            .setFlushInterval(TimeValue.timeValueSeconds(5))     // ... or every 5 seconds
            .setConcurrentRequests(4)                            // up to 4 bulk requests in flight
            .setBackoffPolicy(BackoffPolicy.noBackoff())         // retries disabled
            .build();
}

Can anyone help? I just want to ensure that the data is not duplicated; it does not matter if some of it is lost.
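
One idea I am considering (not yet verified) is to make each write idempotent on the Elasticsearch side: if every record has a stable unique key, setting it as the document _id with opType CREATE should turn a re-delivered write into a version conflict rather than a duplicate document. A minimal sketch, where recordId and "my-index" are hypothetical placeholders, not names from my real code:

import org.elasticsearch.action.DocWriteRequest;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.common.xcontent.XContentType;

public void addRecord(BulkProcessor bulkProcessor, String recordId, String json) {
    IndexRequest request = new IndexRequest("my-index")    // placeholder index name
            .id(recordId)                                  // deterministic _id derived from the record
            .opType(DocWriteRequest.OpType.CREATE)         // a re-sent duplicate fails with a 409 version conflict
            .source(json, XContentType.JSON);
    bulkProcessor.add(request);
}

With CREATE, a duplicate delivery would surface as an item-level version conflict in afterBulk, which could simply be ignored since the document already exists. Is this the right approach?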
