Kryo out of memory error for flink siddhi


I am using flink-siddhi and getting an out-of-memory error while processing large objects. The output stream generated by Siddhi CEP contains objects with more than 200 fields, and I have several operators after that to process these objects. [Flink version 1.7.2]

java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Input.readBytes(Input.java:307)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.read(DefaultArraySerializers.java:42)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.read(DefaultArraySerializers.java:25)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:143)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:21)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
    at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.deserialize(KryoSerializer.java:315)
    at org.apache.flink.runtime.state.DefaultOperatorStateBackend.deserializeOperatorStateValues(DefaultOperatorStateBackend.java:592)
    at org.apache.flink.runtime.state.DefaultOperatorStateBackend.restore(DefaultOperatorStateBackend.java:378)
    at org.apache.flink.runtime.state.DefaultOperatorStateBackend.restore(DefaultOperatorStateBackend.java:62)
    at org.apache.flink.streaming.api.operators.BackendRestorerProcedure.attemptCreateAndRestore(BackendRestorerProcedure.java:151)
    at org.apache.flink.streaming.api.operators.BackendRestorerProcedure.createAndRestore(BackendRestorerProcedure.java:123)
    at org.apache.flink.streaming.api.operators.StreamTaskStateInitializerImpl.operatorStateBackend(StreamTaskStateInitializerImpl.java:245)
    at org.apache.flink.streaming.api.operators.StreamTaskStateInitializerImpl.streamOperatorStateContext(StreamTaskStateInitializerImpl.java:143)
    at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:250)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:738)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:289)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
    at java.lang.Thread.run(Thread.java:748)

There is 1 answer below.

Answered by Shreyas B:

The exception usually means that the JVM heap is too small for what Flink/Siddhi has to process; in your stack trace it is thrown while Kryo deserializes operator state during restore, which is where your large objects are being read back into memory.

You can increase the JVM heap size by increasing Flink's total memory; see https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/memory/mem_setup/#configure-total-memory
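
That page describes the unified memory model of newer Flink releases, where the simplest knob is the total process memory of each component. A minimal, illustrative flink-conf.yaml sketch (example values only, and note these keys were introduced with the newer memory model, so they are not available on Flink 1.7.2):

    # Total memory of each Flink process (newer Flink releases)
    jobmanager.memory.process.size: 2g
    taskmanager.memory.process.size: 4g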

Edit your conf/flink-conf.yaml and set the heap sizes for the JobManager and the TaskManager to suitably large values in MB/GB.

jobmanager.memory.heap.size: 
taskmanager.memory.task.heap.size:
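
For example (illustrative values; tune them to the memory actually available on your JobManager and TaskManagers):

    jobmanager.memory.heap.size: 1600m
    taskmanager.memory.task.heap.size: 2048m

These two keys belong to the newer memory model; on Flink 1.7.2 the equivalent options are the legacy heap settings:

    # Flink 1.7.x legacy heap options
    jobmanager.heap.size: 2048m
    taskmanager.heap.size: 4096m

After changing conf/flink-conf.yaml, restart the cluster (or resubmit the job on YARN) so the new heap sizes take effect.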