An error occurred while saving the DataFrame: An error occurred while calling o430.save. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 16.0 failed 4 times, most recent failure: Lost task 0.3 in stage 16.0 (TID 71) (10.130.9.4 executor driver): org.apache.spark.SparkException: [TASK_WRITE_FAILED] Task failed while writing rows to abfss://[email protected]/EP/delta/sales_history.
I am getting the above error while writing a DataFrame in Delta format. It started happening suddenly. How can I fix it?
This can have many causes. If the job used to run fine and suddenly started throwing this exception, a likely culprit is an out-of-memory (OOM) condition, often on an executor rather than the driver, since [TASK_WRITE_FAILED] is a task-level failure. To investigate further, look at the full stack trace (the "Caused by:" section usually names the real error), inspect the DAG in the Spark UI, and check the executor logs for the failed task.
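If the root cause does turn out to be memory pressure during the write, a common first mitigation is to reduce how much data each task writes by repartitioning before saving. Below is a minimal sketch, not a definitive fix: the target path is taken from the question, but the source table name `sales_history_staging` and the partition count of 64 are hypothetical placeholders you would replace with your own values.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Target path copied from the error message in the question.
target = "abfss://[email protected]/EP/delta/sales_history"

# Hypothetical source table; substitute however you build your DataFrame.
df = spark.table("sales_history_staging")

(
    df.repartition(64)           # more, smaller tasks -> lower per-task memory pressure
      .write
      .format("delta")
      .mode("overwrite")         # use the mode your job actually needs
      .save(target)
)
```

If smaller tasks alone do not help, the next things to check are data skew (one huge partition failing repeatedly, which matches "failed 4 times" in the error) and whether the cluster's executor memory was recently reduced.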