I am getting the following error when running rdd.first(), after running rdd = sc.parallelize([1, 2, 3]) in the command prompt:
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (LAPTOP-6IIGK4P5 executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
Screenshot of the error for reference: https://i.stack.imgur.com/A4hOa.png
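For context, this "Python worker exited unexpectedly (crashed)" failure on Windows is frequently attributed to the driver and the worker processes resolving to different Python interpreters. A minimal sketch of pinning both to the same interpreter before the SparkContext is created, using only the standard PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables (this is a common workaround, not something verified against the setup in this question):

```python
import os
import sys

# Point both the driver and the launched worker processes at the
# interpreter currently running this script, so they cannot diverge.
# These environment variables must be set BEFORE SparkContext is created.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

# After this, the original commands can be retried:
#   rdd = sc.parallelize([1, 2, 3])
#   rdd.first()
print("Workers pinned to:", os.environ["PYSPARK_PYTHON"])
```

If the two variables were previously unset (or pointed at different installations), a mismatch between the driver's Python and the worker's Python is a plausible cause of the crash shown above.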