spark-shell taking most of the resources in the cluster


I am new to Spark. When I schedule jobs, they run fine for a while, but after some time they either slow down or get stuck. When I open the YARN UI to check resource allocation, I see that spark-shell is consuming around 76% of the cluster's resources. Is this usual, or is something wrong with my cluster configuration? Any suggestions, or am I doing something wrong? I am submitting the jobs in cluster mode, and I am using Spark version 3.2.1.
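For context, an interactive spark-shell session keeps its executors for as long as the shell stays open, so a forgotten session can starve scheduled jobs. A hedged sketch of two common mitigations (the resource values below are illustrative assumptions, not recommendations sized for any particular cluster):

```shell
# Option 1: cap the resources an interactive spark-shell session may hold.
# --num-executors / --executor-memory / --executor-cores bound the session's
# static footprint on YARN; the example values are placeholders.
spark-shell \
  --master yarn \
  --num-executors 2 \
  --executor-memory 2g \
  --executor-cores 2

# Option 2: enable dynamic allocation so YARN reclaims idle executors.
# With an idle timeout, an open but inactive shell releases its executors
# instead of pinning them indefinitely.
spark-shell \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=0 \
  --conf spark.dynamicAllocation.maxExecutors=4 \
  --conf spark.dynamicAllocation.executorIdleTimeout=60s \
  --conf spark.shuffle.service.enabled=true
```

Note that dynamic allocation with `spark.shuffle.service.enabled=true` requires the external shuffle service to be running on the YARN NodeManagers; on Spark 3.2 you can alternatively set `spark.dynamicAllocation.shuffleTracking.enabled=true` to avoid that dependency.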

