How to run SparkApplication using `spark-kubernetes-operator` multiple times


I am planning to use the spark-on-k8s-operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator) to manage my Spark applications. I have managed to install it and get the basic examples working on a local Kubernetes setup.

I created a sample `SparkApplication` with `kubectl apply -f <file>.yaml`. The application has now finished executing and the driver pod is in the `Completed` state.

If I want to rerun the job, do I need to resubmit the application with the same YAML, i.e. `kubectl apply -f <file>.yaml`?

Is there any other way to rerun it?
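For context, the manifest I am submitting looks roughly like the sketch below. It is modeled on the operator's published `spark-pi` example for the `v1beta2` API; the name, namespace, image tag, and resource values are placeholders, and the `restartPolicy` semantics should be verified against the installed operator version.

```yaml
# Hypothetical manifest modeled on the operator's spark-pi sample;
# verify field names and values against your operator version.
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi          # placeholder name
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "gcr.io/spark-operator/spark:v3.1.1"   # placeholder image tag
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar"
  sparkVersion: "3.1.1"
  # restartPolicy governs automatic restarts by the operator; with
  # type: Never, a run that reaches Completed stays Completed.
  restartPolicy:
    type: Never
  driver:
    cores: 1
    memory: "512m"
    serviceAccount: spark
  executor:
    cores: 1
    instances: 1
    memory: "512m"
```

My current workaround assumption is that re-applying an unchanged spec is a no-op, so a rerun would mean `kubectl delete -f <file>.yaml` followed by `kubectl apply -f <file>.yaml`, but I am not sure this is the intended approach.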
