Calling the Kubernetes Spark Operator with the Java API
There are plenty of good examples of creating Spark jobs with the Kubernetes Spark Operator by simply submitting a request such as:
kubectl apply -f spark-pi.yaml
where the spark-pi.yaml can be found here.
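For reference, a SparkApplication manifest in the style of the operator's published spark-pi example looks roughly like this (the image, Spark version, and jar path are illustrative and change between operator releases):

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "gcr.io/spark-operator/spark:v3.1.1"   # illustrative
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples.jar"
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: "512m"
    serviceAccount: spark
  executor:
    cores: 1
    instances: 1
    memory: "512m"
```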
Does anyone know the easiest way to submit a job like this with the Java K8s API?
Solution 1:[1]
I would recommend looking into the Fabric8 K8s client (the one used by Apache Spark's own Kubernetes integration) or the official Java K8s client. With these libraries you can submit K8s resources from code.
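As a minimal sketch of that idea: a SparkApplication is just a custom resource, so it can be built as a plain map mirroring spark-pi.yaml and submitted through a client. The group/version `sparkoperator.k8s.io/v1beta2` is the commonly used operator CRD, and all field values below are illustrative; the commented-out submission call shows the official client's `CustomObjectsApi`, whose exact signature varies between client versions.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Builds a SparkApplication custom resource as a plain map, mirroring the
// structure of spark-pi.yaml. All field values are illustrative.
public class SparkAppManifest {

    static Map<String, Object> sparkPi() {
        Map<String, Object> app = new LinkedHashMap<>();
        app.put("apiVersion", "sparkoperator.k8s.io/v1beta2");
        app.put("kind", "SparkApplication");
        app.put("metadata", Map.of("name", "spark-pi", "namespace", "default"));

        Map<String, Object> spec = new LinkedHashMap<>();
        spec.put("type", "Scala");
        spec.put("mode", "cluster");
        spec.put("mainClass", "org.apache.spark.examples.SparkPi");
        spec.put("mainApplicationFile",
                 "local:///opt/spark/examples/jars/spark-examples.jar");
        spec.put("sparkVersion", "3.1.1");
        spec.put("driver", Map.of("cores", 1, "memory", "512m"));
        spec.put("executor", Map.of("cores", 1, "instances", 1, "memory", "512m"));
        app.put("spec", spec);
        return app;
    }

    public static void main(String[] args) {
        Map<String, Object> app = sparkPi();
        System.out.println(app.get("kind"));

        // With the official client (io.kubernetes:client-java) the map can be
        // submitted as a namespaced custom object; the method's parameter list
        // differs across client versions, so check your version's Javadoc:
        //
        // ApiClient apiClient = Config.defaultClient();
        // CustomObjectsApi api = new CustomObjectsApi(apiClient);
        // api.createNamespacedCustomObject(
        //     "sparkoperator.k8s.io", "v1beta2", "default",
        //     "sparkapplications", app, null, null, null);
    }
}
```

Fabric8 offers the equivalent via its generic custom-resource support, where you describe the CRD (group, version, kind, plural) and load the YAML directly instead of building a map by hand.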
Solution 2:[2]
I have written an application to submit Spark jobs to Kubernetes, where all you need to pass is a config map (key-value pairs for the app).
You can find it on GitHub under the class RunSparkJobInKube(jobConfiguration: Map[String,String]).
This might help give you an idea for your requirement.
Though it is written in Scala, you can call it from Java like any normal method.
In this app I have also integrated with IAM (AWS-specific), in case you are interested in security.
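Since RunSparkJobInKube takes a Scala `Map[String,String]`, the Java side mainly needs to assemble the configuration and convert it. A hypothetical sketch, with made-up keys (the actual keys are defined in that project, not here):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SparkJobConfig {

    // Hypothetical configuration keys -- the keys RunSparkJobInKube actually
    // expects are defined in its own repository.
    static Map<String, String> buildConfig() {
        Map<String, String> conf = new LinkedHashMap<>();
        conf.put("appName", "spark-pi");
        conf.put("namespace", "default");
        conf.put("mainClass", "org.apache.spark.examples.SparkPi");
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = buildConfig();
        System.out.println(conf.size());

        // With the Scala library on the classpath, convert the Java map before
        // calling the Scala class, e.g. via
        // scala.jdk.javaapi.CollectionConverters.asScala(conf)
        // (plus a .toMap for an immutable Map), then invoke RunSparkJobInKube.
    }
}
```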
Solution 3:[3]
I have generated a Spark Operator Java client for submitting Spark jobs to Kubernetes. I am sharing the GitHub URL of the repository: client-java-spark-operator.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Aliaksandr Sasnouskikh |
| Solution 2 | Bhargav Kosaraju |
| Solution 3 | Mukhtiar Ahmed |