How do I use JARs stored in Artifactory in spark-submit?
I am trying to configure spark-submit to use JARs that are stored in Artifactory. I've tried a few ways to do this.
Attempt 1: Changing the `--jars` parameter to point to the HTTPS endpoint.
Result 1: 401 error. Credentials are being passed like so: https://username:password@jfrog-endpoint. The link was tested using wget, which authenticates and downloads the JAR fine.
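For concreteness, attempt 1 looks roughly like the following sketch. The class name, repository path, and JAR names are placeholders, not details from the question; only the credentials-in-URL form and the jfrog-endpoint host come from the description above.

```shell
# Hypothetical attempt-1 invocation: fetch a dependency JAR over HTTPS
# with credentials embedded in the URL (placeholder paths throughout).
# wget honors user:password@host, but Spark's HTTP fetcher may not send
# them as Basic auth, which would be consistent with the 401 seen here.
spark-submit \
  --class com.example.Main \
  --master yarn \
  --jars "https://username:password@jfrog-endpoint/artifactory/libs-release/com/example/dep/1.0/dep-1.0.jar" \
  app.jar
```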
Attempt 2: Using a combination of `--packages` and `--repositories`.
Result 2: The URL doesn't resolve to the right location of the JAR.
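Attempt 2 would be invoked along these lines; the Maven coordinates and repository path are assumed placeholders, since the question doesn't give them:

```shell
# Hypothetical attempt-2 invocation: let Ivy resolve the dependency by
# Maven coordinates from an Artifactory repo instead of a direct URL.
# With a default Maven layout, Ivy builds the artifact path itself, so a
# wrong repo root or a non-Maven repo layout makes the URL resolve to
# the wrong location, as described above.
spark-submit \
  --packages com.example:dep:1.0 \
  --repositories "https://username:password@jfrog-endpoint/artifactory/libs-release" \
  app.jar
```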
Attempt 3: Using `--packages` with a modified ivysettings.xml (containing the repository and artifact pattern).
Result 3: The URL resolves correctly but still results in "Not Found". After some research, it looks like even when the error says "Not Found" and Ivy appears to have "tried" the repo, it could still very well be a 401 error.
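For reference, an ivysettings.xml of the kind attempt 3 describes, with a repository URL, an artifact pattern, and credentials, can be sketched as below. The host, realm, pattern, and coordinates are assumptions for illustration; the file is handed to Spark via `spark.jars.ivySettings`.

```shell
# Write a minimal ivysettings.xml with a url resolver, an explicit
# artifact pattern, and Ivy credentials for the Artifactory host.
# All URLs, the realm name, and the pattern are placeholders.
cat > ivysettings.xml <<'EOF'
<ivysettings>
  <settings defaultResolver="artifactory"/>
  <!-- Basic-auth credentials; realm must match the server's challenge -->
  <credentials host="jfrog-endpoint" realm="Artifactory Realm"
               username="username" passwd="password"/>
  <resolvers>
    <url name="artifactory" m2compatible="true">
      <artifact pattern="https://jfrog-endpoint/artifactory/libs-release/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
    </url>
  </resolvers>
</ivysettings>
EOF

# Then point spark-submit at it:
#   spark-submit --packages com.example:dep:1.0 \
#     --conf spark.jars.ivySettings=$(pwd)/ivysettings.xml app.jar
```

If the realm in `<credentials>` doesn't exactly match the server's authentication realm, Ivy silently skips the credentials, which would surface as the misleading "Not Found" described above.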
Any ideas would be helpful! Links I've explored:
- Can i do spark-submit application jar directly from maven/jfrog artifactory
- spark resolve external packages behind corporate artifactory
- How to pass jar file (from Artifactory) in dcos spark run?
- https://godatadriven.com/blog/spark-packages-from-a-password-protected-repository/
- https://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow