sbt package is trying to download a package whose path does not exist
These are the contents of my build.sbt file:
name := "WordCounter"
version := "0.1"
scalaVersion := "2.13.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.4.5"
)
When I try to run sbt package, this is the output that I get:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.13:2.4.5
[error] not found: https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.13/2.4.5/spark-core_2.13-2.4.5.pom
If I navigate to that URL in my browser, I can confirm that it does not exist; it returns a 404.
What I don't understand is why sbt is looking for spark-core_2.13 when the repository only lists spark-core_2.12 as the latest artifact.
Is there something wrong with my build.sbt file? Is there a way to tell sbt to use an arbitrary path for that dependency only?
Solution 1:[1]
Your Scala version, scalaVersion := "2.13.1", is the culprit.
Because you have that Scala version and your dependency is declared with %%:
"org.apache.spark" %% "spark-core" % "2.4.5"
sbt is looking for a spark-core 2.4.5 artifact compiled against Scala 2.13, and that artifact does not exist.
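As a sketch of what %% does (this is standard sbt behavior, not anything specific to this project), it appends the Scala binary version from scalaVersion to the artifact name before resolving it:

// With scalaVersion := "2.13.1", this dependency:
"org.apache.spark" %% "spark-core" % "2.4.5"
// is resolved as if it had been written with a single % and an explicit suffix:
"org.apache.spark" % "spark-core_2.13" % "2.4.5"
// (Pinning an explicit suffix such as spark-core_2.12 with a single % would also
// resolve, but fixing scalaVersion is the usual approach.)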
See here for Scala/Spark version compatibility, and see the Maven repository here as well.
If you change it to 2.12, it will work.
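For reference, a corrected build.sbt might look like the following; the exact 2.12 patch version used here (2.12.10) is only an example, and any recent 2.12.x release should work:

name := "WordCounter"

version := "0.1"

// Spark 2.4.5 publishes artifacts for Scala 2.11 and 2.12, so pick a 2.12.x version
scalaVersion := "2.12.10"

libraryDependencies ++= Seq(
  // %% now resolves to spark-core_2.12, which exists on Maven Central
  "org.apache.spark" %% "spark-core" % "2.4.5"
)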
I think there is no Spark version that is compatible with Scala 2.13; see here.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ram Ghadiyaram |