Cannot connect to Cassandra in spark-shell
I am trying to connect to a remote Cassandra cluster from my spark-shell using the Spark Cassandra Connector, but it is throwing an unusual error.
I follow the usual steps described on the GitHub page of the Spark Cassandra Connector.
Run the shell:
$SPARK_HOME/bin/spark-shell --packages datastax:spark-cassandra-connector:2.0.0-s_2.11
Import the connector:
import com.datastax.spark.connector._
import org.apache.spark.sql.cassandra._
Everything works fine up to this point, but when I try to create an RDD
val rdd = sc.cassandraTable("test", "user")
it throws this exception:
java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.datastax.spark.connector.cql.CassandraConnectorConf$.apply(CassandraConnectorConf.scala:257)
at com.datastax.spark.connector.cql.CassandraConnector$.apply(CassandraConnector.scala:189)
at com.datastax.spark.connector.SparkContextFunctions.cassandraTable$default$3(SparkContextFunctions.scala:52)
... 53 elided
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.ConfigurationException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Spark version: 2.3.1, connector version: 2.0.0
Solution 1:[1]
For Spark 2.3.1 you must use Spark Cassandra Connector version >= 2.3.0 (the current version is 2.3.2):
spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.2 \
--conf spark.cassandra.connection.host=<IP>
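After relaunching the shell with the matching connector version, reading the same table should work. A minimal sketch of a verification session, assuming the keyspace test and table user from the question exist and the cluster is reachable at the host passed via spark.cassandra.connection.host:
import com.datastax.spark.connector._
import org.apache.spark.sql.cassandra._

// RDD API: the same call that failed before should now load the class correctly
val rdd = sc.cassandraTable("test", "user")
rdd.count()

// DataFrame API as an alternative read path via the connector's data source
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "user"))
  .load()
df.show(5)
With the connector version matched to the Spark minor version, both reads should return rows instead of the NoClassDefFoundError.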
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow