Error while trying to create external table in Hive
I am trying to create an external table in Hive on Hadoop, but it fails. This is the error I get when I run my query:
02:23:29.516 [HiveServer2-Background-Pool: Thread-39] ERROR hive.ql.exec.DDLTask - org.apache.hadoop.hive.ql.metadata.HiveException: Cannot validate serde: org.openx.data.jsonserde.JsonSerDe
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3858)
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:700)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3960)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:333)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1077)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:235)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:90)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:299)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:312)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: Class org.openx.data.jsonserde.JsonSerDe not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2329)
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3852)
... 22 more
How can I solve it?
Solution 1:[1]
The exception says
java.lang.ClassNotFoundException: Class org.openx.data.jsonserde.JsonSerDe not found
Install the JSON SerDe: download the JARs from http://www.congiu.net/hive-json-serde/ and put them into hive/lib; read the instructions here: Hive-JSON-Serde.
Alternatively, instead of copying the JARs into hive/lib, you can add them in the Hive session:
ADD JAR /usr/lib/hive/lib/json-serde-1.3.8-jar-with-dependencies.jar;
ADD JAR /usr/lib/hive/lib/json-udf-1.3.8-jar-with-dependencies.jar;
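Once the SerDe JAR is on the classpath, reference that class in the ROW FORMAT SERDE clause of the DDL. A minimal sketch, assuming a hypothetical table name, columns, and HDFS location (adjust them to your data):
-- Hypothetical example table for JSON data read with the OpenX SerDe
CREATE EXTERNAL TABLE my_json_table (
  id INT,
  name STRING
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse/my_json_table';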
Alternatively, you can try the native Hive JSON SerDe, org.apache.hive.hcatalog.data.JsonSerDe: just change the class name in the SERDE clause of the table DDL. It should already be installed. Read more about the differences here: https://docs.aws.amazon.com/athena/latest/ug/json-serde.html
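The same sketch with the native SerDe, again using a hypothetical table name, columns, and location; depending on your distribution you may also need to ADD JAR the hive-hcatalog-core JAR so Hive can find this class:
-- Hypothetical example table using Hive's built-in HCatalog JSON SerDe
CREATE EXTERNAL TABLE my_json_table (
  id INT,
  name STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse/my_json_table';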
Solution 2:[2]
I had the same issue when using the hive command from the command line, but it works normally when I use Beeline with a HiveServer2 connection.
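For example, assuming HiveServer2 is running locally on its default port 10000 (adjust the URL and credentials to your setup), you can connect with:
beeline -u jdbc:hive2://localhost:10000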
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 | leftjoin
Solution 2 | Ixtiyor Majidov