Cannot display Hive databases in Spark


I launch spark-shell with this command and the shell starts fine:

spark-shell \
  --master spark://nodemaster:7077 \
  --files /home/hadoop/spark/conf/hive-site.xml \
  --jars $HIVE_HOME/lib/hive-metastore-2.3.4.jar,$HIVE_HOME/lib/hive-exec-2.3.4.jar,$HIVE_HOME/lib/hive-common-2.3.4.jar,$HIVE_HOME/lib/hde-2.3.4.jar,$HIVE_HOME/lib/guava-14.0.1.jar,$SPARK_HOME/jars/spark-hive_2.11-2.4.0.jar,$SPARK_HOME/jars/spark-core_2.11-2.4.0.jar,$SPARK_HOME/jars/spark-sql_2.11-2.4.0.jar \
  --conf spark.sql.hive.metastore.version=2.3 \
  --conf spark.sql.hive.metastore.jars=$HIVE_HOME"/lib/*" \
  --conf spark.sql.warehouse.dir=hdfs://172.20.1.1:9000/user/hive/metastore \
  --conf spark.sql.catalogImplementation=hive
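For reference, the same Hive settings can also be supplied programmatically when building the session. A minimal sketch, assuming hive-site.xml is on the classpath and the metastore is reachable; the warehouse path and metastore version are taken from the command above, the app name is an invented placeholder:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: the programmatic equivalent of the spark-shell --conf flags above.
val spark = SparkSession.builder()
  .appName("hive-databases")  // placeholder name
  .config("spark.sql.hive.metastore.version", "2.3")
  .config("spark.sql.warehouse.dir", "hdfs://172.20.1.1:9000/user/hive/metastore")
  .enableHiveSupport()  // same effect as spark.sql.catalogImplementation=hive
  .getOrCreate()

spark.sql("SHOW DATABASES").show()
```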

but when I run this in the shell:

val databases = spark.sql("SHOW DATABASES")

I get this error:

java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
  at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:41)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:56)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
  at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
  at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
  at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
  at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
  at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
  ... 49 elided