Can you set spark settings in Hive on Spark?


There's a setting I use for ORC in Spark, for proper schema evolution:

spark.conf.set("spark.sql.orc.impl", "native")
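For context, the same setting can also be pinned in `spark-defaults.conf` so every Spark session picks it up without calling `spark.conf.set` (the path below is the standard Spark conf directory; adjust for your deployment):

```
# $SPARK_HOME/conf/spark-defaults.conf
# Use Spark's native ORC reader/writer instead of the Hive-based one
spark.sql.orc.impl    native
```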

The other value it accepts besides native is hive, but that implementation's schema evolution support is much weaker.

So, since I am using Hive on Spark, I was wondering: can I somehow set spark.sql.orc.impl from within Hive?
