Has anyone ever set up Apache Phoenix 5.x with PySpark 3.x? I can't get PySpark configured to write data from Spark to a Phoenix table. This is the write call:
flattened_df.write \
    .format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", phoenix_table) \
    .option("zkUrl", zookeeper_url) \
    .save()
It fails with:

Py4JJavaError: An error occurred while calling o227.save.
: java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps(java.lang.Object[])'

From what I've read, this kind of NoSuchMethodError usually points to a Scala binary mismatch, e.g. a jar compiled against Scala 2.11 (Spark 2.x) being loaded on Spark 3.x, which runs Scala 2.12.
I tried adding the phoenix-spark jar and the phoenix client jar, but could not get it to work either way.
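For reference, this is roughly how I've been attaching the jars when building the session (the jar paths below are placeholders from my environment, so the exact names and versions may differ):

from pyspark.sql import SparkSession

# Jar paths are placeholders from my install; actual names/versions may differ.
spark = SparkSession.builder \
    .appName("phoenix-write") \
    .config("spark.jars",
            "/opt/phoenix/phoenix5-spark-shaded.jar,"
            "/opt/phoenix/phoenix-client-embedded.jar") \
    .getOrCreate()

Is there a known-working combination of connector jars (or a different format string) for Phoenix 5.x with Spark 3.x?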