I get an "Unsupported type ARRAY" error when reading a table with the DataFrameReader jdbc() function:
java.sql.SQLException: Unsupported type ARRAY
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:251)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:316)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:316)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.apply(JDBCRelation.scala:225)
at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:313)
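For reference, the read is essentially the following (a minimal sketch; the JDBC URL, user, and password are placeholders, and only the table name matches the schema shown below):

import java.util.Properties
import org.apache.spark.sql.SparkSession

object ReadCheckType {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("read-check-type")
      .master("local[*]")
      .getOrCreate()

    val props = new Properties()
    props.setProperty("user", "myuser")         // placeholder
    props.setProperty("password", "mypassword") // placeholder
    props.setProperty("driver", "org.postgresql.Driver")

    // Fails with "java.sql.SQLException: Unsupported type ARRAY" while Spark
    // resolves the schema of the my_type[] column.
    val df = spark.read.jdbc(
      "jdbc:postgresql://localhost:5432/mydb",  // placeholder URL
      "check_type",
      props)

    df.show()
  }
}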
The table looks like this:
mydb=> \d check_type;
             Table "public.check_type"
 Column |   Type    | Collation | Nullable | Default
--------+-----------+-----------+----------+---------
 id     | integer   |           |          |
 types  | my_type[] |           |          |

mydb=> \d my_type;
      Composite type "public.my_type"
 Column |  Type   | Collation | Nullable | Default
--------+---------+-----------+----------+---------
 id     | integer |           |          |
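In case it helps with reproducing, the schema can be created with plain JDBC roughly like this (connection details are placeholders; the DDL matches the \d output above):

import java.sql.DriverManager

object CreateCheckType {
  def main(args: Array[String]): Unit = {
    // Placeholder connection details for a local PostgreSQL instance.
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://localhost:5432/mydb", "myuser", "mypassword")
    try {
      val stmt = conn.createStatement()
      // Composite type with a single integer field, then a table holding an
      // array of that composite type.
      stmt.execute("CREATE TYPE my_type AS (id integer)")
      stmt.execute("CREATE TABLE check_type (id integer, types my_type[])")
    } finally {
      conn.close()
    }
  }
}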
Is there a workaround for this, or is it expected behavior?
Spark version:
spark-core = 2.4.0
spark-sql = 2.4.0
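The only thing I can think of so far is to avoid the composite array entirely by pushing a subquery down to PostgreSQL that casts the my_type[] column to text, so Spark only ever sees a string column. That loses the array structure (the value comes back as a single text literal such as {"(1)","(2)"} that would have to be parsed on the Spark side), so I would still like to know whether there is a proper way. A rough sketch, again with placeholder connection details:

import java.util.Properties
import org.apache.spark.sql.SparkSession

object ReadCheckTypeAsText {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("read-check-type-as-text")
      .master("local[*]")
      .getOrCreate()

    val props = new Properties()
    props.setProperty("user", "myuser")         // placeholder
    props.setProperty("password", "mypassword") // placeholder
    props.setProperty("driver", "org.postgresql.Driver")

    // The "table" is a derived table: the my_type[] column is cast to text on
    // the database side, so JDBC schema resolution never sees the ARRAY type.
    val df = spark.read.jdbc(
      "jdbc:postgresql://localhost:5432/mydb",  // placeholder URL
      "(SELECT id, types::text AS types FROM check_type) AS t",
      props)

    df.printSchema()
    df.show(truncate = false)
  }
}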