While creating a basic PySpark DataFrame, I am getting the error below, and I cannot tell whether it is caused by the software installation or a version mismatch.

Java 8, Spark 3.3.2, Python 3.9.13

Please let me know whether the issue is the versions or the environment, as even simple code is not working:
```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('sparkdf').getOrCreate()

data = [["java", "dbms", "python"],
        ["OOPS", "SQL", "Machine Learning"]]
columns = ["Subject 1", "Subject 2", "Subject 3"]

dataframe = spark.createDataFrame(data, columns)
dataframe.show()
```
I expect a PySpark DataFrame to be created, but instead I get this error:
```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_22308\2222183590.py in <module>
      5     StructField('lastname', StringType(), True)
      6 ])
----> 7 df = spark.createDataFrame(emptyRDD, schema)
      8 #df.printSchema()

~\AppData\Roaming\Python\Python39\site-packages\pyspark\sql\session.py in createDataFrame(self, data, schema, samplingRatio, verifySchema)
   1274             data, schema, samplingRatio, verifySchema
   1275         )
-> 1276         return self._create_dataframe(
   1277             data, schema, samplingRatio, verifySchema  # type: ignore[arg-type]
   1278         )

~\AppData\Roaming\Python\Python39\site-packages\pyspark\sql\session.py in _create_dataframe(self, data, schema, samplingRatio, verifySchema)
   1318         rdd, struct = self._createFromLocal(map(prepare, data), schema)
   1319         assert self._jvm is not None
-> 1320         jrdd = self._jvm.SerDeUtil.toJavaArray(rdd._to_java_object_rdd())
   1321         jdf = self._jsparkSession.applySchemaToPythonRDD(jrdd.rdd(), struct.json())
   1322         df = DataFrame(jdf, self)

TypeError: 'JavaPackage' object is not callable
```
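Since the question is whether the versions or the environment are at fault, here is the output of a small stdlib-only script I can run to show my setup (the helper name `spark_env_report` is just something I made up for this post; it only reads the environment variables that PySpark consults at startup, it does not diagnose anything by itself):

```python
# Sketch of an environment report for a PySpark setup question.
# 'JavaPackage' object is not callable usually means the Python-side
# pyspark package could not reach a matching Spark/Java installation,
# so these are the values worth posting alongside the traceback.
import os
import sys


def spark_env_report():
    """Collect the Python version and the env vars PySpark reads at startup."""
    return {
        "python": sys.version.split()[0],
        "JAVA_HOME": os.environ.get("JAVA_HOME"),
        "SPARK_HOME": os.environ.get("SPARK_HOME"),
        "PYSPARK_PYTHON": os.environ.get("PYSPARK_PYTHON"),
        "PYSPARK_SUBMIT_ARGS": os.environ.get("PYSPARK_SUBMIT_ARGS"),
    }


if __name__ == "__main__":
    for key, value in spark_env_report().items():
        print(f"{key}: {value}")
```

Unset variables print as `None`, which may itself be the relevant detail here.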