Databricks SparkR can't create Spark DataFrame from large R data.frame

I have a local R data.frame with 16 million records in Databricks. Every time I try to load it into a Spark DataFrame, I receive an error after roughly 5 minutes.

library(SparkR)

sparkR.session()
spark_data_frame <- createDataFrame(local_r_data_frame, numPartitions = 500)

The error:

 RserveException: eval failed

I have tried with up to 10,000 partitions, and the same error occurs after the same amount of processing time.
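
As a possible workaround, I am considering staging the data on DBFS and letting Spark read it back, instead of transferring it through Rserve. Below is a minimal sketch of that idea (the dbfs:/tmp path is just an assumption about a writable location in my workspace):

# Hypothetical workaround: stage the data.frame on DBFS, then read it with Spark.
# /dbfs/... is the FUSE mount of dbfs:/ available on the Databricks driver.
library(SparkR)
sparkR.session()

# Write the local data.frame to a CSV file on DBFS (assumed writable path)
write.csv(local_r_data_frame, "/dbfs/tmp/local_r_data_frame.csv", row.names = FALSE)

# Read the staged file back as a Spark DataFrame, bypassing the Rserve transfer
spark_data_frame <- read.df(
  "dbfs:/tmp/local_r_data_frame.csv",
  source = "csv",
  header = "true",
  inferSchema = "true"
)

I would still prefer to understand why createDataFrame fails on the direct transfer.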
