I can't generate an ANSI-encoded file in PySpark


I'm trying to generate a .txt file with ANSI encoding, but when the file is uploaded to the AWS S3 bucket it ends up UTF-8 encoded. I'm running this from a Jupyter notebook on an EMR cluster. When I run the same code locally, the .txt file comes out perfectly ANSI-encoded. Generating the file locally and then uploading it to the cloud isn't an option, since this will run through a DAG in Airflow. I've tried both the 'cp1252' and 'latin-1' encodings; they work fine locally but not in the cloud. Has anyone seen something similar?
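For context, a minimal sketch of the kind of approach I mean: encoding the text as cp1252 on the driver and uploading the raw bytes with boto3, instead of relying on Spark's writer. The DataFrame `df`, its `line` column, the bucket name, and the key are placeholders, not my real names.

```python
import boto3

# Collect the text lines to the driver and join them into one file body.
# (df, the "line" column, bucket, and key are placeholders.)
rows = df.select("line").rdd.map(lambda r: r[0]).collect()
text = "\r\n".join(rows)

# Encode explicitly as cp1252 ("ANSI") and upload the raw bytes,
# so nothing along the way re-encodes the content to UTF-8.
body = text.encode("cp1252", errors="replace")

s3 = boto3.client("s3")
s3.put_object(Bucket="my-bucket", Key="exports/output.txt", Body=body)
```

Since S3 just stores the bytes it receives, an upload done this way should keep the cp1252 encoding; the question is why the Spark/EMR path ends up producing UTF-8 instead.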

