Quotes not displayed in CSV output file


The initial data is in a Dataset&lt;Row&gt;, and I am trying to write it to a CSV file with each cell value placed in quotes.

result.coalesce(1).write()
            .option("delimiter", "|")
            .option("header", "true")
            .option("nullValue", "")
            .option("quoteMode", "ALL_NON_NULL")
            .csv(Location);

Expected output:

null
"London"|"UK"
"Delhi"|"India"
"Moscow"|"Russia"

Current output:

null
London|UK
Delhi|India
Moscow|Russia

The Spark version is 2.3.


There are 2 answers below.

Oli (BEST ANSWER)

"quoteMode" is an option of databrick's CSV writer. Here you are using spark's built in CSV writer which does not support that option. Have a look at this page for the available options.

In your case, the option you are looking for is .option("quoteAll", true).
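
For example, applied to the writer call from the question (a minimal Scala sketch; result and location are assumed names for the question's Dataset&lt;Row&gt; and output path):

// Sketch with assumed names: `result` is the Dataset[Row] from the question,
// `location` is the output directory. quoteAll wraps every non-null value in quotes.
result.coalesce(1)
  .write
  .option("delimiter", "|")
  .option("header", "true")
  .option("nullValue", "")
  .option("quoteAll", "true")
  .csv(location)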

Remis Haroon - رامز

As @Oli answered, the first option you have is "quoteAll" in the CSV writer.

If you need more control, you can apply the concat function to all of your columns to prefix and suffix each value with a quote. Example below:

import org.apache.spark.sql.functions.{col, concat, lit}
import spark.implicits._ // needed for .toDF on a Seq; `spark` is the active SparkSession

val df = Seq(
  ("1", "a", null, "c"),
  ("3", null, "d", "c"),
  ("4", "a", "b", null)
).toDF("id", "A", "B", "C")

df.show()

+---+----+----+----+
| id|   A|   B|   C|
+---+----+----+----+
|  1|   a|null|   c|
|  3|null|   d|   c|
|  4|   a|   b|null|
+---+----+----+----+

// Wrap every column in literal quotes. Note that concat returns null when any
// input is null, which is why null cells stay null (and unquoted) below.
val dfquotes = df.select(df.columns.map(c => concat(lit("\""), col(c), lit("\"")).alias(c)): _*)

dfquotes.show()

+---+----+----+----+
| id|   A|   B|   C|
+---+----+----+----+
|"1"| "a"|null| "c"|
|"3"|null| "d"| "c"|
|"4"| "a"| "b"|null|
+---+----+----+----+
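
If you then write dfquotes out with Spark's built-in CSV writer, it would normally escape or re-quote the literal quotes added by concat, so you will probably want to turn its own quoting off. A rough sketch (the output path is only an example; setting the quote option to an empty string makes the writer fall back to the \u0000 character, effectively disabling quoting on write):

// Sketch: write the manually quoted frame without the writer's own quoting,
// so the literal quotes produced by concat are emitted as-is.
dfquotes.coalesce(1)
  .write
  .option("delimiter", "|")
  .option("header", "true")
  .option("quote", "")
  .csv("/tmp/quoted_output") // example output path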