SQL CONVERT statement in PySpark or Spark?


OK, so I'm trying to "translate" some SQL into PySpark. The statement I have is the following:

CONVERT(VARCHAR(6), DATEADD(MONTH, -1, DATEADD(MONTH, -1, GETDATE())),112) AS CURRENT_DATE

I've been searching and reading the documentation, but I have no clue how to do this. I'm new to both SQL and PySpark.

I tried to search for a way of doing the same thing I did in SQL.

I want my CURRENT_DATE to be in the form yyyymm. To do that, I converted the date to VARCHAR(6): style 112 corresponds to yyyymmdd, which is a VARCHAR(8), so using VARCHAR(6) cuts off the days. The DATEADD part works in PySpark; it's simply a way of getting the date I need (see the sketch below).
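A minimal sketch of that date arithmetic, assuming the pyspark.sql.functions API: the two nested DATEADD(MONTH, -1, ...) calls collapse into a single shift of -2 months with add_months.

from pyspark.sql import functions as F

# Equivalent of DATEADD(MONTH, -1, DATEADD(MONTH, -1, GETDATE())):
# shift today's date back by two months in one call.
two_months_ago = F.add_months(F.current_date(), -2)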

The problem is that I don't know how to write a date that PySpark shows as yyyy-MM-dd (its default date format, I guess) as yyyyMM.
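For the formatting step itself, a minimal sketch assuming date_format, whose Java-style pattern "yyyyMM" plays the role of CONVERT's style 112 truncated to six characters:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# One-row frame just to evaluate the expression; the alias mirrors the SQL column name.
df = spark.range(1).select(
    F.date_format(F.add_months(F.current_date(), -2), "yyyyMM").alias("CURRENT_DATE")
)
df.show()  # prints e.g. 202403, depending on the run date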
