Pandas concat behaves differently now


I have two Python/Anaconda environments with Jupyter notebooks. No. 1:

  • Python 3.11.5
  • Pandas 2.1.1
  • Jupyter 6.5.4

No. 2:

  • Python 3.12
  • Pandas 2.1.1
  • Jupyter 7.0.6

Basically, I have a quite simple task: joining two DataFrames with

pd.concat([buffer, new])

buffer has shape (1979444, 10) and these dtypes:

shop                    object
kat1                    object
kat2                    object
kanal                   object
device                  object
datum           datetime64[us]
brutto                 float64
sales                  float64
menge                  float64
wareneinsatz           float64
dtype: object

new has shape (44040, 10) and these dtypes:

shop                    object
kat1                    object
kat2                    object
kanal                   object
device                  object
datum           datetime64[ns]
brutto                 float64
sales                  float64
menge                  float64
wareneinsatz           float64
dtype: object

but I changed that.
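
For reference, a minimal sketch of how the dtypes of the two frames can be compared before concatenating (toy data standing in for the real buffer and new; the column names are from my frames, everything else is made up):

import pandas as pd

# Toy stand-ins for buffer and new: same columns, but "datum"
# carries a different datetime resolution (us vs. ns).
buffer = pd.DataFrame({
    "shop": ["a", "b"],
    "datum": pd.to_datetime(["2023-01-01", "2023-01-02"]).astype("datetime64[us]"),
    "sales": [1.0, 2.0],
})
new = pd.DataFrame({
    "shop": ["c"],
    "datum": pd.to_datetime(["2023-01-03"]),  # pandas default: datetime64[ns]
    "sales": [3.0],
})

# Print only the columns whose dtypes differ between the two frames:
print(buffer.dtypes.compare(new.dtypes))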

No. 1 works fine, but with No. 2 I get this error message, although both run the same script:

ValueError: all the input array dimensions except for the concatenation axis must match exactly, but along dimension 1, the array at index 0 has size 1979444 and the array at index 1 has size 44040

It looks like the DataFrames changed dimensions (rows and columns), but I don't know why. Any idea? It's the same pandas version, so I don't think it's a new "feature".

Summary: Same script, same machine, but different environments with Python 3.12 instead of 3.11.5 and the same pandas version, so I expected the same outcome.

At first, the columns were in a different order and two columns were formatted as int instead of float, but I fixed that.

I can switch back to the older installation, but at some point I will have to update Python, and it would be nice if this worked then.

1 Answer

Found it!

I was able to narrow it down to a date column. Both look the same, but in the first df it was formatted as:

datum datetime64[ns]

and in the second as:

datum datetime64[us]

This works with Python 3.11.5 but not 3.12; I tested it outside of Jupyter. I didn't find it before because the error message does not really make sense:

ValueError: Shape of passed values is (2, 6), indices imply (4, 6)
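
For anyone hitting the same thing, here is a minimal sketch of the workaround (toy frames standing in for the real data): cast both datum columns to the same resolution before concatenating.

import pandas as pd

# Toy frames whose "datum" columns differ only in datetime resolution.
df1 = pd.DataFrame({"datum": pd.to_datetime(["2023-01-01"])})  # datetime64[ns]
df2 = pd.DataFrame({"datum": pd.to_datetime(["2023-01-02"]).astype("datetime64[us]")})

# Align the resolutions explicitly, then concat as usual:
df2["datum"] = df2["datum"].astype("datetime64[ns]")  # or: df2["datum"].dt.as_unit("ns")
combined = pd.concat([df1, df2], ignore_index=True)
print(combined.dtypes)  # datum is datetime64[ns] throughout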

Thanks for helping me think! ;)