We are using psycopg2-binary==2.8.6.
I am trying to insert a large amount of data into my table (staging_schema.table_name). The foreign key and all the necessary parent rows do exist, yet the insert throws this error:
Error: insert or update on table "table_name" violates foreign key constraint "test_table_id_fkey"
Key (api_id)=(487717270) is not present in table "test_table"
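One way to see which incoming rows actually trip the constraint is to diff the DataFrame's key column against the parent table's keys before running the COPY. This is only a minimal sketch: `missing_parent_keys` is a hypothetical helper, the names `test_table` and `api_id` are taken from the error message above, and `parent_ids` stands in for the result of a plain `SELECT api_id FROM test_table` fetched over the open connection.

```python
import pandas as pd

def missing_parent_keys(df: pd.DataFrame, parent_ids, key: str = "api_id"):
    """Return the key values in df that are absent from parent_ids.

    parent_ids stands in for the result of
    SELECT api_id FROM test_table, fetched over the open connection.
    """
    return sorted(set(df[key]) - set(parent_ids))

# Tiny demonstration with made-up data:
df = pd.DataFrame({"api_id": [1, 2, 487717270]})
print(missing_parent_keys(df, parent_ids={1, 2}))  # [487717270]
```

Logging (or filtering out) whatever this returns before the COPY narrows the problem down to specific rows instead of one opaque error per batch.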
import logging
from io import StringIO

import psycopg2

buffer = StringIO()
api_trx_stg_df.to_csv(buffer, index=False, header=False)
buffer.seek(0)
cursor = utils_dict['psycopg2_connection'].cursor()
try:
    cursor.copy_from(buffer, 'staging_schema.table_name', sep=",",
                     columns=api_trx_stg_df.columns)
    utils_dict['psycopg2_connection'].commit()
except (Exception, psycopg2.DatabaseError) as error:
    logging.info("Error: %s" % error)
    utils_dict['psycopg2_connection'].rollback()
    cursor.close()
    return 1
logging.info("copy_from_stringio() is done")
cursor.close()
When I try the same with to_sql() it works as expected, but since we are dealing with huge data I moved to copy_from. How can we fix this?
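One thing worth noting about the COPY path itself: `copy_from` passes the table name through unquoted in 2.8.6 but quotes it in psycopg2 2.9+, which breaks schema-qualified names like `staging_schema.table_name`; `cursor.copy_expert()` with an explicit COPY statement avoids that. This does not bypass the foreign-key check, it only makes the COPY call more robust across versions. A minimal sketch, where `build_copy_sql` is a hypothetical helper (for untrusted names prefer `psycopg2.sql.Identifier` over plain string formatting):

```python
def build_copy_sql(table: str, columns) -> str:
    # Hypothetical helper: builds the statement handed to cursor.copy_expert().
    # Plain string formatting is acceptable for trusted, hard-coded names only.
    cols = ", ".join(columns)
    return f"COPY {table} ({cols}) FROM STDIN WITH (FORMAT csv)"

print(build_copy_sql("staging_schema.table_name", ["api_id", "amount"]))
# COPY staging_schema.table_name (api_id, amount) FROM STDIN WITH (FORMAT csv)

# In the question's code, the copy_from call would become (not run here,
# as it needs a live connection):
# cursor.copy_expert(
#     build_copy_sql("staging_schema.table_name", list(api_trx_stg_df.columns)),
#     buffer,
# )
```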