Performing insert or update in batches


I'm trying to create a function that performs updates on a table, processing 100 records per update in batches.

This is what I have currently:

CREATE OR REPLACE FUNCTION food.update_food_batch()
RETURNS VOID AS $$
DECLARE
    total_rows INTEGER;
    processed_rows INTEGER := 0;
BEGIN
    -- Get the total number of rows to be updated
    SELECT COUNT(*) INTO total_rows FROM food.food1;

    -- Start a transaction
    BEGIN
        LOOP
            -- Update rows in batches of 100
            UPDATE food.food_all f
            SET
                source_ref_id = c.item_id,
                status = 'A',
                updated_at = current_timestamp,
                nf_calories = c.nf_calories,
                nf_protein = c.nf_protein,
                nf_total_carbohydrate = c.nf_total_carbohydrate,
                nf_total_fat = c.nf_total_fat,
                nf_trans_fat = c.nf_trans_fatty_acid,
                nf_saturated_fat = c.nf_saturated_fat,
                nf_dietary_fiber = c.nf_dietary_fiber,
                nf_cholesterol = c.nf_cholesterol,
                nf_sodium = c.nf_sodium,
                nf_sugars = c.nf_sugars,
                nf_added_sugars = c.nf_added_sugars,
                nf_calcium = c.nf_calcium_mg,
                nf_iron = c.nf_iron_mg,
                nf_potassium = c.nf_potassium,
                nf_vitamin_d = c.nf_vitamin_d_mcg,
                nf_metric_qty = c.metric_qty,
                nf_metric_unit = c.metric_uom,
                upc = c.upc,
                name = c.item_name,
                ingredient_statement = c.nf_ingredient_statement,
                serving_size_qty = c.nf_serving_size_qty,
                serving_size_unit = c.nf_serving_size_unit,
                serving_metric_qty = c.nf_serving_weight_grams,
                brand_name = c.brand_name
            FROM (
                SELECT
                    item_id,
                    nf_calories,
                    nf_protein,
                    nf_total_carbohydrate,
                    nf_total_fat,
                    nf_trans_fatty_acid,
                    nf_saturated_fat,
                    nf_dietary_fiber,
                    nf_cholesterol,
                    nf_sodium,
                    nf_sugars,
                    nf_added_sugars,
                    nf_calcium_mg,
                    nf_iron_mg,
                    nf_potassium,
                    nf_vitamin_d_mcg,
                    metric_qty,
                    metric_uom,
                    upc,
                    item_name,
                    nf_ingredient_statement,
                    nf_serving_size_qty,
                    nf_serving_size_unit,
                    nf_serving_weight_grams,
                    brand_name
                FROM food.food1
                OFFSET processed_rows
                LIMIT 100
            ) AS c
            WHERE f.source_ref_id = c.item_id;

            -- Increment the number of processed rows
            processed_rows := processed_rows + 100;

            -- Exit loop when all rows are processed
            IF processed_rows >= total_rows THEN
                EXIT;
            END IF;

            -- Commit the transaction after every batch
            COMMIT;
        END LOOP;

        -- Log the total number of rows updated
        RAISE NOTICE 'Total rows updated: %', total_rows;
    END;
END;
$$ LANGUAGE PLPGSQL;

SELECT food.update_food_batch();

There seems to be an error with the way I am using COMMIT, as I receive this error:

SQL Error [2D000]: ERROR: invalid transaction termination
Where: PL/pgSQL function food.update_food_batch() line 85 at COMMIT

Anyone know a workaround for this? I would like to commit these updates in batches since I am updating a large number of records, and an application that consumes this data fails otherwise because of its limited memory.
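For reference, the `2D000 invalid transaction termination` error occurs because a PL/pgSQL *function* always executes inside the caller's transaction and therefore cannot `COMMIT`. A *procedure* invoked at the top level with `CALL` can. A sketch of what the procedure version might look like (untested against this schema; the column list is abbreviated, and an `ORDER BY` is added because `OFFSET` paging without one can skip or repeat rows between batches):

```sql
CREATE OR REPLACE PROCEDURE food.update_food_batch()
LANGUAGE plpgsql
AS $$
DECLARE
    batch_size CONSTANT INTEGER := 100;
    total_rows INTEGER;
    processed_rows INTEGER := 0;
BEGIN
    -- Get the total number of rows to be updated
    SELECT COUNT(*) INTO total_rows FROM food.food1;

    WHILE processed_rows < total_rows LOOP
        UPDATE food.food_all f
        SET
            source_ref_id = c.item_id,
            status = 'A',
            updated_at = current_timestamp
            -- ... remaining column assignments exactly as in the function ...
        FROM (
            SELECT item_id  -- plus the other source columns used in SET
            FROM food.food1
            ORDER BY item_id       -- deterministic paging; without ORDER BY,
            OFFSET processed_rows  -- OFFSET/LIMIT batches are not stable
            LIMIT batch_size
        ) AS c
        WHERE f.source_ref_id = c.item_id;

        processed_rows := processed_rows + batch_size;

        -- Legal here: a procedure called via top-level CALL may commit,
        -- provided the COMMIT is not inside a BEGIN...EXCEPTION block.
        COMMIT;
    END LOOP;

    RAISE NOTICE 'Total rows updated: %', total_rows;
END;
$$;

-- Procedures are invoked with CALL, not SELECT:
CALL food.update_food_batch();
```

Note that the inner `BEGIN ... END` block from the original is dropped: `COMMIT` is only permitted when there is no surrounding block with an `EXCEPTION` clause and no outer transaction (e.g. `CALL` issued inside an explicit `BEGIN; ... COMMIT;` from the client would also fail).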
