Question:
I'm working with a PostgreSQL database, and I need to add an additional column to a table that contains millions of records. The problem is that this operation is taking several minutes to complete, impacting the overall performance of my application. Are there ways to speed up this process and minimize downtime, especially when dealing with PostgreSQL?
Problem Description:
- PostgreSQL version: PostgreSQL 12.11
- The table contains approximately 4,000,000 records.
- The SQL query I'm using looks like this:
ALTER TABLE my_table ADD COLUMN new_column INT;
Adding a column to a large table is generally time consuming, but on PostgreSQL 11 and later (which includes your 12.11), ALTER TABLE ... ADD COLUMN is a metadata-only change and completes almost instantly regardless of table size, as long as the column has no default or a constant (non-volatile) default. The same applies to varchar/text columns. If the statement still takes minutes, the likely cause is lock contention rather than the table size: ALTER TABLE needs an ACCESS EXCLUSIVE lock and will queue behind any long-running transaction that touches the table, while also blocking all new queries against it.
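A minimal sketch of a lock-aware column add, using the table and column names from your question (the lock_timeout value is just an illustration):

```sql
-- Fail fast instead of queueing behind long transactions while our
-- ACCESS EXCLUSIVE lock request blocks everyone else; retry on timeout.
SET lock_timeout = '5s';

-- On PostgreSQL 11+ this is metadata-only even with a constant default:
-- the default is recorded in the catalog, existing rows are not rewritten.
ALTER TABLE my_table ADD COLUMN new_column INT DEFAULT 0;
```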
Note that ALTER TABLE ... ADD COLUMN does not support a CONCURRENTLY option. That keyword exists for index creation: CREATE INDEX CONCURRENTLY builds an index without blocking concurrent writes, which matters if you also need to index the new column on a busy table.
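For example (the index name here is made up):

```sql
-- Builds the index without holding a lock that blocks writes.
-- Slower than a plain CREATE INDEX and cannot run inside a transaction block.
CREATE INDEX CONCURRENTLY my_table_new_column_idx
    ON my_table (new_column);
```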
If you need to backfill the new column with real values, do it in batches rather than one huge UPDATE: add the column as nullable (instant), update a limited number of rows per transaction, then add any DEFAULT or NOT NULL constraint at the end. Small transactions keep lock times and WAL bursts short and let autovacuum keep up.
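A sketch of one backfill batch, assuming my_table has an integer primary key id and that 0 is the value you want to fill in (run it repeatedly from your application until it reports 0 rows updated):

```sql
-- Backfill up to 10,000 rows per transaction; repeat until no rows remain.
UPDATE my_table
SET new_column = 0
WHERE id IN (
    SELECT id
    FROM my_table
    WHERE new_column IS NULL
    ORDER BY id
    LIMIT 10000
);
```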
Hope this helps.