I have a Node.js application that uses Sequelize as the ORM to talk to a PostgreSQL database. One of my queries retrieves data from a table that can have millions of records and has around 15-20 columns. The query fetches data based on a date range plus optional filters, and I am building it as a raw SQL string. Here is the code for the query function:
exports.getactivationData = async function (params, pagination = true) {
  try {
    let dbModels = await db.Models();

    // Base query: full rows for the date range; count(*) variant when
    // pagination is disabled (used to get the total).
    let query = `SELECT * FROM my_large_table WHERE date("createdAt") BETWEEN '${params.startDate}' AND '${params.endDate}'`;
    if (!pagination) {
      query = `SELECT count(*) FROM my_large_table WHERE date("createdAt") BETWEEN '${params.startDate}' AND '${params.endDate}'`;
    }

    // Optional filters are appended to either variant.
    if (params.filter1) {
      query += ` AND sensitive_column_id = '${params.sensitiveFilterId}'`;
    }
    if (params.filter2) {
      query += ` AND sensitive_column_value = '${params.sensitiveFilterValue}'`;
    }

    // Sorting and paging only apply to the data query.
    if (pagination && params.limit) {
      query += ` ORDER BY "${params.sortBy}" ${params.sortType} `;
      query += ` LIMIT '${params.limit}' OFFSET '${params.offset}'`;
    }

    const myLargeTableData = await db.sequelize.query(query, {
      model: dbModels.myLargetableModel
    });
    return promiseAdapter.resolve(myLargeTableData);
  } catch (error) {
    return promiseAdapter.reject(error);
  }
}
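I know the string interpolation above is unsafe; for reference, here is the parameterized version of the same function body I am moving toward, an untested sketch using Sequelize's replacements option with my existing params shape:

    const where = [`date("createdAt") BETWEEN :startDate AND :endDate`];
    const replacements = { startDate: params.startDate, endDate: params.endDate };

    if (params.filter1) {
      where.push(`sensitive_column_id = :sensitiveFilterId`);
      replacements.sensitiveFilterId = params.sensitiveFilterId;
    }
    if (params.filter2) {
      where.push(`sensitive_column_value = :sensitiveFilterValue`);
      replacements.sensitiveFilterValue = params.sensitiveFilterValue;
    }

    const select = pagination ? 'SELECT *' : 'SELECT count(*)';
    let query = `${select} FROM my_large_table WHERE ${where.join(' AND ')}`;

    if (pagination && params.limit) {
      // Identifiers (the ORDER BY column and direction) cannot be bound as
      // parameters, so they must be checked against a whitelist first.
      query += ` ORDER BY "${params.sortBy}" ${params.sortType}`;
      query += ` LIMIT :limit OFFSET :offset`;
      replacements.limit = Number(params.limit);
      replacements.offset = Number(params.offset);
    }

    const myLargeTableData = await db.sequelize.query(query, {
      replacements,
      model: dbModels.myLargetableModel
    });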
I'm concerned about this query's performance on large datasets (e.g., fetching 10 hours of data returns around 62k matching records). How can I optimize this Sequelize query to improve its performance against a large PostgreSQL table?
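One specific thing I suspect: wrapping the column as date("createdAt") prevents Postgres from using a plain btree index on "createdAt". From what I've read, an expression index on date("createdAt") is an option, but it is rejected for timestamptz columns (date() is not immutable there), so rewriting the filter as a half-open range seems safer. A sketch of what I have in mind (the index name and the endExclusive helper are my own inventions, untested):

    // One-off: a plain btree index on the raw timestamp column.
    await db.sequelize.query(
      `CREATE INDEX IF NOT EXISTS idx_my_large_table_created_at
         ON my_large_table ("createdAt")`
    );

    // The date filter rewritten as a half-open range, which the index above
    // can serve directly. endExclusive is computed in JS to keep casts out
    // of the SQL, assuming params.endDate is an ISO date string like
    // '2024-05-01'.
    const endExclusive = new Date(params.endDate);
    endExclusive.setDate(endExclusive.getDate() + 1);

    const dateFilter = `"createdAt" >= :startDate AND "createdAt" < :endExclusive`;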
Additionally, in my controller I call this function like so:

let data = await reportmodel.getactivationData(params);
// For count
let resultCount = await reportmodel.getactivationData(params, false);
Note: I want to optimize this because the call above takes almost 30-40 seconds even with a limit of 10 records (limit = 10 is my default), so it is still far too slow.
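One idea I've been considering for the double round trip (one query for the rows, another for the count) is to attach the total to the paginated query with a window function, so the separate count(*) call goes away entirely. A rough sketch, untested; total_count is my own alias, endExclusive is computed as in the earlier sketch, and the optional filters would be appended the same way as before:

    const { QueryTypes } = require('sequelize');

    // Every returned row carries the total for the whole filtered set.
    const query = `
      SELECT *, count(*) OVER () AS total_count
        FROM my_large_table
       WHERE "createdAt" >= :startDate AND "createdAt" < :endExclusive
       ORDER BY "createdAt" DESC
       LIMIT :limit OFFSET :offset`;

    const rows = await db.sequelize.query(query, {
      replacements: {
        startDate: params.startDate,
        endExclusive,
        limit: Number(params.limit),
        offset: Number(params.offset)
      },
      type: QueryTypes.SELECT
    });

    // rows[0].total_count (when rows is non-empty) replaces the separate
    // count query, at the cost of computing the window over the filtered set.

Would this be the right direction, or is there a better approach? Any guidance would be appreciated.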