High memory utilization when fetching data from a stored procedure into a DataTable


I want to process millions of rows, perform some processing on them, and export the records to an Excel file. When I fetch the records and fill them into a DataTable, it uses a very large amount of memory - around 8 to 10 GB. My SQL query result contains around 300 columns.

I have tried fetching in batches and I have also tried SqlDataReader. With or without batching the memory usage is almost the same, and using SqlDataReader has no impact on memory either. Below is my current code. Is there any way to stop this high utilization?

Fetch records in batches of 10000 rows:

DataTable dt = new DataTable();

for (int i = 0; i <= 0; i++)
{
    int startRow = 1;
    int endRow = 10000;

    for (int j = 0; (startRow == 1 || dt.Rows.Count >= 10000); j++)
    {
        dt = null;
        dt = _dataService.SqlExecuteDT(data);
        startRow = startRow + 10000;
    }
}

Code for calling the stored procedure and filling the DataTable:

public static DataTable SqlExecuteDT(string spname, Dictionary<string, object> parameters, DBName dBName = DBName.DBCONN)
{
    using (SqlConnection conn = new SqlConnection(DBConnection.GetConnectionString(dBName)))
    {
        using (SqlCommand cmd = new SqlCommand(spname, conn))
        {
            cmd.CommandTimeout = Timeout;
            cmd.CommandType = System.Data.CommandType.StoredProcedure;

            if (parameters != null)
            {
                foreach (KeyValuePair<string, object> kvp in parameters)
                    cmd.Parameters.AddWithValue(kvp.Key, kvp.Value ?? DBNull.Value);
            }

            conn.Open();

            // Use a forward-only, read-only data reader for memory efficiency
            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                // Retrieve column schema
                DataTable schemaTable = reader.GetSchemaTable();

                // Create DataTable to hold the data
                DataTable dataTable = new DataTable();

                // Add columns to DataTable based on schema
                foreach (DataRow row in schemaTable.Rows)
                {
                    string columnName = row["ColumnName"].ToString();
                    Type dataType = (Type)row["DataType"];
                    dataTable.Columns.Add(columnName, dataType);
                }

                // Populate DataTable with data
                while (reader.Read())
                {
                    DataRow dataRow = dataTable.NewRow();

                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        dataRow[i] = reader[i];
                    }

                    dataTable.Rows.Add(dataRow);
                }

                return dataTable;
            }
        }
    }
}
1 Answer

Answer by Charlieface:

Your SqlExecuteDT function is overly complex. Everything you need is already available via DataTable.Load.

public static DataTable SqlExecuteDT(string spname, Dictionary<string, object> parameters = null, DBName dBName = DBName.DBCONN)
{
    using var conn = new SqlConnection(DBConnection.GetConnectionString(dBName));
    using var cmd = new SqlCommand(spname, conn);
    cmd.CommandTimeout = Timeout;
    cmd.CommandType = System.Data.CommandType.StoredProcedure;

    if (parameters != null)
    {
        foreach (var kvp in parameters)
            cmd.Parameters.AddWithValue(kvp.Key, kvp.Value ?? DBNull.Value);
    }

    conn.Open();
    using SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);

    var table = new DataTable();
    table.Load(reader);
    return table;
}

And your outer loop makes no sense at all: each iteration just overwrites dt, so only the last batch survives, and nothing in the loop passes startRow or endRow to the query. It should be just:

var dt = _dataService.SqlExecuteDT(SomeProcName);
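
That said, if the real goal is exporting millions of rows, any approach that buffers them all in a DataTable will use gigabytes of memory regardless of how the table is filled. A memory-flat alternative is to stream each row from the SqlDataReader straight to the output file. Below is a rough sketch of that idea writing to CSV; the method name, connection string, and the lack of CSV quoting/escaping are simplifications, not a drop-in implementation:

```csharp
using System.IO;
using Microsoft.Data.SqlClient;

public static void ExportToCsv(string connectionString, string spname, string path)
{
    using var conn = new SqlConnection(connectionString);
    using var cmd = new SqlCommand(spname, conn)
    {
        CommandType = System.Data.CommandType.StoredProcedure
    };

    conn.Open();
    using var reader = cmd.ExecuteReader(System.Data.CommandBehavior.SequentialAccess);
    using var writer = new StreamWriter(path);

    // Write the header row from the reader's column names.
    var names = new string[reader.FieldCount];
    for (int i = 0; i < reader.FieldCount; i++)
        names[i] = reader.GetName(i);
    writer.WriteLine(string.Join(",", names));

    // Only one row's values are held in memory at a time.
    var values = new object[reader.FieldCount];
    while (reader.Read())
    {
        reader.GetValues(values);
        writer.WriteLine(string.Join(",", values)); // NB: no CSV escaping here
    }
}
```

For a real .xlsx export the same streaming shape applies, but you would need a writer that also streams, e.g. the Open XML SDK's SAX-style OpenXmlWriter, rather than a library that builds the whole workbook in memory.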