How to sort 30 million CSV records in PowerShell


I am using an OleDbConnection to sort the first column of a CSV file. The OLEDB query completes successfully for up to 9 million records in about 6 minutes, but when I run it against 10 million records I get the following error message:

Exception calling "ExecuteReader" with "0" argument(s): "The query cannot be completed. Either the size of the query result is larger than the maximum size of a database (2 GB), or there is not enough temporary storage space on the disk to store the query result."

Is there any other solution to sort 30 million records using PowerShell?

Here is my script:

$OutputFile = "D:\Performance_test_data\output1.csv"
$stream = [System.IO.StreamWriter]::new( $OutputFile )

$sb = [System.Text.StringBuilder]::new()
$sw = [Diagnostics.Stopwatch]::StartNew()

$conn = New-Object System.Data.OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source='D:\Performance_test_data\';Extended Properties='Text;HDR=Yes;CharacterSet=65001;FMT=Delimited';")
$cmd=$conn.CreateCommand()
$cmd.CommandText="Select * from 1crores.csv order by col6"

$conn.open()

$data = $cmd.ExecuteReader()

echo "Query has been completed!"
$stream.WriteLine( "col1,col2,col3,col4,col5,col6")

while ($data.read()) 
{ 
  $stream.WriteLine( $data.GetValue(0) +',' + $data.GetValue(1)+',' + $data.GetValue(2)+',' + $data.GetValue(3)+',' + $data.GetValue(4)+',' + $data.GetValue(5))

}
echo "data written successfully!!!"

$stream.close()
$sw.Stop()
$sw.Elapsed

$cmd.Dispose()
$conn.Dispose()

There are 5 answers below.

iRon (Best Answer)

I have added a new answer as this is a completely different approach to tackling this issue.
Instead of creating temporary files (which presumably causes a lot of file opens and closures), you might consider creating an ordered list of indices and then going over the input file (-InputFile) multiple times, each time processing a selective number of lines (-BufferSize = 1Gb; you might have to tweak this "memory usage vs. performance" parameter):

Function Sort-Csv {
    [CmdletBinding()] param(
        [string]$InputFile,
        [String]$Property,
        [string]$OutputFile,
        [Char]$Delimiter = ',',
        [System.Text.Encoding]$Encoding = [System.Text.Encoding]::Default,
        [Int]$BufferSize = 1Gb
    )
    Begin {
        if ($InputFile.StartsWith('.\')) { $InputFile = Join-Path (Get-Location) $InputFile }
        $Index = 0
        $Dictionary = [System.Collections.Generic.SortedDictionary[string, [Collections.Generic.List[Int]]]]::new()
        Import-Csv $InputFile -Delimiter $Delimiter -Encoding $Encoding | Foreach-Object { 
            if (!$Dictionary.ContainsKey($_.$Property)) { $Dictionary[$_.$Property] = [Collections.Generic.List[Int]]::new() }
            $Dictionary[$_.$Property].Add($Index++)
        }
        $Indices = [int[]]($Dictionary.Values | ForEach-Object { $_ })
        $Dictionary = $Null                                     # we only need the sorted index list
    }
    Process {
        $Start = 0
        $ChunkSize = [int]($BufferSize / (Get-Item $InputFile).Length * $Indices.Count / 2.2)
        While ($Start -lt $Indices.Count) {
            [System.GC]::Collect()
            $End = $Start + $ChunkSize - 1
            if ($End -ge $Indices.Count) { $End = $Indices.Count - 1 }
            $Chunk = @{}
            For ($i = $Start; $i -le $End; $i++) { $Chunk[$Indices[$i]] = $i }
            $Reader = [System.IO.StreamReader]::new($InputFile, $Encoding)
            $Header = $Reader.ReadLine()
            $Count = 0                                          # lines of the current chunk found so far
            For ($i = 0; ($Line = $Reader.ReadLine()) -and $Count -lt $ChunkSize; $i++) {
                if ($Chunk.Contains($i)) { $Chunk[$i] = $Line; $Count++ }
            }
            $Reader.Dispose()
            if ($OutputFile) {
                if ($OutputFile.StartsWith('.\')) { $OutputFile = Join-Path (Get-Location) $OutputFile }
                $Writer = [System.IO.StreamWriter]::new($OutputFile, ($Start -ne 0), $Encoding)
                if ($Start -eq 0) { $Writer.WriteLine($Header) }
                For ($i = $Start; $i -le $End; $i++) { $Writer.WriteLine($Chunk[$Indices[$i]]) }
                $Writer.Dispose()
            } else {
                $Start..$End | ForEach-Object { $Header } { $Chunk[$Indices[$_]] } | ConvertFrom-Csv -Delimiter $Delimiter
            }
            $Chunk = $Null
            $Start = $End + 1
        }
    }
}

Basic usage

Sort-Csv .\Input.csv <PropertyName> -Output .\Output.csv
Sort-Csv .\Input.csv <PropertyName> | ... | Export-Csv .\Output.csv

Note that for 1crores.csv it will probably just export the full file in one go unless you set -BufferSize to a lower amount, e.g. 500Kb.
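
For the data from the question, a concrete call might look like this (col6 as the sort property is an assumption taken from the original ORDER BY clause; add -BufferSize to force smaller chunks):

Sort-Csv .\1crores.csv col6 -OutputFile .\output.csv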

Eric Eskildsen

You could try SQLite:

$OutputFile = "D:\Performance_test_data\output1.csv"

$sw = [Diagnostics.Stopwatch]::StartNew()

sqlite3 output1.db '.mode csv' '.import 1crores.csv 1crores' '.headers on' ".output $OutputFile" 'Select * from 1crores order by 最終アクセス日時'

echo "data written successfully!!!"

$sw.Stop()
$sw.Elapsed
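
Each argument after the database name is run by sqlite3 in order (recent sqlite3 builds accept multiple command arguments): the dot-commands configure CSV import and output, and the final SQL statement performs the sort. The original command sorts on a column from the answerer's own test data; a sketch adapted to the question's schema (col6 as the sort key is an assumption from the original query) would be:

sqlite3 output1.db '.mode csv' '.import 1crores.csv 1crores' '.headers on' ".output $OutputFile" 'Select * from 1crores order by col6'
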
bluuf

You can try using this:

$CSVPath = 'C:\test\CSVTest.csv'
$Delimiter = ';'

# list we use to hold the results
$ResultList = [System.Collections.Generic.List[Object]]::new()

# Create a stream (I use OpenText because it returns a streamreader)
$File = [System.IO.File]::OpenText($CSVPath)

# Read and parse the header
$HeaderString = $File.ReadLine()

# Get the properties from the string, replace quotes
$Properties = $HeaderString.Split($Delimiter).Replace('"',$null)
$PropertyCount = $Properties.Count

# now read the rest of the data, parse it, build an object and add it to a list
while ($File.EndOfStream -ne $true)
{
    # Read the line
    $Line = $File.ReadLine()
    # split the fields and replace the quotes
    $LineData = $Line.Split($Delimiter).Replace('"',$null)
    # Create a hashtable with the properties (we convert this to a PSCustomObject later on). I use an ordered hashtable to keep the order
    $PropHash = [ordered]@{}
    # for loop to add the properties and values
    for ($i = 0; $i -lt $PropertyCount; $i++)
    { 
        $PropHash.Add($Properties[$i],$LineData[$i])
    }
    # Now convert the data to a PSCustomObject and add it to the list
    $ResultList.Add($([PSCustomObject]$PropHash))
}

# Close the reader now that the whole file has been read
$File.Close()

# Now you can sort this list using Linq:
Add-Type -AssemblyName System.Linq
# Sort using propertyname (my sample data had a prop called "Name")
$Sorted = [Linq.Enumerable]::OrderBy($ResultList, [Func[object,string]] { $args[0].Name })

Instead of using Import-Csv I've written a quick parser that uses a StreamReader, parses the CSV data on the fly, and puts each row into a PSCustomObject, which is then added to a list.
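
To persist the sorted result, something along these lines should work (the output path is a placeholder; the example assumes the same semicolon delimiter as above):

# Enumerating $Sorted executes the deferred LINQ query and streams the rows to Export-Csv
$Sorted | Export-Csv 'C:\test\CSVTest_sorted.csv' -Delimiter ';' -NoTypeInformation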


iRon

Putting performance aside, and aiming at least for a solution that works (meaning one that doesn't hang due to memory shortage), I would rely on the PowerShell pipeline. The issue, though, is that to sort objects you need to stall the pipeline, as the last object might potentially become the first one.
To resolve this, I would first do a coarse division on the first character(s) of the property concerned. Once that is done, fine-sort each coarse division and append the results:

Function Sort-BigObject {
    [CmdletBinding()] param(
        [Parameter(ValueFromPipeLine = $True)]$InputObject,
        [Parameter(Position = 0)][String]$Property,
        [ValidateRange(1,9)]$Coarse = 1,
        [System.Text.Encoding]$Encoding = [System.Text.Encoding]::Default
    )
    Begin {
        $TemporaryFiles = [System.Collections.SortedList]::new()
    }
    Process {
        if ($InputObject.$Property) {
            $Grain = $InputObject.$Property.SubString(0, $Coarse)
            if (!$TemporaryFiles.Contains($Grain)) { $TemporaryFiles[$Grain] = New-TemporaryFile }
            $InputObject | Export-Csv $TemporaryFiles[$Grain] -Encoding $Encoding -Append
        } else { $InputObject }    # objects with an empty sort property are passed through immediately
    }
    End {
        Foreach ($TemporaryFile in $TemporaryFiles.Values) {
            Import-Csv $TemporaryFile -Encoding $Encoding | Sort-Object $Property
            Remove-Item -LiteralPath $TemporaryFile
        }
    }
}

Usage
(Don't assign the stream to a variable and don't use parentheses, as either would stall the pipeline.)

Import-Csv .\1crores.csv | Sort-BigObject <PropertyName> | Export-Csv .\output.csv
  • If the temporary files still get too big to handle, you might need to increase the -Coarse parameter (see the sketch below).
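
For example, a minimal sketch with a two-character division (col6 as the sort property is again an assumption from the question):

Import-Csv .\1crores.csv | Sort-BigObject col6 -Coarse 2 | Export-Csv .\output.csv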

Caveats (improvement considerations)

  • Objects with an empty sort property will be output immediately.
  • The sort column is presumed to be a (single) string column.
  • I presume the performance is poor (I didn't do a full test on 30 million records, but 10,000 records take about 8 seconds, which suggests roughly 8 hours). Consider replacing native PowerShell cmdlets with .NET streaming methods, buffering/caching file input and output, or parallel processing.

js2010

I downloaded GNU sort.exe from here: http://gnuwin32.sourceforge.net/packages/coreutils.htm (it also requires libiconv2.dll and libintl3.dll from the dependency zip). I basically ran this within cmd.exe; it used a little less than a gig of RAM and took about 5 minutes on a 500 MB file of about 30 million random numbers. sort.exe can also merge already-sorted files with --merge, and you can specify begin and end key positions for sorting with --key. It automatically uses temp files.

.\sort.exe < file1.csv > file2.csv
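
As a sketch of the --key option mentioned above (run from cmd.exe like the example): the field number 6 and the comma separator are assumptions about the input, quoted fields containing commas are not handled, and the header line will be sorted along with the data unless it is handled separately.

sort.exe -t, -k6,6 file1.csv > file2.csv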

It actually works in a similar way to the Windows sort available from the cmd prompt. The Windows sort also has a /+n option to specify the character column at which to start the sort:

sort.exe < file1.csv > file2.csv