Hello, I’m looking for a way to parse really big CSV files (1 GB and up) with PowerShell. Import-Csv works, but it's quite slow. What would be an alternative way to do this?
Right now I’m doing this (it runs inside a loop over the files in a folder, which is where $_.FullName, $folder and $name come from):
$newcsv = Import-Csv -Path $_.FullName -Encoding Default -Delimiter ";" `
    -Header date,0,user,1,folder,session,2,pc,3,4 |
    Where-Object { ($_.date -like "2015*") -and ($_.session -like "opensession") }

# Keep only the last space-separated token of the pc field, with the '(m)' marker trimmed off
foreach ($row in $newcsv) {
    $row.pc = $row.pc.Split(" ")[-1].Trim('(m)')
}

$newcsv | Select-Object user,pc -Unique |
    Export-Csv -Delimiter ";" -NoTypeInformation -Path "$folder\Unique\unique$name.csv" -Encoding Default
$newcsv | Select-Object date,user,folder,session,pc |
    Export-Csv -Delimiter ";" -NoTypeInformation -Path "$folder\NotUnique\notunique$name.csv" -Encoding Default
}   # closes the outer loop over the files
And it takes about 30 minutes on a fairly powerful laptop (i7, 16 GB RAM, 240 GB SSD) to parse an 800 MB file.
I came across Microsoft.VisualBasic.FileIO.TextFieldParser, but I have no clue what to do with it and can't find anything relevant. I have no background in coding and can't seem to figure this one out.
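From what I can piece together from the .NET docs, using TextFieldParser from PowerShell would look roughly like the sketch below. This is only a guess and not tested on my data; $path stands in for the full path to one CSV, and the column indexes are taken from the -Header order in my script above (0 = date, 2 = user, 5 = session, 7 = pc).

# Rough, untested sketch: read the CSV one line at a time instead of loading it all
Add-Type -AssemblyName Microsoft.VisualBasic

$parser = New-Object Microsoft.VisualBasic.FileIO.TextFieldParser($path)
$parser.TextFieldType = [Microsoft.VisualBasic.FileIO.FieldType]::Delimited
$parser.SetDelimiters(";")

while (-not $parser.EndOfData) {
    $fields = $parser.ReadFields()   # string[] with one entry per column
    # keep only 2015 rows with an open session, same filter as my Where-Object above
    if ($fields[0] -like "2015*" -and $fields[5] -like "opensession") {
        # process the row here ($fields[2] = user, $fields[7] = pc) instead of keeping everything in memory
    }
}
$parser.Close()

Would something like that actually be faster than Import-Csv for files this size, or is there a better approach?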