Need advice on the best and fastest way to do this.
Let’s say I have a transaction file with 1,000 computer names in it, one per line.
Then I have a master file with 20,000 lines; each line contains a computer name along with other attributes.
I need to read the transaction file to get the computer name, then read the master file to get an attribute if the record matches on the computer name.
I have tried a foreach inside a foreach and it runs far too long. I have also tried Compare-Object and didn’t get far.
What is the best way to accomplish what I am after?
Olaf
March 16, 2020, 11:33am
Please help us to help you. Post the code you already have along with a few representative lines of both files you mentioned above, and format all of it as code using the code tags (“PRE”). You can read the instructions in the very first post of this forum: Read Me Before Posting! You’ll be Glad You Did! Thanks.
Transaction file contains:
server2
server7
server9
Master contains:
server1 attribute1
server2 attribute2
server3 attribute3
server4 attribute4
server5 attribute5
server6 attribute6
server7 attribute7
server8 attribute8
server9 attribute9
server10 attribute10
I look up servers 2, 7 and 9 and get back attribute2, attribute7 and attribute9.
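For a lookup like this, the standard fast pattern is to index the master data in a hashtable so each transaction name becomes a constant-time probe instead of a scan. A minimal sketch, assuming the space-separated pairs above live in hypothetical files named master.txt and transaction.txt:

# Build a name -> attribute index from the master file (assumes "name attribute" pairs)
$index = @{}
foreach ($line in Get-Content .\master.txt) {
    $name, $attr = $line -split '\s+', 2
    $index[$name] = $attr
}

# Each lookup is now a single hash probe instead of a scan over 20K lines
foreach ($name in Get-Content .\transaction.txt) {
    if ($index.ContainsKey($name)) { $index[$name] }
}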
Olaf
March 16, 2020, 11:41am
Hmmm … your code and a few lines of your files, formatted as code, please. Please read the instructions I linked for you. Please do not create a new post; edit your existing first one instead.
Thanks.
No code yet; that’s why I started this thread.
#region Make sample input
@'
"ComputerName"
"Server2"
"Server7"
"Server9"
'@ | Out-File .\TransactionFile1.csv

@'
"ComputerName","Attribute1"
"Server1","Value1"
"Server2","Value2"
"Server7","Value7"
"Server8","Value8"
'@ | Out-File .\MasterFile1.csv
#endregion

#region Input
$TransactionList = Import-Csv .\TransactionFile1.csv
$MasterList = Import-Csv .\MasterFile1.csv
#endregion

#region Process
foreach ($ComputerName in $TransactionList.ComputerName) {
    # Use -eq for an exact match; -Match is a regex test, so 'Server1' would also match 'Server10'
    if ($Found = $MasterList | Where-Object ComputerName -eq $ComputerName) {
        "$ComputerName found in Master file, Attribute1 value: $($Found.Attribute1)"
    } else {
        "$ComputerName not found in Master file"
    }
}
#endregion
One foreach loop with Where-Object to match…
This will work as long as the record counts in the input files are small enough to load into memory…
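If the Where-Object scan becomes the bottleneck at larger sizes, a hashtable built once from $MasterList turns each lookup into a hash probe. A sketch against the same sample files, assuming ComputerName is unique in the master:

# Index the master list once: ComputerName -> Attribute1
$Index = @{}
foreach ($Row in $MasterList) {
    $Index[$Row.ComputerName] = $Row.Attribute1
}

foreach ($ComputerName in $TransactionList.ComputerName) {
    if ($Index.ContainsKey($ComputerName)) {
        "$ComputerName found in Master file, Attribute1 value: $($Index[$ComputerName])"
    } else {
        "$ComputerName not found in Master file"
    }
}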
The test below took 3 seconds to process 20K rows.
# Create test files
$tfile = [System.Collections.ArrayList]@(1..1000 | ForEach-Object { "Server$_" })
$tfile | Add-Content .\tfile.txt
$tfile = Get-Content .\tfile.txt

$mfile = [System.Collections.ArrayList]@()
1..20000 | ForEach-Object {
    $mfile.Add([PSCustomObject]@{ Server = "Server$_"; Attribute = "attribute$_" }) | Out-Null
}
$mfile | Export-Csv -NoTypeInformation -Path .\mfile.csv
$mfile = Import-Csv .\mfile.csv

# Process csv file: emit every master row whose Server appears in the transaction list
switch ($mfile)
{
    {$tfile -contains $_.Server} {$_}
}
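One caveat with the switch version: $tfile -contains is itself a linear scan, so every master row pays for up to 1,000 comparisons. A HashSet makes the membership test constant-time; a sketch reusing the same $tfile and $mfile variables:

# Constant-time membership test instead of rescanning $tfile for every row;
# OrdinalIgnoreCase mirrors -contains, which is case-insensitive in PowerShell
$set = [System.Collections.Generic.HashSet[string]]::new(
    [string[]]$tfile,
    [System.StringComparer]::OrdinalIgnoreCase
)
switch ($mfile)
{
    {$set.Contains($_.Server)} {$_}
}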
I was doing the same thing as @random commandline, but I’m not sure about the switch vs. just a Where-Object:
Measure-Command {
    $tmp = $mfile | Where-Object { $tfile -contains $_.Server }
}

Measure-Command {
    $tmp2 = switch ($mfile)
    {
        {$tfile -contains $_.Server} {$_}
    }
}
Where-Object:

Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 3
Milliseconds      : 194
Ticks             : 31941452
TotalDays         : 3.69692731481481E-05
TotalHours        : 0.000887262555555556
TotalMinutes      : 0.0532357533333333
TotalSeconds      : 3.1941452
TotalMilliseconds : 3194.1452

switch:

Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 3
Milliseconds      : 360
Ticks             : 33600789
TotalDays         : 3.88898020833333E-05
TotalHours        : 0.00093335525
TotalMinutes      : 0.056001315
TotalSeconds      : 3.3600789
TotalMilliseconds : 3360.0789
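Both runs above pay for a linear -contains scan against $tfile on each of the 20K master rows. For comparison, Group-Object -AsHashTable builds a one-time index keyed on Server, after which each transaction name is a single hash probe. A sketch reusing the same test variables; note that the hashtable values are collections of the grouped rows:

# Build a Server -> rows index once, then probe it per transaction name
$index = $mfile | Group-Object -Property Server -AsHashTable -AsString
$result = foreach ($name in $tfile) {
    $index[$name]    # the matching row(s), or $null when the name is absent
}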