Dan, thanks a lot. The only thing is there are multiple IDs. In the example I posted there is only ID 3, but there will be multiple IDs, i.e. ID 1, ID 2.
Exception calling "OpenText" with "0" argument(s): "The process cannot access the file 'C:\Program Files (x86)\Traccar\logs\tracker-server.log' because it is being used
by another process."
At C:\Users\jwall\Documents\JWallCreations\PullLog.ps1:6 char:17
+ $log | foreach {$file = $_.OpenText()
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException
It will only let it run once, then I get this. Argh! I need to be able to cycle through this log every minute, or rather I would like to.
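One pattern that sometimes helps with "being used by another process" errors is opening the file with an explicit FileShare of ReadWrite, so the read does not conflict with the writer's lock. This is a hedged sketch, not tested against Traccar specifically; the path is the one from your error message, and whether Traccar's own lock permits shared reads is an assumption:

```powershell
# Sketch: open the log for reading while allowing the writing process
# (Traccar) to keep its own handle open. FileShare.ReadWrite is the key part.
$path = 'C:\Program Files (x86)\Traccar\logs\tracker-server.log'
$stream = [System.IO.File]::Open($path,
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)
$reader = New-Object System.IO.StreamReader($stream)
try {
    while (-not $reader.EndOfStream) {
        $line = $reader.ReadLine()
        if ($line -match 'id') { $line }
    }
}
finally {
    $reader.Close()   # always release the handle so the next run can reopen
}
```

The finally block matters here: if the reader is never closed, your own script can hold the file open between runs and cause the same error on the next pass.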
I was not able to reproduce your error. I was able to loop over both a single file and multiple files successfully. Also, while the loops were running, I was able to open the text file(s) manually, so I guess I am overlooking something. I created loops using the following while loop and by piping an array (1..10). I also changed the $log variable to "Get-ChildItem -Path . -Filter log*".
while ($true) {$count++; "Loop#: $count"; & "C:\path\to\pullLog.ps1"}
I was impressed with your code, and I'm wondering if you could help me understand it.
If I do this :
$a = {@()}
Then look at the type name, it's:
System.Management.Automation.ScriptBlock
When I saw @() I assumed it would be an array.
Would you mind explaining the following code in a bit more detail so I can learn from it?
Parse log for matching ids
$col = {@()}.Invoke() - What is this invoking?
$log | foreach {$file = $_.OpenText()
while ($file.EndOfStream -eq $false){
$line = $file.ReadLine() ; If ($line -match 'id'){
$col.Add($line)}}} - How is this working?
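For reference, a quick way to inspect what {@()}.Invoke() returns (a sketch you can run in any PowerShell session):

```powershell
# {@()} is a script block: code that is defined but not yet executed.
# .Invoke() runs it, and the empty array it returns comes back wrapped
# in a System.Collections.ObjectModel.Collection[psobject].
$col = {@()}.Invoke()
$col.GetType().FullName

# Unlike a plain array, this collection has a real .Add() method,
# so it can grow without being copied each time.
$col.Add('first line')
$col.Count
```

So the script block itself is typed as ScriptBlock, which is why $a = {@()} surprised you; it's only after Invoke() that you get a collection.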
OK, I understand how Curtis' $hash table works.
If "$hash['key'] = value" is used to add to a hashtable, it overwrites the key's value if the key already exists. If "$hash.Add('key','value')" is used, an error occurs if the key already exists:
Exception calling "Add" with "2" argument(s):
"Item has already been added. Key in dictionary: 'key' Key being added: 'key'"
At line:1 char:1
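The difference between the two can be shown in a few lines (a small sketch with made-up keys and values):

```powershell
$hash = @{}
$hash['key'] = 'first'
$hash['key'] = 'second'    # indexer assignment silently overwrites
$hash['key']               # now holds 'second'

$hash.Add('key', 'third')  # .Add() throws: the key already exists
```

If you don't know whether a key is already present, $hash.ContainsKey('key') lets you check before calling Add().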
Graham, this is what I like to do if I am parsing log files. In the past, I have parsed IIS log files that were megabytes in size and 100,000+ lines long. Using Get-Content was too slow and used a lot of system memory.
This is an example of how fast adding to collection is compared to array.
# Array and Collection Speed Example col = 210 milliseconds; array = 4.6 seconds
$arraytest = measure-command {$test = @(); 1..10000 | foreach {$test += $_}}
$coltest = measure-command {$test = {@()}.Invoke(); 1..10000 | foreach {$test.Add($_)}}
"Array Complete in {0:N3} seconds" -f $arraytest.TotalSeconds
"Collection Complete in {0:N3} seconds" -f $coltest.TotalSeconds
# Arrays have a fixed size and cannot be appended to without using the
# += assignment operator. Each use of += creates a new array, copies the
# old elements into it, and discards the previous array.
# Invoking the script block turns the empty array into a Collection[psobject]
$col = {@()}.Invoke()
# Use .NET System.IO.StreamReader to open a file and read each line
# Add matching lines to collection
$log | foreach {
    $file = $_.OpenText()
    while ($file.EndOfStream -eq $false) {
        $line = $file.ReadLine()
        If ($line -match 'id') { $col.Add($line) }
    }
    $file.Close()   # close the reader so the file handle is released
}
The advantage of the hash approach is that it weeds out the unwanted records as it parses the log file. In the end there are only the 3 lines to parse into PSCustomObjects, instead of all of the other records you don't really want anyway. You find the ones you want and parse just those.
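As a final step, the matched lines in $col can be turned into PSCustomObjects. The log line format below is purely hypothetical (the real Traccar line layout isn't shown in this thread), so treat the "key=value" splitting as an assumption to adapt:

```powershell
# Hypothetical log format assumed for illustration: "id=3 lat=33.7 lon=-84.4"
$col = {@()}.Invoke()
$col.Add('id=3 lat=33.7 lon=-84.4')
$col.Add('id=1 lat=40.7 lon=-74.0')

$records = $col | foreach {
    # Split each "key=value" pair into a hashtable, then cast to PSCustomObject
    $props = @{}
    foreach ($pair in ($_ -split '\s+')) {
        $k, $v = $pair -split '=', 2
        $props[$k] = $v
    }
    [PSCustomObject]$props
}
$records | Format-Table
```

Because the filtering already happened during the read loop, this final conversion only touches the handful of lines you actually care about.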