I have been using the following for some time; however, the logs I need to scan have grown immensely and it no longer completes. PowerShell uses around 4 GiB and processes endlessly. The logs are roughly 3.6 GB in total, and Select-String would hit on 90% of the lines. Ideas for optimizing this would be greatly appreciated.
There is a lot of piping and looping here, and piping is expensive. I recommend providing an example log and the expected output; with that information, others can suggest alternate approaches. As one example, Group-Object already returns Name and Count, so the Select is not needed and just passes all of that data along to another command to process.
I tested a bit more, and it seems it is hanging on the Group-Object command. `uniq -c` from Linux can complete this in less than a minute with the same data.
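A minimal sketch of one way to sidestep Group-Object, assuming the goal is a line-frequency count like `uniq -c`: stream the file with a StreamReader and tally into a plain hashtable, which avoids building a group object per distinct line. The sample file written to TEMP here is a stand-in for the real 3.6 GB log; swap in your own path.

```powershell
# Hashtable counting sketch (PowerShell analogue of `sort | uniq -c | sort -rn`).
# The tiny sample file below is only a placeholder for the real log.
$logPath = Join-Path ([System.IO.Path]::GetTempPath()) 'uniq-sketch.log'
Set-Content -Path $logPath -Value @('alpha', 'beta', 'alpha', 'alpha')

$counts = @{}
$reader = [System.IO.StreamReader]::new($logPath)
while ($null -ne ($line = $reader.ReadLine())) {
    # $null + 1 evaluates to 1, so unseen keys start counting from zero
    $counts[$line] = $counts[$line] + 1
}
$reader.Dispose()

# Print Count/Name pairs, most frequent first
$counts.GetEnumerator() |
    Sort-Object Value -Descending |
    ForEach-Object { '{0,6} {1}' -f $_.Value, $_.Key }
```

Because the hashtable holds one entry per *distinct* line rather than one object per input line, memory stays proportional to the number of unique values instead of the log size.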
This is an example of what each row contains; fields can be missing entirely or empty:
2020-11-12 06:00:00 [12345] INFO DisplayName=philip, last, ExAddress=IMCEAEX-_O=CONTOSO_OU=First+20Administrative+20Group_cn=Recipients_cn=philip@contoso.com, SmtpAddress=philip@contoso.com
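Given that row shape, a hedged sketch of counting by one field while tolerating missing or empty values: `switch -Regex -File` streams the log line by line without loading it into memory, and rows where the field is absent or empty simply fail the match. The choice of SmtpAddress as the key and the regex itself are assumptions based on the sample row, not the asker's actual query.

```powershell
# Sketch: tally rows per SmtpAddress, skipping rows where the field is
# empty or missing. Sample data mirrors the row format shown above.
$logPath = Join-Path ([System.IO.Path]::GetTempPath()) 'smtp-sketch.log'
Set-Content -Path $logPath -Value @(
    '2020-11-12 06:00:00 [12345] INFO DisplayName=philip, last, SmtpAddress=philip@contoso.com'
    '2020-11-12 06:00:01 [12346] INFO DisplayName=ann, SmtpAddress=ann@contoso.com'
    '2020-11-12 06:00:02 [12347] INFO DisplayName=bob, SmtpAddress='   # empty field, no match
)

$counts = @{}
# switch -Regex -File reads the file line by line; $Matches holds the captures
switch -Regex -File $logPath {
    'SmtpAddress=([^,\s]+)' {
        $addr = $Matches[1]
        $counts[$addr] = $counts[$addr] + 1
    }
}

$counts.GetEnumerator() | Sort-Object Value -Descending
```

The `[^,\s]+` capture stops at the next comma or whitespace, so trailing fields on the same row do not leak into the key.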