Monitoring a log file in PowerShell 2.0

Hi,

I have to use PS 2.0 to monitor a log file that is being written to. I know that in PS 3.0 we have -Tail with Get-Content, but the servers in the environment I am working in are still Windows Server 2008 R2 and have not been upgraded. Furthermore, at this stage I do not see them being upgraded just yet.
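(For reference, on PS 3.0 it would look something like this, although I can't use it here. This is just a quick demo against a stand-in file rather than the real Tomcat log.)

```powershell
# Quick demo with a stand-in file (the real log is catalina.2015-11-16.txt)
Set-Content -Path .\demo.log -Value @(
    '16/11/2015 4:48:52 PM org.apache.catalina.startup.Catalina start',
    'INFO: Server startup in 128658 ms'
)

# PowerShell 3.0+ only: -Tail reads just the last N lines of the file
$m = Get-Content .\demo.log -Tail 1 | Select-String -Pattern "INFO: Server startup"
$m -ne $null   # $true once the startup line is the last line
```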

I am looking for a particular expression in an Apache Tomcat log file that is written after startup completes.

Format goes something like this:

16/11/2015 4:48:52 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 128658 ms

I thought I might create a variable out of the expression:

$m = Get-Content .\catalina.2015-11-16.txt | Select-Object -Last 1 | Select-String -Pattern "INFO: Server startup"

and then insert that into a do while loop like so:

do { Write-Host "Searching text file" } while ($m -eq $null)

I removed the text "INFO: Server startup" from the text file to test the above evaluation with Write-Host (yes, I killed a puppy, but I was just testing the logic :)), and sure enough it started spitting out the text "Searching text file".

To test further, I then re-added that text to the log file and saved it to see if the condition would be re-evaluated. (In my mind this simulated the log file being written to.)

What I found is that the loop continued to run despite the change in the log file (after save).

This led me to believe that it pulls the text file into memory once and then evaluates the logic from there.

So my question is: is there any way to get the logic to periodically re-check the log file for changes and then evaluate the condition again, perhaps after a predetermined interval?

Kind regards,

Wei-Yen

You can keep track of the previous length of the file, and skip ahead to that position when reading it in the future. This is how the -Wait parameter works behind the scenes on Get-Content. Here’s some quick PowerShell code to demonstrate what I mean. It has a little bit of code to handle the conditions where a file has been deleted or rewritten with less content than the last time it was read (the parts where $lastOffset is set back to zero), but otherwise is pretty straightforward.

$path = "$pwd\test.txt"
$lastOffset = 0

while ($true)
{
    Start-Sleep -Milliseconds 100

    # File deleted (e.g. log rotation): reset and wait for it to reappear
    if (-not (Test-Path -LiteralPath $path -PathType Leaf))
    {
        $lastOffset = 0
        continue
    }

    $item = Get-Item -LiteralPath $path

    # File rewritten with less content than before: start over from the top
    if ($item.Length -lt $lastOffset)
    {
        $lastOffset = 0
    }

    # New content has been appended since the last pass
    if ($item.Length -gt $lastOffset)
    {
        $stream = $item.OpenRead()
        $stream.Position = $lastOffset
        $reader = New-Object System.IO.StreamReader($stream)
        while (-not $reader.EndOfStream)
        {
            # Emit each new line to the pipeline
            $reader.ReadLine()
        }

        # Remember where we stopped for the next pass
        $lastOffset = $stream.Position

        $reader.Close()
        $stream.Close()
    }
}

Hi David,

Thank you very much for your time, David. I had to read the code several times over to understand it.

As I understand it, it reads each line while the log file has not reached its end, through this method:

$reader.ReadLine()

Would I be able to add a criterion to say move out of the loop when it sees a particular string?

if ($item.Length -gt $lastOffset)
{
    $stream = $item.OpenRead()
    $stream.Position = $lastOffset
    $reader = New-Object System.IO.StreamReader($stream)
    while (-not $reader.EndOfStream)
    {
        $reader.ReadLine()
    }

    $lastOffset = $stream.Position

    $reader.Close()
    $stream.Close()
}

Sure, you’d just add a bit of code to analyze the lines, and then break out of the loops. Because there are nested loops here, you’d need to use a label for the break statement, like so:

$path = "$pwd\test.txt"
$lastOffset = 0

# Label the outer loop so the inner read loop can break out of both at once
:outerLoop while ($true)
{
    Start-Sleep -Milliseconds 100

    if (-not (Test-Path -LiteralPath $path -PathType Leaf))
    {
        $lastOffset = 0
        continue
    }

    $item = Get-Item -LiteralPath $path
    if ($item.Length -lt $lastOffset)
    {
        $lastOffset = 0
    }

    if ($item.Length -gt $lastOffset)
    {
        $stream = $reader = $null
        try
        {
            $stream = $item.OpenRead()
            $stream.Position = $lastOffset
            $reader = New-Object System.IO.StreamReader($stream)
            while (-not $reader.EndOfStream)
            {
                $line = $reader.ReadLine()
                if ($line -match 'Some Pattern You Want To Find')
                {
                    # Match found: exit both loops
                    break outerLoop
                }
            }

            $lastOffset = $stream.Position
        }
        finally
        {
            # Release the file handles even if we break out or an error is thrown
            if ($reader) { $reader.Close() }
            if ($stream) { $stream.Close() }
        }
    }
}

Hi Dave,

Many thanks for your help with this…

I ran this script against a txt file and found that it was reading the entire file. The reason I ask is that the log file may already contain that text pattern from a previous date, and that will mess with the logic.

In the pipeline I was using, I used Select-Object -Last 1 to limit what PowerShell was seeing.

I looked around the web to see if we can limit the lines read by this method, but I may be using the wrong keywords to search.

Many thanks again David, I am grateful for your help. (I am learning a lot).

@wei-yen-tan, the way the script is currently written, it needs to run constantly in order to pick up just the latest updates. The offset value is stored in a variable, and each time the script ends that variable is discarded. So when the script starts again, it starts back at the beginning of the log with $lastOffset = 0.

If you need the offset to persist across multiple runs of the script, you need to store the offset value outside of the script. You could write the current offset to a file, the registry, a database, etc. Then, when the script launches, read that value back in at the beginning to start where the last run left off.
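As a minimal sketch of the file-based option (the offset-file name, and seeding the first run from the current file length so pre-existing lines are skipped, are my own assumptions, not part of the scripts above):

```powershell
$path = "$pwd\test.txt"
$offsetFile = "$pwd\test.offset"   # hypothetical companion file holding the last offset

# Read the persisted offset; on the very first run, start at the
# current end of the log so lines from previous dates are skipped
if (Test-Path -LiteralPath $offsetFile) {
    $lastOffset = [long](Get-Content -LiteralPath $offsetFile)
}
elseif (Test-Path -LiteralPath $path) {
    $lastOffset = (Get-Item -LiteralPath $path).Length
}
else {
    $lastOffset = 0
}

# ... run the monitoring loop from the earlier replies here,
#     updating $lastOffset as new content is read ...

# Persist the offset for the next run
Set-Content -LiteralPath $offsetFile -Value $lastOffset
```

The registry or a database would work the same way; a plain text file is just the simplest thing that survives between runs.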