Script Help with getting proper info to a file

Get-ChildItem C:\NocTest\ -file filename.xlsx -Recurse | Where-Object {$_.LastWriteTime -and "filename"} | Out-File C:\NocTest\Logs\filename.log


Directory: C:\NocTest

Mode                LastWriteTime         Length Name
----                -------------         ------ ----
-a----        3/15/2021   8:59 AM           8437 Filename.xlsx

I need this to display just the LastWriteTime and the name of the file. The Out-File also needs to rewrite each time. When I change it to Add-Content, I lose the timestamp.

Ultimately, the end goal is to grab a file from a folder and enter it into a .log. Once that file is processed, it will be moved to a folder called process. This folder then needs the LastWriteTime monitored to make sure it is not older than 9 days. If within the 9 days, it will append "success" to the .log file. If over 9 days, it appends "failed".

Hopefully I understood your requirements correctly. This looks for all files with .xlsx extension inside the C:\NocTest\ folder and displays the last write time and file name. At the same time, the same output is written to C:\NocTest\Logs\filename.log.

Get-ChildItem C:\NocTest\ -File *.xlsx -Recurse | Select-Object LastWriteTime,Name | Tee-Object -FilePath C:\NocTest\Logs\filename.log

This does the same for a single file.

Get-ChildItem C:\NocTest\ -File filename.xlsx -Recurse | Select-Object LastWriteTime,Name | Tee-Object -FilePath C:\NocTest\Logs\filename.log

Thank you, I'll have to do some research on Tee-Object. I was not aware of that command. The joys of being a noob.

Tee-Object displays the output on screen, and at the same time you can also send it to a file or variable. More info here.
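One detail worth knowing for your "rewrite each time" requirement: by default Tee-Object overwrites the target file on each run, and it has an -Append switch if you ever want the opposite. A small sketch (paths taken from your earlier example):

```powershell
# Show the results on screen AND overwrite the log each run (default behavior)
Get-ChildItem C:\NocTest\ -File *.xlsx |
    Select-Object LastWriteTime, Name |
    Tee-Object -FilePath C:\NocTest\Logs\filename.log

# Add -Append only if you want to keep the output from prior runs in the log:
# ... | Tee-Object -FilePath C:\NocTest\Logs\filename.log -Append
```

Since you said the log needs to be rewritten each time, the default overwrite behavior is what you want.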

Almost to my goal here. Not sure if it's a logic issue or just how it's formed. I can get the timestamp of the original file C:\test.xlsx and put it in the log file. However, when I move it over to C:\processed, it puts that time in the test.log too. What I need is the timestamp for when the file comes into C:\Test to go into the log.

Next, within 12 hours it should be moved into C:\Test\Processed, and this is where a 9-day counter should begin. If after 9 days a new file hasn't replaced the file in C:\Test\Processed\test.xlsx, it should append "failed"; if within the 9 days, "success". It should also write over the file rather than add to it.

In a nutshell, I have a NOC sensor that is monitoring for the strings "success" and "fail" to trigger our NOC. The timestamp is just for the logs, for when we want to see what date it got lost.

$file= Get-ChildItem -Path C:\NocTest -File test.xlsx  

$checkdays= $file | Where {$_.LastWriteTime -le (Get-Date).AddDays(9)}

$processed= Get-ChildItem -Path C:\NocTest\Processed -File test.xlsx

$logpath= Get-ChildItem -Path C:\NocTest\Logs\

$logfile= Get-ChildItem -Path C:\NocTest\Logs\ -File NocTest.log

Get-ChildItem C:\NocTest\ -File test.xlsx -Recurse | Select-Object LastWriteTime,Name | Out-File -FilePath C:\NocTest\Logs\NocTest.log 

#If ($logpath -eq $nothing) {

 #     $createlog

 #   }

if ($processed -eq $checkdays) {
        Add-Content -Path C:\NocTest\Logs\NocTest.log "success"
}
else {
        Add-Content -Path C:\NocTest\Logs\NocTest.log "failed"
}

What's confusing to me about your code is you keep using the Get-ChildItem cmdlet to retrieve a SINGLE file. This:

Get-ChildItem -Path C:\NocTest -File test.xlsx

will only return one file, C:\NocTest\test.xlsx. If that's what you want, then I'd recommend using

Get-Item -Path C:\NocTest\test.xlsx

If you actually want ALL .xlsx files, you should use a wildcard like @ferc recommended.

Get-ChildItem C:\NocTest\ -File *.xlsx

Even when you use

Get-ChildItem C:\NocTest\ -File test.xlsx -Recurse

you will only get any occurrence of the specific file titled test.xlsx in C:\NocTest or any subdirectory.
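A quick way to see what -Recurse actually matches (hypothetical: this assumes a copy of test.xlsx also exists under a subfolder such as Processed):

```powershell
# Matches only files literally named test.xlsx, but in C:\NocTest
# AND every subdirectory underneath it
Get-ChildItem -Path C:\NocTest -File test.xlsx -Recurse |
    Select-Object FullName, LastWriteTime
```

If C:\NocTest\Processed also contains a test.xlsx, both copies show up here, which may be why the Processed file's timestamp was leaking into your log earlier.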

One other thing for consideration: any user with access to Google and write privileges on the file can very easily manipulate these timestamps. Google search "time stomping". Here's an example in PowerShell:

(Get-Item -Path C:\NocTest\test.xlsx).LastWriteTime = [datetime]"Mar 3, 2021"

Thank you for the advice. The reason for using Get-ChildItem so much is I was handed a project with a deadline, and my PowerShell experience extends to Google searches and recently started CBTNuggets.

I understand being handed a project while new to the language. I would bet the person who wrote the script originally used much of the same Google-and-try-it methodology, which is OK, but I think it really helps to understand exactly what your script is doing. Many times, taking the time to understand actually saves time over just trying things until something works. One logic issue I did see is this:

$checkdays= $file | Where {$_.LastWriteTime -le (Get-Date).AddDays(9)}

This is comparing the LastWriteTime to 9 days in the future. Specifically, it will return $true if the LastWriteTime is less than or equal to 9 days in the future. Unless you have a modified DeLorean that can time travel into the future, ALL files would meet this criterion. If you want to find files in the directory that are 9 days old or older, the logic would be this:

$_.LastWriteTime -le (Get-Date).AddDays(-9)

or this

$_.LastWriteTime.AddDays(9) -le (Get-Date)
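Putting the corrected comparison into your success/failed logic, a minimal sketch might look like this (paths, file name, and the 9-day window are taken from your earlier posts; treat it as a starting point, not a drop-in script):

```powershell
$logFile   = 'C:\NocTest\Logs\NocTest.log'

# Get-Item, since we are after one specific file; SilentlyContinue
# so a missing file yields $null instead of an error
$processed = Get-Item -Path 'C:\NocTest\Processed\test.xlsx' -ErrorAction SilentlyContinue

if ($null -eq $processed) {
    # Nothing has been moved into Processed yet
    Set-Content -Path $logFile -Value 'failed'
}
elseif ($processed.LastWriteTime -le (Get-Date).AddDays(-9)) {
    # File has sat in Processed for 9+ days without being replaced
    Set-Content -Path $logFile -Value 'failed'
}
else {
    Set-Content -Path $logFile -Value 'success'
}
```

Set-Content overwrites the log on every run, which matches your requirement that the sensor only ever see the current state. If you also want the timestamp line in the same file, write it first with Out-File and then append the status with Add-Content.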

I would recommend spending time in the actual documentation. I would start by looking at

Get-Help Get-Item -Online
Get-Help Get-ChildItem -Online
Get-Help Add-Content -Online

Get-Help works every time. Just ask the god of thunder.

Oh, I know the person did. That was my code, pieced together from all the searching and a bit of past programming knowledge I've had. I didn't realize how powerful PowerShell really is compared to the old CMD.
