I’m working on a PowerShell script that runs nightly and can take up to 90 minutes to complete. I want to improve how I handle logging, especially when it comes to troubleshooting and monitoring progress in real-time.
What’s your go-to approach for logging in these cases?
Are you using Start-Transcript, custom log files, or structured logs with timestamps?
Do you log every step or only exceptions and warnings?
Any tips for performance impact or handling large log files?
Would love to hear how the pros handle this. Appreciate any insights!
You’re asking for opinions. I’d rather use the category “Open Discussions” for topics like this.
Definitely NOT Start-Transcript. Most of the time I use the SCCM log format, since we use SCCM in our environment anyway. It comes with a very handy log viewer that keeps refreshing even while the log is actively being written to, and of course every entry gets a timestamp.
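To make that concrete, here is a minimal sketch of a function that writes entries in the CMTrace/SCCM log format. The function name, log path, and component are just illustrative, not the actual function used in that environment, and the format string is written from memory of the documented CMTrace layout:

```powershell
# Illustrative sketch of a CMTrace/SCCM-format logger -- not the poster's
# actual implementation; path and component name are made up.
function Write-CMTraceLog {
    param(
        [Parameter(Mandatory)][string]$Message,
        [string]$LogFile   = 'C:\Logs\NightlyJob.log',
        [string]$Component = 'NightlyJob',
        [ValidateSet('Info','Warning','Error')][string]$Severity = 'Info'
    )

    # CMTrace encodes severity as 1 = Info, 2 = Warning, 3 = Error
    $type = @{ Info = 1; Warning = 2; Error = 3 }[$Severity]
    $now  = Get-Date
    $bias = [System.TimeZoneInfo]::Local.GetUtcOffset($now).TotalMinutes
    $biasString = if ($bias -ge 0) { '+{0:000}' -f $bias } else { '-{0:000}' -f [math]::Abs($bias) }

    # One entry per line, in the layout the CMTrace log viewer parses
    $entry = '<![LOG[{0}]LOG]!><time="{1}{2}" date="{3}" component="{4}" context="" type="{5}" thread="{6}" file="">' -f `
        $Message, $now.ToString('HH:mm:ss.fff'), $biasString, $now.ToString('MM-dd-yyyy'), $Component, $type, $PID

    Add-Content -Path $LogFile -Value $entry
}

Write-CMTraceLog -Message 'Nightly run started'
Write-CMTraceLog -Message 'Could not reach file share' -Severity Error
```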
That depends pretty much on the tasks you want to log. If I have a large number of files to copy or move, I would not log every single file. Instead I'd log a summary at the end, or just a success or error message. If I do need the individual files logged, I'd rather write them to a separate log file.
In general I’ll log the start and the end of a specific action. The end message contains the success or error info.
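As an illustration of that pattern (Write-Log below is just a placeholder for whatever log function you use, and the paths are made up):

```powershell
# Summary-style logging for a bulk copy: one start entry, one end entry,
# failures collected instead of logged per file. Write-Log is a placeholder.
$files  = Get-ChildItem -Path 'D:\Export' -File
$failed = [System.Collections.Generic.List[string]]::new()

Write-Log "Copy of $($files.Count) files to \\server\share started"

foreach ($file in $files) {
    try {
        Copy-Item -Path $file.FullName -Destination '\\server\share' -ErrorAction Stop
    }
    catch {
        $failed.Add($file.Name)    # remember the failure, don't log it yet
    }
}

if ($failed.Count -eq 0) {
    Write-Log "Copy finished successfully ($($files.Count) files)"
}
else {
    Write-Log "Copy finished, $($failed.Count) of $($files.Count) files failed: $($failed -join ', ')"
}
```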
You should avoid writing output directly to the console. If you need console output for debugging purposes, use Write-Verbose or Write-Debug instead. That way you can turn on console output when you need it, without polluting the console when you don't.
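For example, in an advanced function the verbose messages stay silent until you explicitly ask for them:

```powershell
# Verbose output that only appears when -Verbose is passed
function Invoke-NightlyStep {
    [CmdletBinding()]
    param([string]$Name)

    Write-Verbose "Starting step '$Name'"
    # ... actual work goes here ...
    Write-Verbose "Finished step '$Name'"
}

Invoke-NightlyStep -Name 'Cleanup'            # no console output
Invoke-NightlyStep -Name 'Cleanup' -Verbose   # verbose messages shown
```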
If you're not using anything in particular yet, you might take a look at some of the readily available modules with advanced logging capabilities. There are general-purpose ones like PowerShell Framework and more specialized ones like the Powershell Logging Module.
Whether another approach is better depends on your requirements and your personal preference.
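For instance, assuming "PowerShell Framework" refers to the PSFramework module, its logging looks roughly like this; the provider and parameter names here are sketched from memory, so check the module's documentation before relying on them:

```powershell
# Rough sketch using PSFramework; verify provider and parameter names
# against the module's documentation.
Install-Module -Name PSFramework -Scope CurrentUser   # one-time setup

# Route messages to a log file in addition to the default in-memory log
Set-PSFLoggingProvider -Name 'logfile' -Enabled $true -FilePath 'C:\Logs\NightlyJob-%Date%.csv'

Write-PSFMessage -Level Important -Message 'Nightly job started'
Write-PSFMessage -Level Verbose   -Message 'Processing batch 1 of 12'
```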
I usually feel like Start-Transcript is the cheap and ugly solution you use when you don't have a better one. And since we use SCCM in our environment, which comes with the log file viewer, I use a custom function to write log entries in the SCCM format.
Transcript logs everything, even stuff I don't care about, it doesn't put a timestamp on the individual entries, and it mixes the commands and their output together.
I prefer to log only what I think is worth it and in a format I like.
Thanks a ton to everyone who chimed in, really appreciate you sharing your experiences! It’s super helpful to hear how others are handling logging in real-world scripts, especially the idea of using SQL or custom log functions instead of relying on Start-Transcript. I’ve picked up a lot just from this thread. Thanks again for taking the time to help out!
Generally, for scripts running unattended, I have a function that writes to the server's event log.
If the script can fail multiple times during the run (a server not responding to ping, a missing or misspelled UPN), I stash those errors in a list or PSCustomObject and then, at the end of the run, write them out as a single error event with the details in the event body. If an error is a showstopper I end the script and log that as a failed run, and if the script runs without issue I just log a success.
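A minimal sketch of that collect-then-report pattern, using the Windows PowerShell Write-EventLog cmdlet; the event source 'NightlyJob', the event IDs, and the server list are made up for illustration, and the source has to be registered once from an elevated prompt:

```powershell
# Collect errors during the run, write a single event at the end.
# Requires (once, elevated): New-EventLog -LogName Application -Source 'NightlyJob'
$servers  = 'SRV01', 'SRV02', 'SRV03'                  # assumed input
$failures = [System.Collections.Generic.List[string]]::new()

foreach ($server in $servers) {
    if (-not (Test-Connection -ComputerName $server -Count 1 -Quiet)) {
        $failures.Add("No ping response from $server")
        continue
    }
    # ... per-server work ...
}

if ($failures.Count -gt 0) {
    Write-EventLog -LogName Application -Source 'NightlyJob' -EventId 1001 -EntryType Error `
        -Message ("Run finished with $($failures.Count) error(s):`n" + ($failures -join "`n"))
}
else {
    Write-EventLog -LogName Application -Source 'NightlyJob' -EventId 1000 -EntryType Information `
        -Message 'Run finished successfully'
}
```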
I've seen logging functions that write an event for every iteration of a foreach/ForEach-Object loop, and that's a good way of filling your event logs with useless information.
I was actually at PSConfEU this year. Unfortunately that presentation clashed with another one I really wanted to see, and I haven't gotten around to watching the recording yet. Will have to rectify that ASAP.