by michaeljc70 at 2013-02-23 07:52:36
I am writing a script to send an email notification when certain files change. There is a .CSV that has the file path, filespec (the file is in a specific directory, but the filename can vary due to a timestamp added to the filename, so wildcards are used), and email address.
I estimate there are 50 directories containing around 10k files that will be the source I am checking. It is a shared (network) drive. The files in the source directories don’t change a lot, so not many will have changed in any given interval. There are probably about 500 files in the CSV that will need notifications sent.
The notification doesn’t have to be instant, it can be done every 15 minutes or so.
My question is, what is the best way to do this?
I originally wrote this using Get-ChildItem. It would wait 15 minutes, do a GCI filtered to files whose LastWriteTime was within the last 15 minutes, and then see if any matched what was in the notify list. This was very slow; just getting the list of all files changed in the last 15 minutes took too long.
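For reference, a minimal sketch of that polling approach might look like the following. The share path, SMTP details, and CSV column names (Path, FileSpec, Email) are assumptions, not the actual names in my script:

```powershell
# Sketch of the GCI polling approach. Column names, paths, and SMTP
# settings below are placeholders.
$watchList = Import-Csv 'C:\watch\notify.csv'   # assumed columns: Path, FileSpec, Email
$cutoff    = (Get-Date).AddMinutes(-15)

# One recursive scan of the share, filtered to recently written files --
# this full scan is the slow part on a 10k-file network share
$changed = Get-ChildItem -Path '\\server\share' -Recurse -File |
    Where-Object { $_.LastWriteTime -gt $cutoff }

# Match changed files against the watch list (wildcards allowed in FileSpec)
foreach ($entry in $watchList) {
    $hits = $changed | Where-Object {
        $_.DirectoryName -eq $entry.Path -and $_.Name -like $entry.FileSpec
    }
    foreach ($hit in $hits) {
        Send-MailMessage -To $entry.Email -From 'alerts@example.com' `
            -Subject "File changed: $($hit.Name)" `
            -SmtpServer 'smtp.example.com'
    }
}
```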
I then used events with the FileSystemWatcher. This was fast. The issues I had were duplicate events (well documented), and that if the machine running the script goes down (for an upgrade, whatever), those events will never be triggered and the emails won’t be sent.
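For context, a stripped-down sketch of that watcher setup, with a naive time-window de-duplication for the duplicate Changed events, might be (the share path and 5-second window are assumptions):

```powershell
# Sketch of the FileSystemWatcher approach with simple de-duplication.
# The share path and the 5-second window are placeholder choices.
$fsw = New-Object System.IO.FileSystemWatcher '\\server\share'
$fsw.IncludeSubdirectories = $true
$fsw.NotifyFilter = [System.IO.NotifyFilters]'LastWrite, FileName'

# Event -Action blocks run in their own scope, so the de-dup table is
# kept in the global scope to be visible inside the handler
$global:lastSeen = @{}

Register-ObjectEvent -InputObject $fsw -EventName Changed -Action {
    $path = $Event.SourceEventArgs.FullPath
    $now  = Get-Date
    if ($global:lastSeen.ContainsKey($path) -and
        ($now - $global:lastSeen[$path]).TotalSeconds -lt 5) { return }  # duplicate
    $global:lastSeen[$path] = $now
    # ... match $path against the CSV watch list and queue an email ...
}
$fsw.EnableRaisingEvents = $true
```

This drops the second of the back-to-back Changed events a single save often raises, but it does nothing about events missed while the machine is down.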
I read WMI would work, but I’ve heard that it is slow too.
Maybe GetFiles is a little faster than GCI. I am wondering if I had the wrong approach in starting from all files that changed in the last 15 minutes across the source and then checking whether a file I was looking for was among them. I could instead start with my list of files to watch, do a GetFiles every 15 minutes on each file in the list, and then check the LastWriteTime. I’m not sure if that would be faster, as I would be doing 500 GetFiles calls, but they would be more targeted (each against a specific child directory).
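The targeted variant I have in mind would be something like this sketch (again, the CSV column names and paths are assumptions):

```powershell
# Sketch of the targeted alternative: resolve each CSV entry directly
# instead of scanning the whole share. Column names are placeholders.
$watchList = Import-Csv 'C:\watch\notify.csv'   # assumed columns: Path, FileSpec, Email
$cutoff    = (Get-Date).AddMinutes(-15)

foreach ($entry in $watchList) {
    # Wildcard resolution is confined to one directory, so each lookup stays cheap
    $spec = Join-Path $entry.Path $entry.FileSpec
    Get-ChildItem -Path $spec -ErrorAction SilentlyContinue |
        Where-Object { $_.LastWriteTime -gt $cutoff } |
        ForEach-Object {
            Send-MailMessage -To $entry.Email -From 'alerts@example.com' `
                -Subject "File changed: $($_.Name)" `
                -SmtpServer 'smtp.example.com'
        }
}
```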
by DonJ at 2013-02-24 04:53:45
FileSystemWatcher. It’s low-overhead, and honestly you should consider using Visual Studio to write a small “file watcher” service that runs continuously, rather than forcing the shell to do so. WMI is definitely not the way. You could couple this with keeping a list of processed files, and then scanning for new ones each time the service starts. That would compensate for an outage of the machine running it.
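One way to sketch that catch-up-on-startup idea: persist a snapshot of each file’s last processed write time, and diff the share against it when the service starts. The state file path and CSV layout here are assumptions:

```powershell
# Sketch of the startup catch-up scan: compare the share against a
# persisted snapshot of already-processed files. Paths are placeholders.
$statePath = 'C:\watch\state.csv'
$state = @{}
if (Test-Path $statePath) {
    Import-Csv $statePath |
        ForEach-Object { $state[$_.FullName] = [datetime]$_.LastWriteTime }
}

$current = Get-ChildItem -Path '\\server\share' -Recurse -File
foreach ($file in $current) {
    if (-not $state.ContainsKey($file.FullName) -or
        $state[$file.FullName] -lt $file.LastWriteTime) {
        # New, or modified while the watcher was down -> send the notification here
        $state[$file.FullName] = $file.LastWriteTime
    }
}

# Write the snapshot back so the next startup can diff against it
$state.GetEnumerator() |
    Select-Object @{n='FullName';e={$_.Key}},
                  @{n='LastWriteTime';e={$_.Value.ToString('o')}} |
    Export-Csv $statePath -NoTypeInformation
```

After the catch-up scan, the FileSystemWatcher handles the live events and updates the same snapshot.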