I’m currently looping through multiple files, e.g. `foreach ($file in $files) { # do something }`.
Since I would like to get an idea of whether the script is processing each file at a given point in time, I’ve included `Write-Host -ForegroundColor Yellow $file.FullName`.
What I don’t know is whether this has a performance impact when the script loops through more than 10 million files of varying size, e.g. 1 KB to 10 TB.
Assuming it does impact performance, what alternatives are there?
Generating output will slow you down … a lot … skip it. Try it with 1,000 files and measure it, then compare that against running the script without console output to see for yourself.
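One way to run that comparison is with `Measure-Command`. This is only a sketch: the path, the file count, and the empty loop bodies are placeholders you would replace with your actual per-file work.

```powershell
# Grab a sample of files (hypothetical path; adjust to your environment).
$files = Get-ChildItem -Path 'C:\temp' -File | Select-Object -First 1000

# Time the loop WITH console output.
$withOutput = Measure-Command {
    foreach ($file in $files) {
        Write-Host -ForegroundColor Yellow $file.FullName
        # ... actual per-file work goes here ...
    }
}

# Time the same loop WITHOUT console output.
$silent = Measure-Command {
    foreach ($file in $files) {
        # ... actual per-file work goes here ...
    }
}

"With Write-Host : $($withOutput.TotalMilliseconds) ms"
"Without output  : $($silent.TotalMilliseconds) ms"
```

The difference between the two timings, multiplied out to 10 million files, gives a rough sense of what the `Write-Host` call would cost at full scale.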
If it’s about speed: none. Every step that does not contribute to your actual task will slow you down.
For example, if you wanted to show a percentage, you would first need to figure out how many files there are in order to calculate it.
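If some progress indication turns out to be worth the cost, a common compromise is to pre-count the files once and then report only every N iterations, so the reporting overhead is paid rarely instead of on every file. A sketch, with a placeholder path and an arbitrary interval of 10,000:

```powershell
# Pre-count once so a percentage can be calculated (hypothetical path).
$files = Get-ChildItem -Path 'C:\temp' -Recurse -File
$total = $files.Count
$i = 0

foreach ($file in $files) {
    $i++
    # Only touch the console every 10,000 files to keep reporting cheap.
    if ($i % 10000 -eq 0) {
        $percent = [math]::Round(($i / $total) * 100, 1)
        Write-Progress -Activity 'Processing files' `
                       -Status "$i of $total ($percent %)" `
                       -PercentComplete $percent
    }
    # ... actual per-file work goes here ...
}

# Clear the progress bar when done.
Write-Progress -Activity 'Processing files' -Completed
```

Note that `Write-Progress` has measurable overhead of its own, which is why it is throttled here rather than called per file.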
Thanks @Olaf. Have I understood you correctly that the recommendation is to let the script run to completion without any indication of progress?