I need to clear around 75k legacy user profile folders spread across my SAN. So far I’ve got a script that deletes each folder… albeit slowly. (Robocopy is lethal!)
So to try and speed things up I’m running 10 PowerShell scripts in parallel. I’ve split the input file into batches of 1,000 (using PowerShell) and pointed each script at one of the smaller files. The challenge is that some user profiles take hours to process and some take seconds, so some scripts finish in hours while others run for days. It’s a constant effort to keep them all working.
It would be wonderful if I could have one input file and have all my scripts grab the next folder from the ‘queue’ and process it.
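One way to sketch that single-queue idea, assuming all the workers run on the same machine: guard the queue file with a named mutex so only one script at a time reads and removes the top line. The path, mutex name, and robocopy invocation below are placeholders, not a tested implementation.

```powershell
# Hypothetical sketch: each worker atomically claims the next line from a shared
# queue file by taking a cross-process named mutex, reading the first line, and
# rewriting the file without it. Paths/names are assumptions.
$queueFile = '\\san\cleanup\queue.txt'   # assumption: one folder path per line
$mutex = New-Object System.Threading.Mutex($false, 'Global\ProfileCleanupQueue')

function Get-NextFolder {
    $null = $mutex.WaitOne()
    try {
        $lines = @(Get-Content $queueFile | Where-Object { $_ })
        if ($lines.Count -eq 0) { return $null }   # queue drained
        $next = $lines[0]
        $rest = if ($lines.Count -gt 1) { $lines[1..($lines.Count - 1)] } else { @() }
        Set-Content -Path $queueFile -Value $rest  # write queue back minus claimed line
        return $next
    }
    finally { $mutex.ReleaseMutex() }
}

while ($folder = Get-NextFolder) {
    # existing robocopy-based delete goes here, e.g. mirror an empty dir:
    robocopy 'C:\EmptyDir' $folder /MIR /NJH /NJS | Out-Null
    Remove-Item $folder -Recurse -Force
}
```

Note the named mutex only coordinates processes on one host; if the scripts were spread across several machines you’d need a file-system-level claim instead (see below for one such pattern).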
The alternative would be to split the big file into very small batches (10?) and have each PowerShell script move the next file in the queue into its own folder and process it there.
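The move-into-its-own-folder variant can resolve contention on its own, because a same-volume Move-Item is a rename and only one worker can win it; any script that loses the race just catches the error and tries the next file. A rough sketch, with placeholder folder names:

```powershell
# Hypothetical sketch of the claim-by-move idea: tiny batch files sit in a
# shared 'pending' folder; a worker claims one by moving it into a folder of
# its own. Whoever wins the rename owns the batch.
$pending = '\\san\cleanup\pending'
$workDir = Join-Path '\\san\cleanup\claimed' "$env:COMPUTERNAME-$PID"
New-Item -ItemType Directory -Path $workDir -Force | Out-Null

while ($true) {
    $batch = Get-ChildItem $pending -Filter '*.txt' | Select-Object -First 1
    if (-not $batch) { break }    # nothing left to claim
    try {
        $claimed = Join-Path $workDir $batch.Name
        Move-Item -Path $batch.FullName -Destination $claimed -ErrorAction Stop
    }
    catch { continue }            # another worker claimed it first; try again

    foreach ($folder in Get-Content $claimed) {
        # existing per-folder delete goes here
    }
    Remove-Item $claimed          # batch done
}
```

With batch files of ~10 paths the claim overhead stays negligible, and no lock or mutex is needed, so this version also works with workers on multiple machines.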
Each approach has a challenge with contention - I just wondered if anyone else had done something similar and how they approached it.
Kind regards,
Phil Worsley (University of Leeds)