Multiple PowerShell Scripts processing one input file

I need to clear around 75k legacy user profile folders spread across my SAN. So far I’ve got a script that handles deleting each folder… albeit slowly. (Robocopy is lethal!)

So to try and speed things up I run 10 PowerShell scripts in parallel. So far I’ve split the input file into batches of 1,000 (using PowerShell) and pointed each script at one of the smaller files. The challenge is that some user profiles take seconds to process and some take hours, so some scripts finish in hours while others run for days. It’s a constant effort to keep them all working.
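
The splitting step itself is simple enough; roughly like this (the file names and paths here are made up):

```powershell
# Split one big list of folder paths into batch files of 1,000 lines each.
# 'profiles.txt' and the batch file names are made up for illustration.
$paths     = Get-Content -Path 'C:\work\profiles.txt'
$batchSize = 1000
$batch     = 0
for ($i = 0; $i -lt $paths.Count; $i += $batchSize) {
    $end = [Math]::Min($i + $batchSize - 1, $paths.Count - 1)
    $paths[$i..$end] | Set-Content -Path ("C:\work\batch_{0:D3}.txt" -f $batch)
    $batch++
}
```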

It would be wonderful if I could have one input file and have all my scripts grabbing the next folder from the ‘queue’ and processing that.

The alternative would be to split the big file into very small batches (10?) and then have each PowerShell script move the next file in the queue into its own folder and process it there.
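
For that second approach I’m picturing the move itself acting as the claim, since a rename on the same volume only succeeds for one script. Something like this sketch (all paths made up):

```powershell
# Each script claims the next batch file by moving it into its own
# working folder; if the move fails, another script got there first.
# All paths are made up for illustration.
$queueDir = '\\san\cleanup\queue'
$workDir  = "\\san\cleanup\workers\$env:COMPUTERNAME-$PID"
New-Item -ItemType Directory -Path $workDir -Force | Out-Null

while ($true) {
    $next = Get-ChildItem -Path $queueDir -Filter '*.txt' | Select-Object -First 1
    if (-not $next) { break }                    # queue is empty, all done

    try {
        # -ErrorAction Stop turns a lost race into a catchable error
        Move-Item -Path $next.FullName -Destination $workDir -ErrorAction Stop
    }
    catch { continue }                           # lost the race, try the next file

    Get-Content -Path (Join-Path $workDir $next.Name) | ForEach-Object {
        Write-Host "Processing $_"               # real per-folder work goes here
    }
}
```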

Each approach has a challenge with contention. I just wondered if anyone else had done something similar and how they approached it.

Kind regards,
Phil Worsley (University of Leeds)

Phil,
Welcome to the forum. 👋

When you say …

… you’re doing something wrong. robocopy is actually the fastest (built-in) tool when it comes to file system operations.
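
For example, the classic way to purge a big folder tree quickly is to mirror an empty directory over it and then remove the empty shell. A rough sketch, with the paths as placeholders:

```powershell
# Mirror an empty directory over the target: /MIR deletes everything
# that is not in the (empty) source, and the /N* and /NP switches
# suppress per-file logging and progress output, which speeds it up.
# Both paths are placeholders.
$empty  = New-Item -ItemType Directory -Path "$env:TEMP\empty" -Force
$target = '\\san\profiles\olduser123'

robocopy $empty.FullName $target /MIR /NFL /NDL /NJH /NJS /NP | Out-Null
Remove-Item -Path $target -Recurse -Force
```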

In general … depending on the actual task you have to do and the approach you choose, you can use the built-in ForEach-Object -Parallel functionality of PowerShell 7.x to speed up your task.

With the parameter -ThrottleLimit you can limit or increase the number of script blocks running in parallel.
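
A minimal sketch of how that could look for your case, assuming the input file holds one folder path per line (the path and the deletion step are placeholders):

```powershell
# Requires PowerShell 7.x. One input file, no batching: each line is
# handed to one of at most 10 parallel script blocks.
Get-Content -Path 'C:\work\profiles.txt' |
    ForEach-Object -Parallel {
        Write-Host "Deleting $_"
        # Remove-Item -Path $_ -Recurse -Force   # the real work goes here
    } -ThrottleLimit 10
```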

Please read the help completely, including the examples, to learn how to use it.