PowerShell script running for a few days

I have a PowerShell script that has been running for about 4 days, and it’s not progressing, but PowerShell still says “running”. I have about 50 folders, and it finished 4 of them, so I guess it’s stuck on the 5th? I let it run over the weekend, and now it’s Wednesday, and it still has not progressed past the 4th folder.

This is the script I’m running. I’m not sure how large these folders are, so I’m assuming it’s the size of the folders that is making it take so long. Is there anything I can do to make it run more efficiently?

$DCS_location = 'J:\DCS'
$folders = Get-ChildItem -Path $DCS_location -Force | Where-Object {$_.Extension -eq ""}
#$folders = Get-ChildItem -Path $DCS_location -Force | Where-Object {$_.name -eq "Bobcat"}

foreach ($f in $folders) {    
    $folder_full_name = Get-ChildItem $f.FullName -recurse -Force
    $folder_size = ($folder_full_name | Measure-Object -property length -sum).Sum

    $final_folder = $f.FullName + "\final"

    Write-Host $f.Name ('{0:N0}' -f $folder_size) "bytes"

    if ((Test-Path -Path $final_folder) -eq $true) {
        $sub_folders = Get-ChildItem -Path $final_folder -Force | Where-Object {$_.Extension -eq ""}         
        $final_sub_folders = Get-ChildItem $final_folder -Recurse -Force              
        $final_folder_size = ($final_sub_folders | Measure-Object -property length -sum).Sum
        
        Write-Host "`t`t`tFinal" ('{0:N0}' -f $final_folder_size) "bytes"
        
        foreach ($s in $sub_folders) {
            $number_of_files = (Get-ChildItem $s.FullName -Recurse -Force).count
            $size = ((Get-ChildItem $s.FullName -Recurse -Force | Measure-Object -property length -sum)).Sum

            Write-Host "`t`t`t`t`t$s $number_of_files files," ('{0:N0}' -f $size) "bytes"
        } 

        $data_folder = $final_folder + "\data"
        $db_folder = $final_folder + "\db"
        $data = Get-ChildItem $data_folder -Recurse -Force
        $db = Get-ChildItem $db_folder -Recurse -Force

        $difference = Compare-Object -ReferenceObject $data -DifferenceObject $db              

        # Number of differing files and their combined size for this folder
        $counter = $difference.Count
        $bytes = 0
        if ($difference) {
            $bytes = ($difference.InputObject | Measure-Object -Property Length -Sum).Sum
        }
        Write-Host "`t`t`t"$counter "files," ('{0:N0}' -f $bytes) "bytes in Data directory NOT in DB"
    }
}

If you have large files / resources, then waits are to be expected.

  • Writing to the screen also slows things down.
  • Loops are slow in general, and on top of the looping you are then comparing the information.
  • That adds up to a whole lot of disk thrashing and a lot of calculation (see the sketch just after this list for one way to cut down the repeated scans).
  • If you are going across the wire to a remote server / host, you have that to deal with.
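
As a rough illustration of cutting that down, here is a minimal sketch (the J:\DCS path and the “final” subfolder come from your script; the log location and variable names are just my assumptions). It walks each top-level folder once, reuses that single listing for every size calculation, and appends the results to a log file instead of writing to the console:

$log = Join-Path $env:TEMP 'dcs-scan.log'          # assumed log location

foreach ($f in Get-ChildItem -Path 'J:\DCS' -Directory -Force) {
    # Enumerate the whole tree once and reuse that listing for every calculation
    $all   = Get-ChildItem -Path $f.FullName -Recurse -Force -File
    $total = ($all | Measure-Object -Property Length -Sum).Sum

    # Size of the "final" subtree, taken from the same listing (no second scan)
    $final     = $all | Where-Object { $_.FullName -like "$($f.FullName)\final\*" }
    $finalSize = ($final | Measure-Object -Property Length -Sum).Sum

    # Log to a file instead of the screen
    "{0}`t{1:N0}`t{2:N0}" -f $f.Name, $total, $finalSize | Add-Content -Path $log
}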

If you have a bulk of resources and you have no clue how large they are or how long any one of them will take, then look at PSJobs / parallel processing; that way you can break the work up.
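
For example, a minimal sketch with Start-Job (the folder path comes from your script; the work inside the job is a placeholder for whatever per-folder logic you actually need):

$folders = Get-ChildItem -Path 'J:\DCS' -Directory -Force

$jobs = foreach ($f in $folders) {
    Start-Job -Name $f.Name -ArgumentList $f.FullName -ScriptBlock {
        param($path)
        # Placeholder per-folder work; the measuring / comparing logic goes here
        $files = Get-ChildItem -Path $path -Recurse -Force -File
        [pscustomobject]@{
            Folder = $path
            Files  = $files.Count
            Bytes  = ($files | Measure-Object -Property Length -Sum).Sum
        }
    }
}

$jobs | Wait-Job | Receive-Job

Kicking off all 50 at once can make disk contention worse, so in practice you would throttle how many run at a time (the sketch at the end of this answer shows one way).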

Personally, I’d stop this job, and rethink how to approach it.

  • Do a simple scan to get sizes.
  • Create an array of those.
  • Create background jobs based on the size objects discovered (see the sketch at the end of this answer).
  • Use a progress monitor to tell you what is going on, or set a trigger to send an email about the state, to avoid unnecessary screen output, especially since you may not be in front of the screen.
  • If you have the PowerShell console clicked into select mode, hit Enter. Select mode pauses execution until you finish selecting content on the console.

You can tell because the title bar reads “Select Administrator: Windows PowerShell”.
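
Putting the steps above together, one possible shape for it would be the sketch below. This is only a sketch: the throttle limit, the sort-by-size order, the Write-Progress reporting, and the report file path are my assumptions, and the per-folder work is a placeholder for your own logic.

# 1. Simple scan to get sizes
$stats = foreach ($f in Get-ChildItem -Path 'J:\DCS' -Directory -Force) {
    [pscustomobject]@{
        Path  = $f.FullName
        Bytes = (Get-ChildItem $f.FullName -Recurse -Force -File |
                 Measure-Object -Property Length -Sum).Sum
    }
}

# 2. Kick off background jobs, largest folders first, only a few at a time
$throttle = 4                                        # assumed limit
foreach ($item in ($stats | Sort-Object Bytes -Descending)) {
    while ((Get-Job -State Running).Count -ge $throttle) { Start-Sleep -Seconds 5 }
    Start-Job -Name (Split-Path $item.Path -Leaf) -ArgumentList $item.Path -ScriptBlock {
        param($path)
        # The real per-folder work (the compare logic from your script) goes here
    } | Out-Null
}

# 3. Monitor progress instead of printing every detail to the screen
while (Get-Job -State Running) {
    $done = (Get-Job -State Completed).Count
    Write-Progress -Activity 'DCS scan' -Status "$done of $($stats.Count) folders complete"
    Start-Sleep -Seconds 30
}
Get-Job | Receive-Job | Out-File "$env:TEMP\dcs-report.txt"   # or feed the summary to Send-MailMessage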