Need help decreasing the script run time

Hello All,

I use the script below to get the details of the user profiles on a profile server that holds a large number of user profile folders. It takes around 6-7 hours to produce the report. Can someone help me change the script so that it executes faster?

The script queries each user profile and gets its size. If the size is greater than 2 GB, the Alert variable shows where the bulk of the data is, such as 'On Profile', 'AppData', or 'On Profile and AppData'; all other cases are marked '####'.

It also gets user details such as mail address, office, and title; if the user doesn't exist, it marks the details as #NA#.

Below is the code.

$startTime = Get-Date
$2gb = 2147483648

Import-Module ActiveDirectory


$css = @"
<center>
<style>
h1, h5, th { text-align: center; font-family: Segoe UI; }
table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
td { font-size: 11px; padding: 5px 20px; color: #000; }
tr { background: #b8d1f3; }
tr:nth-child(even) { background: lightgray; }
tr:nth-child(odd) { background: white; }
</style>
</center>
"@

Function Get-ReadableSize {
    [cmdletbinding()]
    [Alias("GRS")]
    Param (
        [parameter(ValueFromPipeline = $True, ValueFromPipelineByPropertyName = $True)]
        [double]$size
    )

    $postfixes = @("B", "KB", "MB", "GB", "TB", "PB")
    for ($i = 0; $size -ge 1024 -and $i -lt $postfixes.Length; $i++) { $size = $size / 1024 }
    return "{0} {1}" -f [System.Math]::Round($size, 2), $postfixes[$i]
}

Function Get-folderSize{

[cmdletbinding()]
param(
    [Parameter(Mandatory = $false)]
    [Alias('Path')]
    [String[]]
    $BasePath = 'C:\temp',        
    [Parameter(Mandatory = $false)]
    [Alias('User')]
    [String[]]
    $FolderName = 'all',
    [Parameter()]
    [String[]]
    $OmitFolders
)

#Get a list of all the directories in the base path we're looking for.
if ($folderName -eq 'all') {

    $allFolders = Get-ChildItem $BasePath -Directory -Force | Where-Object {$_.FullName -notin $OmitFolders}

}
else {

    $allFolders = Get-ChildItem $basePath -Directory -Force | Where-Object {($_.BaseName -like $FolderName) -and ($_.FullName -notin $OmitFolders)}

}

#Create array to store folder objects found with size info.
[System.Collections.ArrayList]$folderList = @()

#Go through each folder in the base path.
ForEach ($folder in $allFolders) {

    #Clear out the variables used in the loop.
    $fullPath = $null        
    $folderObject = $null
    $folderObject2 = $null
    $folderSize = $null
    $folderSizeInMB = $null
    $folderSizeInGB = $null
    $folderBaseName = $null

    #Store the full path to the folder and its name in separate variables
    $fullPath = $folder.FullName
    $folderBaseName = $folder.BaseName     

    Write-Verbose "Working with [$fullPath]..."            

    #Get folder info / sizes
    $folderSize = Get-Childitem -Path $fullPath -Recurse -Force -ErrorAction SilentlyContinue | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue       
    $fullprofsize = $foldersize.Sum
    $SUM = $fullprofsize|GRS
    
    $value = $sum.split(" ")[0] -as [double]
    $unit = $sum.split(" ")[1]



    try {
        # One AD query retrieves every property we need, instead of querying
        # the same user twice. -ErrorAction Stop makes the failure catchable.
        $user = Get-ADUser -Identity $folderBaseName -Properties UserPrincipalName, GivenName, Surname, Office, Title -ErrorAction Stop

        $UPN       = $user.UserPrincipalName
        $GivenName = $user.GivenName
        $Surname   = $user.Surname
        $Office    = $user.Office
        $Title     = $user.Title

        $AppDsize    = Get-ChildItem -Path "\\Server\D$\ProfileShare\$($folderBaseName)\UPM_Profile\AppData" -Recurse -Force -ErrorAction SilentlyContinue |
            Measure-Object -Property Length -Sum
        $AppDataSize = $AppDsize.Sum
        $AppData     = $AppDataSize | GRS
    }
    catch [Microsoft.ActiveDirectory.Management.ADIdentityNotFoundException] {
        $UPN = "#NA#"
        $GivenName = $Surname = $Office = $Title = "#NA#"
        $AppData = "####"
        $AppDataSize = 0
    }
    catch {
        $UPN = "####"
        $GivenName = $Surname = $Office = $Title = "####"
        $AppData = "####"
        $AppDataSize = 0
    }

    # Work out where the bulk of the data lives BEFORE building the output
    # object; otherwise Alert carries the previous iteration's value.
    $ActProfSize = $fullprofsize - $AppDataSize

    if ($AppDataSize -gt $2gb -and $ActProfSize -gt $2gb) {
        $status = "On Profile and AppData"
    }
    elseif ($AppDataSize -gt $2gb) {
        $status = "AppData"
    }
    elseif ($ActProfSize -gt $2gb) {
        $status = "On Profile"
    }
    else {
        $status = "####"
    }

    #Here we create a custom object that we'll add to the array
    $folderObject = [PSCustomObject]@{
        FolderName = $folderBaseName
        UPN        = $UPN
        Size       = $SUM
        Size_Value = $value
        Size_Unit  = $unit
        AppData    = $AppData
        Alert      = $status
        GivenName  = $GivenName
        Surname    = $Surname
        Office     = $Office
        Title      = $Title
    }

    Write-Host "$folderBaseName ---- $UPN --- $SUM - $AppData - $status"

    #Add the object to the array
    $folderList.Add($folderObject) | Out-Null

}

#Return the array of folder objects.
return $folderList
}

$Arr1 = Get-folderSize -BasePath "\\Server\D$\ProfileShare" -FolderName All

I think it is far beyond the scope of this or any forum to refactor or optimize a whole script for you. You should start by measuring the individual steps in your script, identifying the longest-running parts, and trying to make them faster one by one.
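For example, `Measure-Command` can time each stage separately so you know where the hours actually go (the user name and share path below are placeholders, not from your environment):

```powershell
# Time the AD lookup and the folder-size scan independently.
$adTime = Measure-Command {
    Get-ADUser -Identity 'SomeUser' -Properties Title
}
$sizeTime = Measure-Command {
    Get-ChildItem '\\Server\D$\ProfileShare\SomeUser' -Recurse -Force -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum
}
"AD lookup: $($adTime.TotalMilliseconds) ms; size scan: $($sizeTime.TotalSeconds) s"
```

You will almost certainly find the recursive size scans dwarf everything else.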

Without digging too deep into your code, here are some general tips:
From a technical point of view, filesystem operations are the most expensive and time-consuming ones. You should try to minimize them and never query the same directory twice or more.

It is a bad idea to run these tasks remotely against a UNC path. Either use PowerShell remoting with Invoke-Command or run the script directly on the file server. That alone will speed up your script without changing any code.
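A minimal sketch of the remoting approach (the server name `FileServer01` and local path `D:\ProfileShare` are assumptions; substitute your own):

```powershell
# Run the scan on the file server itself so Get-ChildItem reads local disks
# instead of walking the whole share over SMB.
$profileSizes = Invoke-Command -ComputerName 'FileServer01' -ScriptBlock {
    Get-ChildItem 'D:\ProfileShare' -Directory -Force | ForEach-Object {
        [PSCustomObject]@{
            FolderName = $_.BaseName
            SizeBytes  = (Get-ChildItem $_.FullName -Recurse -Force -ErrorAction SilentlyContinue |
                          Measure-Object -Property Length -Sum).Sum
        }
    }
}
```

The size data comes back as deserialized objects; you can then do the AD lookups and HTML report locally.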

On top of that, PowerShell is quite slow with filesystem operations. Robocopy is way faster.
Here is a function I wrote some time ago utilizing robocopy to determine the size of a folder including its subfolders and files. Either use it as it is or use it as inspiration for your own code.
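(The original function isn't reproduced in this thread; the sketch below only illustrates the standard robocopy `/L` listing trick, with a hypothetical function name and dummy destination path.)

```powershell
function Get-FolderSizeRobocopy {
    param([Parameter(Mandatory)][string]$Path)

    # /L       list only - nothing is copied to the dummy destination
    # /E       include subdirectories (even empty ones)
    # /BYTES   report sizes as raw bytes in the summary
    # /NFL /NDL /NJH /NC  suppress per-file and per-directory output
    # /XJ      skip junction points to avoid infinite recursion
    $summary = robocopy $Path "$env:TEMP\__robocopy_null__" /L /E /BYTES /NFL /NDL /NJH /NC /XJ /R:0 /W:0

    # The job summary contains a line like "   Bytes :   123456789   ..."
    foreach ($line in $summary) {
        if ($line -match 'Bytes\s*:\s*(\d+)') { return [int64]$Matches[1] }
    }
    return 0
}
```

Because robocopy walks the tree in native code and returns a single summary, this avoids creating a FileInfo object for every file the way `Get-ChildItem -Recurse` does.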

Olaf,

Wonderful script! Adding that to my library :slight_smile:

Rahul20,

To add on to Olaf's points, pipelining large amounts of data is almost always a time killer. For example, your calls to Get-ChildItem are piped into Where-Object when it would be better to use a -Filter on Get-ChildItem. Only parse the data you need to parse, not everything AND what you need to parse. See the about_Pipelines documentation.
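Concretely, the `BaseName -like $FolderName` clause in your Get-folderSize function could be handled by the filesystem provider itself (a sketch using your existing variable names; the `$OmitFolders` check still needs Where-Object since -Filter only matches names):

```powershell
# -Filter is applied by the provider before objects are created,
# instead of piping every directory through Where-Object.
$allFolders = Get-ChildItem $BasePath -Directory -Force -Filter $FolderName |
    Where-Object { $_.FullName -notin $OmitFolders }
```

For `-FolderName 'all'` you would skip the -Filter entirely, as your existing `if` branch already does.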
