Hey all,
A new issue I am encountering when working with large data sets is my server running out of memory, after which the script grinds to a slow pace.
The script I am running is as follows:
$SourceFile = Import-Csv -Delimiter ";" -Path
foreach ($User in $SourceFile)
{
    $HomeServer = Get-QADObject -Identity $User.DistinguishedName -IncludeAllProperties
    if ($HomeServer.msExchHomeServerName.Length -eq 0)
    {
        $User | Select-Object DistinguishedName, userPrincipalName, mail, ObjectClass |
            Export-Csv -NoTypeInformation -Delimiter ";" -Path 'Forest_msExchHomeServerName_Empty.csv' -Append
    }
}
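For context, here is a sketch of the streaming variant I have been considering: instead of loading the whole CSV into $SourceFile and pulling every attribute with -IncludeAllProperties, it processes one record at a time through the pipeline and fetches only the single attribute being tested. The input path and the -IncludedProperties parameter name are assumptions on my part (I believe the Quest cmdlets accept it, but I have not verified this on my version):

```powershell
# Assumed lower-memory rewrite: stream records one at a time instead of
# materialising the whole CSV, and fetch only the attribute we test.
Import-Csv -Delimiter ';' -Path 'users.csv' |   # 'users.csv' is a placeholder path
    ForEach-Object {
        # -IncludedProperties (assumed parameter) limits retrieval to one attribute
        $obj = Get-QADObject -Identity $_.DistinguishedName -IncludedProperties msExchHomeServerName
        if (-not $obj.msExchHomeServerName) {
            # Pass only the columns we want downstream to Export-Csv
            $_ | Select-Object DistinguishedName, userPrincipalName, mail, ObjectClass
        }
    } |
    Export-Csv -NoTypeInformation -Delimiter ';' -Path 'Forest_msExchHomeServerName_Empty.csv'
```

Because everything stays in one pipeline, only one record should be in flight at a time rather than 40,000, and Export-Csv no longer needs -Append.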
I am currently scanning around 40,000 users, and my server (which has 16 GB of memory) has maxed out, so the script is now running extremely slowly. I suspect it has to do with my variable management, and that maybe I need to clear the variables, but I am not too sure.
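If it is variable buildup, one thing I thought of trying is dropping references inside the loop and periodically asking .NET to collect, roughly like this (just a sketch, I have not measured whether it helps):

```powershell
foreach ($User in $SourceFile)
{
    # ... existing lookup/export logic ...
    $HomeServer = $null        # release the reference to the fetched object
}
[System.GC]::Collect()         # request a garbage collection pass
```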
How can I fine-tune this script so that it doesn't consume so much memory?
And does anyone have any good articles on performance tweaks for PowerShell scripting?