WMI Queries take very long time

Hi,

Can you please help me optimize this code? I have around 300 computers stored in a .txt file. I use a for loop to execute the queries below for each computer, store the values in a hashtable, build a custom object, and then export the results to a CSV file. This takes around 25 minutes. Is there a way to make it faster?

$bootupMemory = gwmi -Query "SELECT * FROM Win32_OperatingSystem" -ComputerName $srv
#$cpuLoad = gwmi -Query "SELECT * FROM Win32_Processor" -ComputerName $srv
#$tSessions = gwmi -Query "SELECT * FROM Win32_TerminalService" -ComputerName $srv
$ima = gwmi -Query "SELECT * FROM Win32_Service WHERE name='imaservice'" -ComputerName $srv
$mfcom = gwmi -Query "SELECT * FROM Win32_Service WHERE name='mfcom'" -ComputerName $srv
$ctxPrintMgr = gwmi -Query "SELECT * FROM Win32_Service WHERE name='cpsvc'" -ComputerName $srv
$msmqstatus = gwmi -Query "SELECT * FROM Win32_Service WHERE name='msmq'" -ComputerName $srv

$cDrive = gwmi -Query "SELECT * FROM Win32_LogicalDisk WHERE deviceid='c:'" -ComputerName $srv
$dDrive = gwmi -Query "SELECT * FROM Win32_LogicalDisk WHERE deviceid='d:'" -ComputerName $srv
$loginStatus = gwmi -Query "SELECT loginsenabled, numberofsessions FROM Metaframe_Server" -Namespace root\citrix -ComputerName $srv
$load = gwmi -Query "SELECT * FROM Metaframe_Server_loadlevel" -Namespace root\citrix -ComputerName $srv

The first thing to do is remove the redundant code.

$services = Get-WmiObject Win32_Service
$services | Where-Object { $_.Name -eq 'wuauserv' }
$services | Where-Object { $_.Name -eq 'WwanSvc' }

Two basic problems

The first problem is that you are using Get-WmiObject a number of times against the same machine. Each call to Get-WmiObject involves creating a DCOM link to the remote machine, getting the data, and then closing the DCOM link. That is a slow, expensive process.

The answer is to do one of two things: ideally, create a CIM session to the remote machine and use Get-CimInstance over that session. The alternative is to create a PowerShell remoting session and use it to get the data.

Secondly, you are repeating calls to the same class, specifically Win32_Service and Win32_LogicalDisk. Make ONE call to the class and use filters to control the returned data; you can then allocate the results to variables as required.
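For example, the service and disk queries from the original post could be collapsed into one call per class, along these lines (a sketch reusing the original variable names; adjust the filter list to the services you actually need):

```powershell
# One query returns all four services; filter the results locally
$allServices = Get-WmiObject -Query "SELECT Name, State FROM Win32_Service WHERE name='imaservice' OR name='mfcom' OR name='cpsvc' OR name='msmq'" -ComputerName $srv
$ima         = $allServices | Where-Object { $_.Name -eq 'imaservice' }
$mfcom       = $allServices | Where-Object { $_.Name -eq 'mfcom' }
$ctxPrintMgr = $allServices | Where-Object { $_.Name -eq 'cpsvc' }
$msmqstatus  = $allServices | Where-Object { $_.Name -eq 'msmq' }

# Likewise, one disk query instead of two
$disks  = Get-WmiObject -Class Win32_LogicalDisk -ComputerName $srv
$cDrive = $disks | Where-Object { $_.DeviceID -eq 'C:' }
$dDrive = $disks | Where-Object { $_.DeviceID -eq 'D:' }
```

That cuts six remote round trips per machine down to two.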

Using the full WQL query involves more typing than necessary, but it won't affect performance much.

Don't use aliases in production scripts. It's the biggest no-no of all.

Going out to 300 servers isn't going to be quick, but it appears you're working through the servers sequentially. You need to think about introducing some parallel processing into your code, either by accessing multiple computers at once through CIM/remoting sessions, workflows, or even PowerShell jobs.

Thank you, Sir.
Can you please show me how to use a CIM session and parallel processing?

Mr. Siddaway explained many of the things you were doing wrong. Here is a basic example of using a CIM session with a WQL query. Note that in this example I only care about the Name and State of each service, so there is no need to return all properties (e.g. Select * …), and we use OR clauses to get only the services we care about. Next we create a single session with the remote computer, and we would pass -CimSession to any other queries against that machine.

$srvQry = @"
    SELECT Name, State 
    FROM Win32_Service 
    WHERE Name='imaservice' 
    OR Name='mfcom' 
    OR Name='cpsvc' 
    OR Name='msmq'
"@

$session = New-CimSession -ComputerName server02
$services = Get-CimInstance -Query $srvQry -CimSession $session
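To continue the sketch, the same session can be reused for every remaining query against that machine and then disposed of. The OS and disk queries below are just the ones from the original post, redone as CIM calls over the existing session:

```powershell
# Reuse the one session for the remaining queries
$bootupMemory = Get-CimInstance -ClassName Win32_OperatingSystem -CimSession $session
$disks        = Get-CimInstance -Query "SELECT DeviceID, Size, FreeSpace FROM Win32_LogicalDisk WHERE DeviceID='C:' OR DeviceID='D:'" -CimSession $session

# Close the session when finished with this machine
Remove-CimSession -CimSession $session
```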

Invoke-Command works slower for my disk space script. Plain Get-WmiObject is faster; version 1 took 20 seconds, whereas version 2 took 2 minutes and 42 seconds.
Also, Invoke-Command does not work with the hostname; I have to provide the FULLY QUALIFIED DOMAIN NAME. Any idea why this is happening?

Can you please help?

Space report - version 1

$serversRaw = qfarm /load
$hostnames =  for($i=3;$i -lt $serversRaw.length; $i++) { ($serversRaw[$i] -split " ")[0] }

if (($result = Read-host "Enter either c: or D: [default is C:]") -eq '') { $drive = "C:"} else {$drive = $result }

$x = $hostnames | % { gwmi -cl win32_logicaldisk -computer $_ | ? { $_.deviceid -eq $drive } | select size, freespace, pscomputername }

$outfile = "$($env:userprofile)" + "\" + "DriveSpaceStats-$(get-date -f MMddyy-hhmmss).csv"
$output1 = $x | % { "$($_.pscomputername) $(($_.size/1.0GB).ToString("#.##")) $(($_.freespace/1.0GB).ToString("#.##"))" }

$output1 -replace "\s","," | out-file $outfile

Space Report - v2


$serversRaw = qfarm /load
$hostnames =  for($i=3;$i -lt $serversRaw.length; $i++) { ($serversRaw[$i] -split " ")[0] }

if (($result = Read-host "Enter either c: or D: [default is C:]") -eq '') { $drive = "C:"} else {$drive = $result }
$d = gwmi -cl win32_computersystem | select -expandproperty domain
$drive
$zz = @()
foreach ($s in $hostnames)
{
  $zz += $s + "." + $d
}

$x = icm -computer $zz { param($d) gwmi -cl win32_logicaldisk | ? { $_.deviceid -eq $d } | select size, freespace, pscomputername } -argumentlist $drive -throttlelimit 200

$outfile = "$($env:userprofile)" + "\" + "DriveSpaceStats-$(get-date -f MMddyy-hhmmss).csv"
$output1 = $x | % { "$($_.pscomputername) $(($_.size/1.0GB).ToString("#.##")) $(($_.freespace/1.0GB).ToString("#.##"))" }

$output1 -replace "\s","," | out-file $outfile


@Asrar Your second command is still executing synchronously. The time difference that you are seeing is the overhead required to set up the remote session. To see the real power of what @Richard-Siddaway is describing you need to retool your script, but to start with, let us take the example of your test above.

To run this command in parallel you would do something like this:

$hostnames = ".","Localhost"
$drive = "C:"
$outfile = "$($env:userprofile)" + "\" + "DriveSpaceStats-$(get-date -f MMddyy-hhmmss).csv"
$Action = {
gwmi -cl win32_logicaldisk | ? { $_.deviceid -eq $Using:drive } | select size, freespace, pscomputername
}
foreach ($s in $hostnames){
  Invoke-Command -ComputerName $s -ScriptBlock $Action -JobName $s -AsJob
}
Wait-Job -State Running
foreach ($s in $hostnames){
  $x = Receive-Job -Name "$s"
  $output1 = $x | % { "$($_.pscomputername) $(($_.size/1.0GB).ToString("#.##")) $(($_.freespace/1.0GB).ToString("#.##"))" }
  $output1 -replace "\s","," | out-file $outfile -Append
  Remove-Job -Name "$s"
}

This code executes against each system as a background job and then after all of the jobs are complete it collects the results and adds them to your output file. From here one path you can take is to expand the script block to capture all of the information you want from each system.
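As a sketch of that, the script block could return one custom object per system with whatever properties you need (the property names here are only examples, not a fixed schema):

```powershell
$Action = {
    $os   = Get-WmiObject -Class Win32_OperatingSystem
    $disk = Get-WmiObject -Class Win32_LogicalDisk -Filter "DeviceID='$Using:drive'"
    [pscustomobject]@{
        Computer = $env:COMPUTERNAME
        LastBoot = $os.ConvertToDateTime($os.LastBootUpTime)
        SizeGB   = [math]::Round($disk.Size / 1GB, 2)
        FreeGB   = [math]::Round($disk.FreeSpace / 1GB, 2)
    }
}
```

The objects that come back can then be piped straight to Export-Csv instead of building comma-separated strings by hand.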

As far as converting to CIM sessions, start by reading this series of posts: Comparing PowerShell PSSessions and CIM Sessions - Scripting Blog.