Reporting On Remote Servers

Hoping someone here has solved a similar problem and can point me in the right direction. Basically, I have an array of “Server” objects containing customer and server info, and I need to run a remote command on each (limited to, say, 20 concurrent connections) and attach the output to its respective Server object. That would let me generate, say, a spreadsheet containing the customer name, how many licences they have, and the script output showing when they last logged in.

While I can pass these Servers to Invoke-Command, the returned objects only have the ComputerName added (since each Server gets converted to a String for the cmdlet). I can try to link the output back to the Server objects, but I found this messy and somewhat unreliable - for example, when multiple objects are returned per server.
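To illustrate the matching problem described above (the object shapes here are assumptions, not the poster's actual data): the fan-out call only stamps each result with the automatic PSComputerName property, so that string is the only key available to join results back to richer Server objects.

```powershell
# Hypothetical Server objects - the cmdlet only consumes the name string
$servers = @(
    [PSCustomObject]@{ Name = 'SRV01'; Customer = 'Contoso';  Licences = 25 }
    [PSCustomObject]@{ Name = 'SRV02'; Customer = 'Fabrikam'; Licences = 10 }
)

# One fan-out call; the rich local object never travels to the remote side
$results = Invoke-Command -ComputerName $servers.Name -ScriptBlock { Get-Date }

# Each result carries only PSComputerName as a link back to its Server object
$results | Select-Object PSComputerName, DateTime
```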

I now realise I need to write a function to do the above, but I'm unsure of the best way to go about it. I tried a Workflow, but an error is thrown saying it probably won't work very well; when bypassed it does do the job, but unusably slowly.

Workflow / runspaces / parallel processing is exactly what you need from the performance perspective. Yet, as for this…

Workflow but an error is thrown saying it probably won't work very well

… I’ve never encountered such a thing.

All that being said, you really need to show your code and expected output for us to chime in with anything of real value.

This is the error I get if I don’t put Invoke-Command in brackets:

At line:15 char:21
+ ...   $activity = Invoke-Command $server -Credential $cred -ScriptBlock {
+                   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Cannot call the 'Invoke-Command' command. Other commands from this module have 
been packaged as workflow activities, but this command was specifically 
excluded. This is likely because the command requires an interactive Windows 
PowerShell session, or has behavior not suited for workflows. To run this 
command anyway, place it within an inline-script (InlineScript { 
Invoke-Command }) where it will be invoked in isolation.
    + CategoryInfo          : ParserError: (:) [], ParseException
    + FullyQualifiedErrorId : CommandActivityExcluded

Full workflow basically amounts to as follows:

Workflow Get-SystemActivity {    # Workflow to handle queries in parallel
    param (
        [Parameter(Mandatory = $true)] [PSObject[]]$Servers,
        [PSCredential]$Cred
    )
    Foreach -Parallel -ThrottleLimit 32 ($server in $Servers) {
        $activity = (Invoke-Command $server -Credential $Cred -ScriptBlock {
            # remote query elided
        })
        $server | Add-Member -MemberType NoteProperty -Name "Activity" -Value $activity -PassThru
    }
}

What seemed to happen was that it would slowly start 32 connections - maybe one every 20-30 seconds on average - and wait for them all to finish (about 30 minutes total) before starting the next 32. Additionally, I would have thought it would return objects as each loop iteration completed, but when piping to a progress indicator it received nothing for a long time, then all 32 came down at roughly one-third-second intervals. Not sure if that's just how parallel Foreach works? Seems kinda counter-intuitive.

Using the .NET threading classes is always going to be the most efficient approach, and the one that gives you the most control, I guess.
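For what it's worth, here is a minimal runspace-pool sketch of that idea - all names ($servers, $cred, the property layout) are illustrative assumptions, and the remote script block is left empty as in the original post:

```powershell
# Pool with up to 20 concurrent runspaces, matching the 20-connection limit
$pool = [RunspaceFactory]::CreateRunspacePool(1, 20)
$pool.Open()

# Queue one PowerShell instance per server and keep the async handle
$handles = foreach ($server in $servers) {
    $ps = [PowerShell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({
        param($Name, $Cred)
        Invoke-Command -ComputerName $Name -Credential $Cred -ScriptBlock {
            # remote query elided in the original post
        }
    }).AddArgument($server.Name).AddArgument($cred)
    [PSCustomObject]@{ Server = $server; PowerShell = $ps; Handle = $ps.BeginInvoke() }
}

# Collect each result and attach it to its own Server object
foreach ($h in $handles) {
    $activity = $h.PowerShell.EndInvoke($h.Handle)
    $h.PowerShell.Dispose()
    $h.Server | Add-Member -NotePropertyName Activity -NotePropertyValue $activity -PassThru
}
$pool.Close()
```

Because each handle is paired with its originating Server object before the work starts, there is no name-matching step at all - which sidesteps the unreliable join problem entirely.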

I would do something like this:

  1. Store server names in an array
  2. Pass the entire array of names to Invoke-Command and store all output
  3. Combine the array of server objects with the array of output from Invoke-Command
  4. Do a Group-Object -Property {if ($_.Name) {$_.Name} else {$_.PSComputerName}}
  5. Loop over each grouped object from the above, look at each grouped portion, and decide how you want to put things together
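The steps above could be sketched roughly like this (the column names and the Where-Object test are assumptions based on the original post, not a definitive implementation):

```powershell
# 1. Array of server objects; the names feed Invoke-Command
$servers = Import-Csv .\servers.csv    # assumed columns: Name, Customer, Licences

# 2. One fan-out call, storing all output
$output = Invoke-Command -ComputerName $servers.Name -Credential $cred -ScriptBlock {
    # remote query elided in the original post
}

# 3 & 4. Combine both arrays and group: local objects have .Name,
#        remote results carry only the automatic .PSComputerName
$grouped = @($servers) + @($output) |
    Group-Object -Property { if ($_.Name) { $_.Name } else { $_.PSComputerName } }

# 5. Stitch each group back into a single report object per server
$report = foreach ($g in $grouped) {
    $local  = $g.Group | Where-Object { $_.Name }
    $remote = $g.Group | Where-Object { -not $_.Name }
    $local | Add-Member -NotePropertyName Activity -NotePropertyValue $remote -PassThru
}
```

Note this handles the multiple-objects-per-server case the poster mentioned: all results from one machine land in the same group, so Activity simply ends up holding an array.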

If you need parallel processing, Invoke-Command also supports -AsJob.
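A hedged sketch of the -AsJob variant ($servers and $cred are the same assumed inputs as above); -ThrottleLimit caps the concurrency at the 20 connections the original post asked for:

```powershell
# Fan out in the background, at most 20 connections at a time
$job = Invoke-Command -ComputerName $servers.Name -Credential $cred `
    -ThrottleLimit 20 -AsJob -ScriptBlock {
    # remote query elided in the original post
}

# Block until everything is back, then collect the PSComputerName-tagged results
$output = Receive-Job -Job $job -Wait -AutoRemoveJob
```

The results still need the Group-Object matching step, but the connections themselves run concurrently instead of one at a time.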

Doing workflows is a huge pain, even when you really need to (which is practically never); just avoid them, honestly.