Recommendations for retrieving remote background jobs for hundreds of computers

The environment has hundreds of servers spread throughout North America. My first script to gather information took 1 hour and 40 minutes to run because I made it work on one server at a time.

My second iteration uses New-PSSession to fan out the workload so that each server runs a background job. Then, after all the jobs have completed, it uses Receive-Job to collect the results.

The problem I’m finding is that some servers’ results are never returned. In a few cases the session appears to break after the remote server starts running its tasks; I can live with that. In other cases, it looks like the script just “forgot” to ask for results: no errors, the results simply weren’t retrieved even though the session was still open.

Is there a best practice for having hundreds of servers run background jobs and for retrieving the results? How do you go about it so you can be sure you got everything that is available?

Well … I don’t have experience with hundreds of servers, but your question, as asked, is rather vague. Without seeing your actual code and without knowing what you’re actually trying to achieve, it would be hard to recommend anything meaningful.

So would you like to share your actual code along with a description of what you’re trying to do?

For hundreds of servers, this does not seem out of line, depending of course on how much data you are gathering, which you don’t mention. Since it works, why not run it from a scheduled task in the middle of the night? Then who would care how long it took?

I trust you are also using PowerShell 7? I have found the performance of PS7 on remote systems to be very impressive.

My $.02 🙂

Well, I’m making progress…

What I’ve found is that the Start-Job scriptblock I’m passing to each server “appears” to run. That is, a job is created, runs, and completes.

However, trying to receive the job results in an empty variable on some servers. It’s not the code in the scriptblock: I have literally copied and pasted the entire Start-Job -ScriptBlock { all the code } -Name ‘Name’ command into a remote session, watched it run without error, checked the status of the job, and then tried to receive it. Nothing is returned, despite the job status indicating that there is more data. Yet when I copy and paste just the code that is inside the scriptblock, it executes as intended.

Could this be a resource issue on some servers?

How could we possibly know that? 🤷🏼‍♂️ Especially when you don’t share any code, and we do not know your environment.

You may be overcomplicating this. If you use Invoke-Command and provide an array of server names for the parameter -ComputerName, or an array of PSSessions for the parameter -Session, PowerShell will kick off the remote sessions more or less all at once. So there shouldn’t be any need to use jobs in such a case. 🤷🏼‍♂️
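To illustrate, here is a minimal sketch of what I mean (the function name, the throttle value and the gathered properties are just examples, not your code):

```powershell
# Minimal sketch: one Invoke-Command call fans the scriptblock out to all
# computers at once; PowerShell parallelizes it by itself (the default
# ThrottleLimit is 32 concurrent connections).
function Get-OsInventory {
    param(
        [Parameter(Mandatory)][string[]]$ComputerName,
        [int]$ThrottleLimit = 32
    )
    Invoke-Command -ComputerName $ComputerName -ThrottleLimit $ThrottleLimit -ScriptBlock {
        # whatever you actually want to gather - basic OS info as an example
        Get-CimInstance -ClassName Win32_OperatingSystem |
            Select-Object -Property CSName, Caption, LastBootUpTime
    }
}
```

Each object that comes back carries the automatic PSComputerName property, so you can always tell which server produced which row, e.g. `Get-OsInventory -ComputerName (Get-Content .\servers.txt) | Export-Csv .\inventory.csv -NoTypeInformation`.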

May I ask what information you want to gather from these hundreds of servers?

I have to agree with Olaf. I have NEVER had reliable results using jobs for large-scale tasks in the past. I ditched them long ago.

So how do you go about doing a large scale job? Conceptually speaking…

You still did not tell us what you’re actually trying to do. If it’s about getting an inventory of the servers, we use the software deployment solution we have in place.

But in general, for everyday tasks like getting something done on a number of remote computers, I use Invoke-Command, just like I recommended above. 🤷🏼‍♂️

I’d say it depends pretty much on the task you want to get done.

Update 2023/05/18

I’ve found that by moving the code from the Start-Job portion of the script into a separate script, and having the main script run it with Invoke-Command -FilePath script.ps1, all servers return all of the results I’m looking for.
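In sketch form, the working pattern is roughly this (the function name is made up for illustration; it is not my production code):

```powershell
# Sketch of the fix: the code that used to live in the Start-Job scriptblock
# now sits in its own file (script.ps1). Invoke-Command -FilePath reads that
# local file and runs its contents on every server, so nothing has to be
# copied to the targets and there are no remote jobs left to receive.
function Invoke-GatherScript {
    param(
        [Parameter(Mandatory)][string[]]$ComputerName,
        [Parameter(Mandatory)][string]$Path   # e.g. .\script.ps1
    )
    Invoke-Command -ComputerName $ComputerName -FilePath $Path
}
```

Called as, for example, `$results = Invoke-GatherScript -ComputerName $servers -Path .\script.ps1`.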

I asked two questions in the opening of this thread.

  1. Is there a best practice for running background jobs on remote servers and getting the results?
  2. How do you ensure all results are returned/gathered?

Later I asked about resource issues.

It seems that people were more interested in examining my code than answering any of the questions I raised.

So, I’ll answer them here as best I can.

Best Practice
There does not seem to be an established best practice for retrieving information from remote servers. So if you’re getting inconsistent results, try putting the code you want to run on the remote servers into a separate script and calling that from your main script. To verify that the secondary script works, run it manually on a remote server in a desktop session.

Resource Limitations
There are limitations (not that my code reached any), which are documented in Microsoft’s WinRM Installation and Configuration documentation.

Thanks for sharing your results 😀

  1. Is there a best practice for running background jobs on remote servers and getting the results?
    Remote jobs are “background” jobs regardless. This was answered more than once: use Invoke-Command with either computer names or sessions. (This can be the WRONG answer depending on the script itself; see below. If it’s a long-running script, it may be best to have the machines run it via Task Scheduler.)

  2. How do you ensure all results are returned/gathered?
    This is quite loaded, because what is being run and collected can change the answer drastically. But generally:
    A) Make sure the computers are online and stay online.
    B) Check your output against your list of computers; any that are missing, you’ll need to retry and/or investigate.
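For B, a small sketch of the reconciliation I mean (the function name is made up): diff the automatic PSComputerName values you got back against the full target list, then rerun against whatever is missing.

```powershell
# Every object Invoke-Command returns carries an automatic PSComputerName
# property, so the servers that answered can be diffed against the targets.
function Get-MissingComputer {
    param(
        [Parameter(Mandatory)][string[]]$Targets,   # every server you asked
        [string[]]$Returned = @()                   # PSComputerName values that came back
    )
    # -notcontains compares case-insensitively, matching computer-name semantics
    $Targets | Where-Object { $Returned -notcontains $_ }
}
```

For example, `$missing = Get-MissingComputer -Targets $servers -Returned $results.PSComputerName`; anything left in `$missing` gets retried or investigated.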

I encourage you not to assume the intention of the questions you are asked when seeking help. I was waiting for you to answer Olaf’s basic questions, as I had plenty of advice to offer, but you gave insufficient information for me to be sure I was giving the best advice I could.

Since you seem to be satisfied with your own findings, I’ll close this thread now. 👍🏼🤟🏼