Background Job Global Session

Hi guys,
I’m new to PowerShell background jobs. I’ve read the help documentation about jobs, and I think I need to use them to get around a conundrum I’m currently facing.

I have a PS script that connects to SQL server and does some inserts and updates to a SQL Azure table.
The PS script is invoked via SQL Server Agent, every 3 minutes.

Sometimes the insert/update script takes more than 3 minutes to complete, in which case SQL Server Agent invokes the PS script again (because it runs every 3 minutes) and I get a deadlock error:

Exception calling "ExecuteNonQuery" with "0" argument(s): "Transaction (Process ID 127) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction." - Line Number: 528

So I have been dabbling with the script below. The idea is to invoke my script as a job, then use Get-Job to see if the job is still running; if it is, do nothing, come back after 3 minutes, and check on the job again.

I have noticed that the job runs under the current session, which means that if I run Get-Job in a different PowerShell session, the running job won’t show up there.
So my questions are:

  1. Am I approaching this issue the right way?
  2. Is there a way of running the job “globally” on the server, so that when I invoke Get-Job I can see whether the job is still running?

Here is a small script that I have put together for this question:

$Jobs = Get-Job
If ($Jobs.Count -eq 0) {
    Start-Job -FilePath "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Get-Processes.ps1" -Name "Job-Get-Process"
}
Else {
    ForEach ($Job in $Jobs) {
        Switch ($Job.State) {
            "Completed" {
                # Log the job, remove it, then start it again
                $Job | Export-Csv -Path "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Jobs.csv" -NoTypeInformation -Append
                $Job | Remove-Job
                Start-Job -FilePath "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Get-Processes.ps1" -Name "Job-Get-Process"
            }
            "Failed" {
                # Log the job, remove it, then re-run it
                $Job | Export-Csv -Path "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Jobs.csv" -NoTypeInformation -Append
                $Job | Remove-Job
                Start-Job -FilePath "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Get-Processes.ps1" -Name "Job-Get-Process"
            }
            "Running" {
                # Do nothing; just log it and come back after 3 minutes to see if the job has completed or failed
                $Job | Export-Csv -Path "C:\Users\MyUserName\OneDrive - MyCompany\MyCompany\PowerShell\Test\Jobs.csv" -NoTypeInformation -Append
            }
        }
    }
}

There’s no global “job runner” in PowerShell; jobs always belong to the process (session) that started them. What you’re after is more of an orchestrator, not just a task scheduler. You could, I suppose, do something similar with Task Scheduler, since you can query tasks to see their current status. I suspect you could do the same in SQL Agent by querying the job status from MSDB.
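For example, here’s a rough sketch of checking Agent job status from MSDB before doing any work. This assumes the SqlServer module’s Invoke-Sqlcmd, and "MyServer" and 'MyPsJob' are placeholder names; the msdb query is simplified and doesn’t distinguish between Agent sessions:

$query = @"
SELECT COUNT(*) AS Running
FROM msdb.dbo.sysjobactivity a
JOIN msdb.dbo.sysjobs j ON j.job_id = a.job_id
WHERE j.name = 'MyPsJob'
  AND a.start_execution_date IS NOT NULL
  AND a.stop_execution_date IS NULL
"@
$result = Invoke-Sqlcmd -ServerInstance "MyServer" -Database "msdb" -Query $query
If ($result.Running -gt 0) {
    # A previous run is still going; bail out and let the next 3-minute tick retry
    Return
}

If the count comes back greater than zero, a prior invocation is still executing and the script simply exits.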

Normally I try to detach these kinds of things from a time period entirely. Time-based batch jobs always run the risk of running on top of one another, and then you have to do what you’re doing - code a bunch of avoidance crap. Instead, I’ll try to build triggers, and operate only against the thing that triggered me. For example,

  1. Every 3 minutes, check for new files in C:\Files
  2. Process the files in C:\Files

Instead:

  1. Register for a notification when a new file is created in C:\Files
  2. Process only that file

In the second case, because I’m only operating against one thing, I’m less likely to have problems if my code is running multiple times in parallel. I’ve no idea what you’re actually doing in your code, but getting off of time-based batch processing is sometimes the best way to avoid having to code anti-collision stuff.
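The trigger-based approach above can be sketched with a FileSystemWatcher event subscription. This is a minimal sketch, not production code; the watched folder and the Process-File.ps1 handler script are placeholders:

$watcher = New-Object System.IO.FileSystemWatcher "C:\Files"
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # $Event.SourceEventArgs.FullPath is the one new file; process only it
    $newFile = $Event.SourceEventArgs.FullPath
    & "C:\Scripts\Process-File.ps1" -Path $newFile
}

Because each event handler invocation sees exactly one file, two handlers running at the same time don’t step on each other.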

Hi Don,
Thank you for your prompt response.

For some weird reason my post is not showing under my user name; I had to search for my post to find your response…

Anyway, the SQL Server Agent status is always Success, because the Agent only invokes the PowerShell script and doesn’t know whether the script itself failed or succeeded.
So I can’t rely on the SS Agent status.

Every 3 minutes, the PS script queries a table looking for new records with a Boolean value of False in a specific field.
If False-value records are found (sometimes it’s one record, sometimes 100), the script takes those records, which are batch numbers, queries another table with those batches, and then sends the data to SQL Azure.
That’s why it sometimes finishes in under 3 minutes and sometimes takes more than 3 minutes.
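In rough outline, the process described above looks something like this. All of the server, database, table, and column names here are invented for illustration, and Invoke-Sqlcmd from the SqlServer module is assumed:

# Illustration only: names below are placeholders, not the real schema
$pending = Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDb" `
    -Query "SELECT BatchNumber FROM dbo.Batches WHERE IsSent = 0"
ForEach ($row in $pending) {
    $data = Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDb" `
        -Query "SELECT * FROM dbo.BatchDetail WHERE BatchNumber = $($row.BatchNumber)"
    # ... push $data to the SQL Azure table, then flag the batch as sent ...
}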

I mean, I could create a file on the server called “started” right when the script is invoked,
then rename it to “completed” when the script finishes, and use that as a condition in my script.
But I’m looking for a more efficient and reliable process,
because I have other PS scripts that will be sending data to other tables in SQL Azure.
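One cross-session alternative to a started/completed file would be a named system mutex, which every process on the machine can see. This is only a sketch; the "Global\SqlAzureSync" name is arbitrary, and each of your scripts would use its own name:

$mutex = New-Object System.Threading.Mutex($false, "Global\SqlAzureSync")
If (-not $mutex.WaitOne(0)) {
    # A previous run still holds the lock; exit and let the next 3-minute tick retry
    Exit
}
Try {
    # ... do the inserts and updates ...
}
Finally {
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}

WaitOne(0) returns immediately, so a second invocation started by the Agent just exits instead of deadlocking against the first.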

Thank you