Capturing all Get-Job IDs from running background jobs for SharePoint backup

Hi folks. Our team has developed a script that successfully starts and completes the backup of all our SharePoint on-prem site collections. The backup via PowerShell works. But we are trying to capture how long each backup job takes, plus the site URL and site size, and export that to a CSV file. When I kick off my script (pasted below), I can execute $running and see PSBeginTime and PSEndTime by running $running[0] | select *. The problem is that I can't see which site is being backed up: the "Command" property on $running[0] shows the script block we ran, but the $variables in it are not filled in with real values, only the variable names (so no help). I did find that I can figure out how long a job ran by executing the line below.

((Get-Job Job8).PSEndTime.TimeOfDay - (Get-Job Job8).PSBeginTime.TimeOfDay).TotalMinutes
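For reference, subtracting the two timestamps directly gives the same answer and avoids the .TimeOfDay quirk if a job ever runs past midnight:

[pre]
# DateTime subtraction yields a TimeSpan directly
((Get-Job Job8).PSEndTime - (Get-Job Job8).PSBeginTime).TotalMinutes
[/pre]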

What I'm having difficulty with is grabbing all the job IDs that were created and are in either the Running or Completed state. I wanted to add them to the empty $backupJobs array. How can I pull the site URL from the running/completed jobs? Thank you for any assistance.


[pre] ## Backup all site collections

$jobThreshold = 7
$logfile = "d:\PSBackups\test\RunLog.txt" ;
$(Get-Date) > $logfile ;
$sites = Get-SPSite -Limit All

Remove-Item d:\PSBackups\test\Data*.* ;

foreach ($site in $sites) {

    # Get all running jobs
    Write-Host $site.ID
    $running = @(Get-Job | where { $_.JobStateInfo.State -eq "Running" })

    # Loop as long as our running job count is >= threshold
    while ($running.Count -ge $jobThreshold) {

        # Block until we get at least one job complete
        $running | Wait-Job -Any | Out-Null

        # Refresh the running job list
        $running = @(Get-Job | where { $_.JobStateInfo.State -eq "Running" })
    }

    ##$sitesize = $site.Usage.Storage;
    ##$TotalSize = $TotalSize + $sitesize

    Write-Host "BEGIN:" $(Get-Date) $site.Url ;
    "BEGIN: $(Get-Date) $($site.Url)" >> $logfile ;

    Start-Job -InputObject $site.Url {
        $url = $input | %{ $_ }
        Add-PSSnapin Microsoft.SharePoint.PowerShell
        # Build a flat file name from the site URL
        $filename = $url ;
        $filename = $filename.replace("https://" , "");
        $filename = $filename.replace("sites/" , "");
        $filename = $filename.replace("/" , "_");
        $filename = $filename.replace("." , "_");
        $filename = $filename + ".dat" ;
        Write-Host $filename ;
        Backup-SPSite -Identity $url -Path d:\PSBackups\test\Data$filename -Force -NoSiteLock
    }

    $backupJobs = @()

    Write-Host "END:" $(Get-Date) $site.Url ;
    "END : $(Get-Date) $($site.Url)" >> $logfile ;

    # Receive output from completed jobs
    Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Receive-Job

    # Remove completed jobs
    Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Remove-Job
}

$(Get-Date) >> $logfile ; [/pre]

How about using the following? Then the job's Name will contain the site URL.

Start-Job -InputObject $site.Url -Name $site.URL { ... }
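The URL then comes back with every Get-Job call, for example:

[pre]
# The site URL now rides along as the job's Name
Get-Job | Select-Object Id, Name, State, PSBeginTime, PSEndTime
[/pre]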


I would probably track the start and end time with each job as properties. Then your end object will include the start/stop times. You could also provide an ID for each job when starting. Can you show where you start the job?
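Something along these lines (just a sketch; the property names are examples, and the script block is elided the same way as above):

[pre]
# Sketch: keep each job plus its metadata together in one object
$backupJobs = @()
foreach ($site in $sites) {
    $job = Start-Job -InputObject $site.Url -Name $site.Url { ... }   # same script block as in your script
    $backupJobs += New-Object PSObject -Property @{
        'Site'      = $site.Url
        'JobId'     = $job.Id
        'StartTime' = Get-Date
        'Job'       = $job
    }
}
[/pre]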

[quote quote=252680]How about using the following? Then the job's Name will contain the site URL.

Start-Job -InputObject $site.Url -Name $site.URL { ... }
[/quote]

Thank you AdminofThings45! That did the trick to get the site.URL value to come up. Would you know how to pull all job IDs, including those in the Running and/or Completed state?

[quote quote=252683]I would probably track the start and end time with each job as properties. Then your end object will include the start/stop times. You could also provide an ID for each job when starting. Can you show where you start the job?

[/quote]

Hi Doug. The part where we start parsing through the URL link is "Start-Job -InputObject $site.Url -Name $site.Url {...}"

The backup starts at "Backup-SPSite -Identity $url -Path d:\PSBackups\test\Data$filename -Force -NoSiteLock"

Thank you for replying.

Get-Job | Select -Expand Id will get all of the job IDs
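And if you want both running and completed jobs captured into your array, something along these lines should work (a sketch):

[pre]
# Collect jobs in either state into the array
$backupJobs = @(Get-Job | Where-Object { $_.State -eq 'Running' -or $_.State -eq 'Completed' })
$backupJobs | Select-Object Id, Name, State
[/pre]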

[quote quote=252704] Get-Job | Select -Expand Id will get all of the job IDs

[/quote]
So the backup script created 3 .dat files from 3 site collections, but when I run Get-Job | select -expand Id, only 2 job IDs are returned. I also ran the following and got only 2 jobs listed. Where is the missing 3rd job?

(Get-Job | where { $_.JobStateInfo.State -ne "" })

Thank you!

Changing $jobThreshold to 3 and running it again returns 3 job IDs now.
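That lines up with the loop itself: the Remove-Job at the bottom of the loop deletes completed jobs on every pass, so Get-Job only ever sees jobs that haven't been cleaned up yet. One way to keep the details is to record each completed job before removing it, e.g. (a sketch, using the same filter style as the script):

[pre]
# Record completed jobs before they disappear
$completed = Get-Job | where { $_.JobStateInfo.State -eq "Completed" }
$backupJobs += $completed | Select-Object Id, Name, PSBeginTime, PSEndTime
$completed | Receive-Job
$completed | Remove-Job
[/pre]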

Hi folks, I went ahead and ran a backup against 50 site collections ($sites = Get-SPSite -Limit 50) and set $jobThreshold to 7.

The backup completes successfully, but when I run the code block below, only 7 job IDs end up in the $Hashtable array. Why is the code not capturing all the job IDs? Running Get-Job | select -ExpandProperty Id also lists only 7 job IDs, nothing more. I'm not sure why all 50 job IDs aren't shown ($alljobs.Count is 7). Thank you for any assistance.

[pre]

$Hashtable = @()
$date = Get-Date -Format MMddyyyyhhmmss
$alljobs = Get-Job | Select *

foreach ($bkupjob in $alljobs) {
    $spsite = Get-SPSite $bkupjob.Name
    $spsite.Usage.Storage

    $Hashtable += New-Object PSObject -Property @{
        'OurSite'         = $bkupjob.Name
        'BackupStartTime' = $bkupjob.PSBeginTime
        'BackupEndTime'   = $bkupjob.PSEndTime
        'TotalMinutes'    = ((Get-Job $bkupjob.Id).PSEndTime.TimeOfDay - (Get-Job $bkupjob.Id).PSBeginTime.TimeOfDay).TotalMinutes
        'Size'            = "{0:N2} MB" -f ($spsite.Usage.Storage/1000000)
    }
}

$Hashtable | select OurSite,BackupStartTime,BackupEndTime,TotalMinutes,Size | Export-Csv "D:\psbackups\$($date).csv" -NoTypeInformation

[/pre]
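Side note on the loop above: $alljobs already carries PSBeginTime and PSEndTime, so the extra Get-Job calls inside the loop aren't needed; a direct subtraction does the same thing:

[pre]
'TotalMinutes' = ($bkupjob.PSEndTime - $bkupjob.PSBeginTime).TotalMinutes
[/pre]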

I should have mentioned in the post above that in PROD we want to cap the job threshold at a maximum of 7 and not go above that. We have about 250 site collections to back up and cannot set the threshold to 250; 250 concurrent jobs hitting our single SQL server would cause an outage. Sorry for leaving this out.

I figured out the issue. I had to add the -Keep parameter after Receive-Job. I also added a line right after the "Completed jobs" comment to gather all jobs with JobStateInfo.State not equal to "" into the $backupJobs array, and added a do/while loop so the script waits until the count of jobs in the Running state reaches zero. Then I removed the $alljobs array and sorted $backupJobs by Id with -Unique. All the jobs had data in the HasMoreData property, so I was able to get all the jobs. I also commented out Remove-Job temporarily. Thanks!
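For anyone who finds this thread later, here is a sketch assembled from that description (adapt the names to your own script):

[pre]
# Inside the foreach loop, right after the "Completed jobs" comment:
$backupJobs += Get-Job | where { $_.JobStateInfo.State -ne "" }
Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Receive-Job -Keep
# Remove-Job commented out for now so job details stay available
##Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Remove-Job

# After the foreach loop: wait until nothing is still running
do {
    Start-Sleep -Seconds 10
    $running = @(Get-Job | where { $_.JobStateInfo.State -eq "Running" })
} while ($running.Count -gt 0)

# Jobs were collected once per pass, so de-duplicate by Id
$backupJobs = $backupJobs | Sort-Object Id -Unique
[/pre]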