Memory consumption using background jobs

I have a script that gets all the Chromebooks in our school and syncs them to an on-prem SQL table, and I’m trying to make the script process the devices in parallel using background jobs. The script seems to work well: it takes the 100 devices that the Google API returns per page and runs a SQL query for each one to add it to SQL.

Since each page returns 100 devices, each background job runs 100 queries and then finishes. The problem I’m having is that even though I clean up all the variables at the end of the job and receive each completed job, every job is still using 15-20 MB. When I get to 10 jobs, memory usage is almost at 1 GB, and even though I’m throttling to 10 jobs at a time, a finished job doesn’t seem to release its RAM, so after 30 minutes I’m at 100% memory usage.
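The throttling pattern I’m using looks roughly like this (simplified; $pages, $page, and $processPage are placeholders for what’s in the gist below):

[code]
# Keep at most 10 background jobs running, draining finished ones as we go
foreach ($page in $pages) {
    # block until fewer than 10 jobs are running
    while (@(Get-Job -State Running).Count -ge 10) {
        Start-Sleep -Milliseconds 500
    }
    Start-Job -ScriptBlock $processPage -ArgumentList (,$page)

    # receive and remove anything that has already finished
    Get-Job -State Completed | ForEach-Object {
        Receive-Job -Job $_ | Out-Null
        Remove-Job -Job $_
    }
}

# wait for the stragglers, then clean up
Get-Job | Wait-Job | Receive-Job | Out-Null
Get-Job | Remove-Job
[/code]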

Source Code Link:

https://gist.github.com/injector22/2d9808ac3105e60887d0953d35826c2e

Start-Job creates a powershell.exe process to execute the code. You can try the PoshRSJob module instead, which uses threading.
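Something like this (a rough sketch; the script block body is a placeholder for your per-page processing):

[code]
# Same throttled pattern, but with PoshRSJob's runspace-based jobs
# instead of separate powershell.exe processes
Install-Module PoshRSJob -Scope CurrentUser
Import-Module PoshRSJob

$pages | Start-RSJob -Throttle 10 -ScriptBlock {
    $page = $_    # each piped page is exposed as $_ inside the job
    # process the 100 devices in $page here
}

Get-RSJob | Wait-RSJob | Receive-RSJob
Get-RSJob | Remove-RSJob
[/code]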

[quote quote=134510]Start-Job creates a powershell.exe process to execute the code. You can try the PoshRSJob module instead, which uses threading.

https://www.powershellgallery.com/packages/PoshRSJob/1.7.4.4

[/quote]

I just gave it a try, and although it helped somewhat, it’s still using way too much RAM to be usable. Back to the drawing board for me, I guess.

Maybe try moving the SQL connection code outside the foreach loop. That way you create the connection once instead of once for every single device.
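Roughly like this (a sketch; the connection string, table, and column names are placeholders for whatever your script actually uses):

[code]
# Open one connection and reuse it for every device in the page
$connString = 'Server=sqlhost;Database=Inventory;Integrated Security=True'
$conn = New-Object System.Data.SqlClient.SqlConnection($connString)
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'INSERT INTO dbo.Chromebooks (SerialNumber, Status) VALUES (@Serial, @Status)'
    [void]$cmd.Parameters.Add('@Serial', [System.Data.SqlDbType]::NVarChar, 100)
    [void]$cmd.Parameters.Add('@Status', [System.Data.SqlDbType]::NVarChar, 50)

    foreach ($device in $devices) {
        $cmd.Parameters['@Serial'].Value = $device.serialNumber
        $cmd.Parameters['@Status'].Value = $device.status
        [void]$cmd.ExecuteNonQuery()    # reuses the single open connection
    }
}
finally {
    $conn.Close()
}
[/code]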

Please check for bottlenecks using the DMVs sys.dm_exec_query_stats and sys.dm_exec_sql_text.
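For example (assuming the SqlServer module’s Invoke-Sqlcmd and VIEW SERVER STATE permission on the instance; ‘sqlhost’ is a placeholder):

[code]
# Top queries by logical reads, joining the two DMVs above
Invoke-Sqlcmd -ServerInstance 'sqlhost' -Query @"
SELECT TOP (10)
       qs.execution_count,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       qs.total_worker_time   / qs.execution_count AS avg_cpu_us,
       st.text                AS query_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
"@
[/code]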

And use a bulk insert instead of inserting one row at a time; that will drastically reduce the I/O and the run time as well.
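With plain ADO.NET that would be SqlBulkCopy (a sketch; it assumes the page of devices has already been loaded into a DataTable whose columns match the destination table, and the connection string and table name are placeholders):

[code]
# Bulk-insert one page of devices in a single round trip
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy('Server=sqlhost;Database=Inventory;Integrated Security=True')
$bulk.DestinationTableName = 'dbo.Chromebooks'
$bulk.BatchSize = 100               # one Google API page per batch
$bulk.WriteToServer($deviceTable)   # $deviceTable is a System.Data.DataTable
$bulk.Close()
[/code]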

After some help from the /r/PowerShell community, who pointed me to the dbatools module, I used it to do a bulk insert instead of inserting one record at a time, which solved the problem. See the updated code below:

https://gist.github.com/injector22/150de287a8c54ce6a56e1e8827887b3e
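The core of the change is essentially this (simplified; Write-DbaDbTableData is the current dbatools cmdlet name, and the instance, database, table, and field names here are placeholders):

[code]
# Collect the whole page, then push it to SQL in one bulk operation
Import-Module dbatools

$page |
    Select-Object serialNumber, status, lastSync |
    Write-DbaDbTableData -SqlInstance 'sqlhost' -Database 'Inventory' -Table 'dbo.Chromebooks'
[/code]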

That’s true, a bulk insert drastically reduces I/O operations and puts less strain on resources. For your scenario, this is the ideal approach.

Thank you.