I have a script that gets all the Chromebooks in our school and syncs them to an on-prem SQL table, and I'm trying to make it process the devices in parallel using background jobs. The script seems to work well: for each page of 100 devices that the Google API returns, it starts a job that runs a SQL query to add each device to SQL.
Since each page returns 100 devices, each background job runs 100 queries and then finishes. The problem I'm having is that even though I clean up all the variables at the end of each job and receive every completed job, each job still uses 15-20 MB. By the time I get to 10 jobs, memory usage is almost at 1 GB, and even though I'm throttling to 10 jobs at a time, finished jobs don't seem to release their RAM, so after 30 minutes I'm at 100% memory usage.
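For context, a minimal sketch of the pattern I'm describing is below. The connection string, table, and property names are illustrative placeholders, not my actual script:

```powershell
# Sketch of the per-page background-job approach (names are illustrative).
# Each job opens its own SQL connection and runs one INSERT per device.
$maxJobs = 10

foreach ($page in $devicePages) {   # each $page holds the ~100 devices from one API page
    # Throttle: wait until fewer than $maxJobs jobs are running
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Seconds 1
    }

    Start-Job -ArgumentList (, $page) -ScriptBlock {
        param($devices)
        $conn = New-Object System.Data.SqlClient.SqlConnection 'Server=SQL01;Database=Inventory;Integrated Security=True'
        $conn.Open()
        foreach ($d in $devices) {
            $cmd = $conn.CreateCommand()
            $cmd.CommandText = 'INSERT INTO dbo.Chromebooks (SerialNumber, Status) VALUES (@sn, @st)'
            [void]$cmd.Parameters.AddWithValue('@sn', $d.serialNumber)
            [void]$cmd.Parameters.AddWithValue('@st', $d.status)
            [void]$cmd.ExecuteNonQuery()
            $cmd.Dispose()
        }
        $conn.Close()
    } | Out-Null

    # Receive and remove finished jobs as we go
    Get-Job -State Completed | ForEach-Object {
        Receive-Job -Job $_ | Out-Null
        Remove-Job -Job $_
    }
}

# Drain whatever is still running at the end
Get-Job | Wait-Job | Receive-Job | Out-Null
Get-Job | Remove-Job
```

Worth noting: every `Start-Job` spawns a separate powershell.exe process, so 15-20 MB per job is roughly the baseline cost of a fresh PowerShell process before the script does any work.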
I just gave it a try, and although it helped somewhat, it's still using far too much RAM to be usable. Back to the drawing board for me, I guess.
After some help from the /r/PowerShell community, who pointed me to the dbatools module, I switched to a bulk insert instead of inserting one record at a time, which solved the problem. See the updated code below:
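(The updated script isn't reproduced in this excerpt. As a rough illustration only, a dbatools bulk insert looks something like the sketch below; the instance, database, table, and property names are hypothetical.)

```powershell
# Illustrative sketch of the dbatools bulk-insert approach, not the poster's
# actual script. Server/database/table names are made up for the example.
Import-Module dbatools

# Shape each device from the API page into a flat object first...
$rows = foreach ($d in $devices) {
    [pscustomobject]@{
        SerialNumber = $d.serialNumber
        Status       = $d.status
        LastSync     = $d.lastSync
    }
}

# ...then send the whole batch to SQL Server in one bulk copy instead of
# 100 individual INSERTs. Write-DbaDbTableData wraps SqlBulkCopy; older
# dbatools versions expose it as Write-DbaDataTable.
$rows | Write-DbaDbTableData -SqlInstance 'SQL01' -Database 'Inventory' -Table 'dbo.Chromebooks'
```

Because the bulk copy sends each batch in a single operation from the main script, there's no need for per-page background jobs at all, which also removes the per-job process overhead.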