Improving performance of variables

I have a variable containing the PSCustomObject output from Get-ADGroup. I’m converting this into a DataTable by piping it to Out-DataTable, and this is quite slow.

However, if I run the same command again, it’s very quick. I’m guessing this is related to memory being allocated for the variable as each row is processed, so I’m wondering: is it possible to allocate a block of memory when initialising a variable?

If this isn’t the reason, why would subsequent executions of the same command run quicker than the first? Thanks.
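For reference, PowerShell itself has no syntax for reserving memory when a variable is created, but the closest equivalent is pre-sizing a .NET collection. A generic List can be constructed with an initial capacity, which reserves its backing storage up front. This sketch uses an illustrative capacity of 6000 to match the record count mentioned later in the thread:

```powershell
# Create a List with a pre-allocated capacity of 6000 elements.
# Adds within that capacity do not trigger a re-allocation.
$list = [System.Collections.Generic.List[object]]::new(6000)

foreach ($i in 1..6000) {
    $list.Add($i)
}

# By contrast, growing a plain array with += copies the entire
# array on every addition, which is O(n^2) over the whole loop:
$arr = @()
foreach ($i in 1..100) { $arr += $i }   # slow pattern to avoid
```

Whether this is actually the bottleneck here is a separate question, but it answers the pre-allocation part directly.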

Hmmm … since Out-DataTable is not a default cmdlet of Windows PowerShell … how do you call it? What do you do with the DataTable later on? What exactly do you mean by “quite slow”? Do you use it interactively? Does it actually have to be faster? :thinking:

… maybe caching? :man_shrugging:t4:

The output of Get-ADGroup has been saved into $rsGroups. It is then passed to Select-Object and then to Out-DataTable:

[System.Data.DataTable]$dtGroups = $rsGroups | Select-Object objectGUID,distinguishedName,SID,primaryGroupToken,displayName,name | Out-DataTable
[System.Data.DataTable]$dtMembers = [System.Data.DataTable]::new()
[void]$dtMembers.Columns.Add('objectGUID')
[void]$dtMembers.Columns.Add('member')
$rsGroups | ForEach-Object { foreach ($member in $_.member) { [void]$dtMembers.Rows.Add($_.objectGUID, $member) } }
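As an aside, $dtGroups could be built the same way as $dtMembers, skipping Out-DataTable entirely. Out-DataTable typically inspects each object’s properties via reflection to infer the schema, whereas declaring the columns once and adding rows directly avoids that per-row work. A sketch, with the column names taken from the Select-Object list above and every column left as the default string type:

```powershell
# Build the groups table directly with a fixed schema.
$dtGroups = [System.Data.DataTable]::new('Groups')
foreach ($col in 'objectGUID','distinguishedName','SID','primaryGroupToken','displayName','name') {
    [void]$dtGroups.Columns.Add($col)   # default column type is string
}
foreach ($g in $rsGroups) {
    [void]$dtGroups.Rows.Add($g.objectGUID, $g.distinguishedName, $g.SID,
                             $g.primaryGroupToken, $g.displayName, $g.name)
}
```

GUID and SID values are stringified here; if downstream code needs typed columns, pass the type as the second argument to Columns.Add.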

It can take upwards of 3 minutes to populate $dtGroups (6,000 records) and then 5 seconds for $dtMembers (45,000 records). If I swap the population order around, the slow performance moves to $dtMembers and the fast performance to $dtGroups.
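That the slowness follows whichever table is populated first suggests the cost is attached to the first enumeration of $rsGroups, not to Out-DataTable or to either table specifically. Timing each stage separately would confirm this; a sketch using Measure-Command:

```powershell
# Time each population stage independently to see where the cost lands.
$tGroups = Measure-Command {
    $dtGroups = $rsGroups |
        Select-Object objectGUID,distinguishedName,SID,primaryGroupToken,displayName,name |
        Out-DataTable
}
$tMembers = Measure-Command {
    $rsGroups | ForEach-Object {
        foreach ($member in $_.member) {
            [void]$dtMembers.Rows.Add($_.objectGUID, $member)
        }
    }
}
"Groups: {0:N1}s  Members: {1:N1}s" -f $tGroups.TotalSeconds, $tMembers.TotalSeconds
```

Running this twice, and with the two blocks swapped, would show whether the first pass over $rsGroups carries the cost regardless of what consumes it.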

Although it’s not interactive, I feel that there is something fundamental happening here that I should be able to improve.

Why are you changing/casting to a DataTable? Is this being uploaded to SQL? The DataTable is unnecessary, even with SQL commands that take PSObjects as input. The only time I’ve ever needed a DataTable is for a bulk-insert SQL operation, and the slowest part is the transformation from PSObject to DataTable.
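For completeness, the bulk-insert case mentioned above is the one place a DataTable pays for itself, since SqlBulkCopy.WriteToServer accepts one directly. A minimal sketch, assuming a local SQL Server instance, a database named ADReporting, and an existing dbo.Groups table whose columns match $dtGroups (all of these are illustrative, not from the thread):

```powershell
# Bulk-load $dtGroups into SQL Server via SqlBulkCopy.
# Connection string, database, and table name are assumptions for this sketch.
$conn = [System.Data.SqlClient.SqlConnection]::new(
    'Server=localhost;Database=ADReporting;Integrated Security=True')
$conn.Open()
try {
    $bulk = [System.Data.SqlClient.SqlBulkCopy]::new($conn)
    $bulk.DestinationTableName = 'dbo.Groups'
    $bulk.WriteToServer($dtGroups)   # accepts a DataTable directly
}
finally {
    $conn.Close()
}
```

For anything other than bulk loads, plain parameterised INSERTs fed from the PSObjects would avoid the DataTable conversion entirely.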