I’m moving databases from one server to another. I have a table that holds the source and destination servers, the database name, the date and time to move them, and the people/groups to alert.
As I loop through the list, I check whether each login for the database already exists at the destination and, if not, create it, but it’s taking forever. I’m hoping to find out if there is a faster way of doing it.
# Get the list of databases to migrate (assumes the SqlServer module, which loads SMO, is already imported)
$GetDBS = "SELECT [SourceSqlServerName], [DestSqlServerName], [DatabaseName] FROM dbo.dbmigration ORDER BY [DatabaseName]"
$DbMigrationInfo = Invoke-Sqlcmd -Query $GetDBS -ServerInstance $Source -Database $database
Foreach ($migration in $DbMigrationInfo)
{
    $Source       = $migration.SourceSqlServerName
    $Destination  = $migration.DestSqlServerName
    $Database     = $migration.DatabaseName
    $SourceServer = New-Object Microsoft.SqlServer.Management.Smo.Server $Source
    $DestServer   = New-Object Microsoft.SqlServer.Management.Smo.Server $Destination
    # Windows-group logins, excluding the standard groups that already exist everywhere
    Foreach ($login in $SourceServer.Logins | ?{ $_.LoginType -eq "WindowsGroup" -and $_.Name -notin "domain\release_manager","domain\TEST_RW","domain\TEST_R","domain\TEST_DBO" })
    {
        # Only copy logins that are actually mapped to this database
        if ($SourceServer.Logins[$login.Name].EnumDatabaseMappings() | ?{ $_.DBName -eq $Database })
        {
            if (!($DestServer.Logins[$login.Name]))
            {
                "$($login.Name) needs to be added"
                # use a new variable here rather than clobbering the loop variable $login
                $newLogin = New-Object Microsoft.SqlServer.Management.Smo.Login $DestServer, $login.Name
                $newLogin.LoginType = "WindowsGroup"
                $newLogin.Create()
            } else { "$($login.Name) was already there" }
        }
    }
}
This performance concern is not specific to PoSH. Everything here is inline: your code, the servers involved and their processing speeds, network consumption, etc.
Loops are generally slow in almost any language, and the larger the recordset, well, you know.
You are also using three of them, nested at that. So, in such a case, if there are 100 sources, 100 DBs, and 100 logins per DB, that’s approximately (100 * 100 * 100) one million passes through that nested loop set.
Then there is the speed at which SQL Server handles each connection request, T-SQL command, and commit (remember SQL is a double-commit thing), plus the resource limits on the server running the script and the resource consumption on the target SQL server.
My speed-up suggestion is to set this up as parallel jobs in their own runspaces, i.e. multi-thread this effort. There are a lot of samples of using parallel processing and runspaces all over the web.
Example:
Parallel processing with PowerShell
'blogs.technet.microsoft.com/uktechnet/2016/06/20/parallel-processing-with-powershell'
Ping multiple servers simultaneously with PowerShell: Test-ConnectionAsync
Ping hundreds of machines at a time using PowerShell using multithreading (without the added overhead of additional powershell.exe processes that would come from using the Start-Job cmdlet.)
‘Browse code samples | Microsoft Learn’
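To make the suggestion concrete, here is a minimal sketch of the runspace approach, assuming PowerShell 7+ (where ForEach-Object -Parallel runs each iteration in its own runspace) and the same table/column names from your script; tune -ThrottleLimit to what the servers can absorb:

```powershell
# Sketch only: assumes $DbMigrationInfo has already been loaded as in the
# original script, and that SMO is available in each runspace.
$DbMigrationInfo | ForEach-Object -Parallel {
    $migration = $_
    # SMO connections are not shared across threads, so each runspace
    # builds its own Server objects
    $src  = New-Object Microsoft.SqlServer.Management.Smo.Server $migration.SourceSqlServerName
    $dest = New-Object Microsoft.SqlServer.Management.Smo.Server $migration.DestSqlServerName

    foreach ($login in $src.Logins | Where-Object { $_.LoginType -eq 'WindowsGroup' }) {
        if (-not $dest.Logins[$login.Name]) {
            $newLogin = New-Object Microsoft.SqlServer.Management.Smo.Login $dest, $login.Name
            $newLogin.LoginType = 'WindowsGroup'
            $newLogin.Create()
        }
    }
} -ThrottleLimit 5
```

On Windows PowerShell 5.1 the same shape can be built by hand with [runspacefactory]::CreateRunspacePool() or with Start-Job, as the linked articles show.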
I use parallel for a lot of my other scripts.
That there are many isn’t the issue I’m trying to address; it’s each one taking so long. I’m wondering if there is a faster way to copy each one.
The intention of posting the question was hoping for a “wow, no, do it this way”.
I don’t need a stopwatch to tell me which part is taking the time; this is the bottleneck: “if($SourceServer.Logins[$Login.Name].EnumDatabaseMappings() | ?{$_.DBName -eq $Database })”.
But if I did need to time each step, I know just fine how to.
What I don’t know is a more efficient way of copying the logins, for the list of database(s) provided, from one SQL server to another by an automated PowerShell process. I say that because I can do it faster manually by running a T-SQL script.
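One way to attack that bottleneck directly: EnumDatabaseMappings() costs a server round-trip per login, but the database’s own Users collection already knows which logins are mapped to it, so the lookup can be inverted into a single pass. A sketch, assuming the same $SourceServer/$DestServer/$Database SMO objects as the original script:

```powershell
# Sketch: instead of asking every login whether it maps to $Database
# (one EnumDatabaseMappings() round-trip each), read the mapped login
# names straight off the database's Users collection in one pass.
$mappedLogins = $SourceServer.Databases[$Database].Users |
    Where-Object { $_.Login } |            # skip users with no server login
    ForEach-Object { $_.Login }

foreach ($name in $mappedLogins) {
    $srcLogin = $SourceServer.Logins[$name]
    if ($srcLogin -and $srcLogin.LoginType -eq 'WindowsGroup' -and
        -not $DestServer.Logins[$name]) {
        $newLogin = New-Object Microsoft.SqlServer.Management.Smo.Login $DestServer, $name
        $newLogin.LoginType = 'WindowsGroup'
        $newLogin.Create()
    }
}
```

And if third-party modules are an option, the dbatools module’s Copy-DbaLogin cmdlet (Copy-DbaLogin -Source $Source -Destination $Destination -Login $names) does this kind of login copy for you, including SIDs and permissions, so it may beat anything hand-rolled.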