Iterate through many servers, archive files on remote server, copy to local backup server

Hi, all - this is a bit of an offshoot of a previous post - https://forums.powershell.org/t/finding-date-and-size-of-a-file-written-to-a-specific-directlory/21438?u=smueller72

As I’m learning more about PS, I now want to move my backup process entirely from being batch file-based to PS. My requirements/challenges and where I’m at as of now are below.

Requirement:

Archive/compress a certain set of database backups from hundreds of servers on a nightly basis and transfer each archive to a local backup drive on the server running the PS scripts. Remove the archive from the remote server after it has been copied successfully. Produce success and exception logs (e.g. the new archive doesn’t exist at all, or the new archive is smaller than expected), store them, and email them to myself and other administrators.

Current process:

I’m currently passing a list of the server IPs, along with some other variables (mainly the state/market of the server), from one batch file to another batch file, which then connects to the remote server in a not-so-secure way. Using WinRAR, it archives the last few days of backups on the remote server, copies the archive (typically 100-150 MB each) to the local backup server with RoboCopy, and deletes the .rar from the remote server. All of this is triggered via Task Scheduler. A PS script then runs later in the day which outputs and emails a CSV with the size, backup time, and location of each archive. Admins review this and restart backups as needed (although backups typically cannot be started mid-day due to resource usage on the server, which is also hosting a POS system for the individual store).

Future State:

Use PS remoting to securely connect to the remote servers (iterating through, presumably, some sort of host/INI file passed into the script). Use Compress-Archive to compress just the last few days of backups, copy the archive from remote to local if need be, and put it into a folder named for the market with a subfolder named for the store (using variables passed in via the host/INI file mentioned above). Use the existing script to create the CSV and email it to local admins.

Here is some of the code I have so far. I know it definitely needs some work, and I would appreciate any suggestions on how best to get to my desired future state. Thanks in advance for any advice/suggestions.

# Run once to prompt for the password and save it as an encrypted string.
# ConvertFrom-SecureString (without -Key) uses DPAPI, so the file can only be
# decrypted by the same user on the same machine.
Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File F:\localdrive\Credentials_encrypted.txt

# Read the encrypted string back and rebuild the credential object.
$user = "admin user name"
$pass = Get-Content "F:\localdrive\Credentials_encrypted.txt" | ConvertTo-SecureString
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user, $pass

## Below command executes on the remote system, meaning Compress-Archive runs locally on the remote machine.
Invoke-Command -ComputerName remotecomputer -ScriptBlock { Compress-Archive -Path C:\path\to\dailybackups\*daily* -DestinationPath C:\path\to\remotearchive\DBBackup.zip } -Credential $creds

#-- Once the compression is done, copy archive to local system. --#
Copy-Item -Path c:\path\to\remotearchive\DBBackup.zip -Destination 'f:\@backups\market of remote store\name of remote store\DBBackup.zip'

A couple of thoughts for you.

  1. I would never store plain text credentials in a file for security reasons. See the Get-Credential cmdlet to get the credential into an object at runtime.
  2. If you run this as a scheduled task, the script can use the credentials set there, so you won’t need an admin user to enter credentials at runtime.
  3. If you store your hostname, market, and store data in a CSV file, you can use Import-Csv to get that data into a PowerShell object. Then you can use foreach to loop through those rows and call Invoke-Command (see the sketch after this list).
  4. I think your Copy-Item parameters are switched - assuming C: is the local machine destination and F: is the remote server path. You can also use a UNC path with the hostname from above for the remote path (\\$($hostdata.hostname)\share\folder\zipfile).
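
For point 3, a minimal sketch of that loop. The CSV path, the column names (IP, Market, StoreName), and the c$ admin share are assumptions - adjust them to your environment:

# Assumed CSV with columns IP, Market, StoreName - one row per store server.
$servers = Import-Csv 'C:\scripts\servers.csv'

foreach ($server in $servers) {
    # Build the zip on the remote machine.
    Invoke-Command -ComputerName $server.IP -Credential $creds -ScriptBlock {
        Compress-Archive -Path 'C:\path\to\dailybackups\*daily*' -DestinationPath 'C:\path\to\remotearchive\DBBackup.zip' -Force
    }

    # Pull it back over the admin share into market\store folders.
    $dest = "F:\@backups\$($server.Market)\$($server.StoreName)"
    New-Item -ItemType Directory -Path $dest -Force | Out-Null
    Copy-Item -Path "\\$($server.IP)\c$\path\to\remotearchive\DBBackup.zip" -Destination $dest
}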

I appreciate the feedback. I’m still very ‘green’ with PS, so the suggestions are appreciated. I’ve looked at the Get-Credential cmdlet a bit closer, as what I’m trying to do is make this as secure as possible. I guess I need to look at it more closely, as I don’t completely understand how it would securely pass in creds that would allow me to access the individual servers with the proper credentials. Any insights on that?
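
From what I’ve read so far, the flow would be something like this - prompt once at the start of the run and reuse the credential object for each connection (the IP below is just a placeholder):

# Prompt once; the password stays in memory as a SecureString, never on disk.
$creds = Get-Credential -Message 'Admin account for the store servers'

# Reuse the same credential object for every remote call.
Invoke-Command -ComputerName '10.0.0.5' -Credential $creds -ScriptBlock { hostname }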

I do have the list of servers (IP addresses), the market of the store (i.e. region of the country), and the store/city name within the market in a CSV that I can import.

I would assume I could use something along these lines to get the data:

$IP = @()
$Market = @()
$StoreName = @()
Import-Csv c:\myfile | ForEach-Object {
    $IP += $_.IP
    $Market += $_.Market
    $StoreName += $_.StoreName
}

and then use New-PSSession, Invoke-Command, Compress-Archive, and Copy-Item to run the remote commands needed (after the WinRM service is started on the individual servers) to get the archive back onto the local machine, into the proper folder structure based on the market and store name variables.
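
Putting those pieces together, a rough sketch of what I’m picturing. The paths are placeholders and the -FromSession copy is my assumption; I understand Copy-Item -FromSession needs PS 5.0 or later on the machine running the script:

foreach ($row in (Import-Csv c:\myfile)) {
    # Open a remoting session to the store server (WinRM must be running there).
    $session = New-PSSession -ComputerName $row.IP -Credential $creds

    # Build the zip on the remote side.
    Invoke-Command -Session $session -ScriptBlock {
        Compress-Archive -Path 'C:\path\to\dailybackups\*daily*' -DestinationPath 'C:\path\to\remotearchive\DBBackup.zip' -Force
    }

    # Copy the zip back through the session, then clean up on the remote server.
    $dest = "F:\@backups\$($row.Market)\$($row.StoreName)"
    New-Item -ItemType Directory -Path $dest -Force | Out-Null
    Copy-Item -Path 'C:\path\to\remotearchive\DBBackup.zip' -Destination $dest -FromSession $session
    Invoke-Command -Session $session -ScriptBlock { Remove-Item 'C:\path\to\remotearchive\DBBackup.zip' }

    Remove-PSSession $session
}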

Does this approach seem to make sense? I’m sure I’m missing plenty so any additional advice would be welcome.

Steve,

when a post of yours gets held for moderation, please do not repost. Give us a little time to approve it.

Thanks in advance. 🙂

Thanks - it disappeared immediately without warning, and it wasn’t until a few minutes later that I received an email notification, so I wanted to be safe rather than sorry with the post.

Every post I make seems to be scrutinized, so perhaps I’ll go elsewhere…

All posts from all members get checked. Otherwise you would see a lot of spam here in the forum.