How to check if a file exists in an S3 bucket


I am trying to build a script to copy SQL backups to an S3 bucket. The structure of the folders is like this:


- FullBackups
  - Daily
  - Weekly
  - Monthly
The backups sit inside the lower-level folders, but the structure may change. The script below recursively walks the folders and copies the backups.

One problem, though: if the file already exists in S3, it gets copied again. How can I add a check to see whether the file is already there, and skip the copy if it is? I need something like this:

$fFile = Get-Item "F:\FullBackups\Daily\daily_mydb_20200831.bak"
$S3file = Get-S3Object -BucketName $S3Bucket -Key "/ServerName/FullBackups/Daily/daily_mydb_20200831.bak"
$s3obj = ($S3file.Key -split "/")[-1]
if ($fFile.Name -eq $s3obj -and $S3file.Size -ge $fFile.Length) {
    "File exists: $s3obj"
}
else {
    Write-S3Object -BucketName $S3Bucket -Key $s3keyname -File $backupName
}
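Since `Get-S3Object` with `-Key` returns nothing when the key is absent, the check can be collapsed to a null test on the result. A minimal sketch, untested, with the path and key taken from the question (adjust to your bucket layout):

```powershell
$localFile = Get-Item "F:\FullBackups\Daily\daily_mydb_20200831.bak"
$key = "MyServer/FullBackups/Daily/daily_mydb_20200831.bak"

# Returns the matching S3 object, or nothing if the key does not exist
$s3obj = Get-S3Object -BucketName $S3Bucket -Key $key

# Skip the upload only when the object exists and is at least as large
# as the local file (i.e. a previous upload completed)
if ($s3obj -and $s3obj.Size -ge $localFile.Length) {
    "File exists: $($s3obj.Key)"
}
else {
    Write-S3Object -BucketName $S3Bucket -Key $key -File $localFile.FullName
}
```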

This is the main script:

$BackupLocation = 'F:\FullBackups'
$S3Bucket = 'MyBucket'
$s3Folder = 'MyServer'

Import-Module -Name AWSPowerShell
Initialize-AWSDefaults -ProfileName MyProfile -Region ap-southeast-2

# FUNCTION - Iterate through subfolders and upload files to S3
function RecurseFolders([string]$path) {
    $fc = New-Object -ComObject Scripting.FileSystemObject
    $folder = $fc.GetFolder($path)
    foreach ($i in $folder.SubFolders) {
        $thisFolder = $i.Path

        # Transform the local directory path to notation compatible with S3 Buckets and Folders
        # 1. Trim off the drive letter and colon from the start of the path
        $s3Path = $thisFolder.ToString()
        $s3Path = $s3Path.Substring(2)
        # 2. Replace back-slashes with forward-slashes
        #    (escape the back-slash so the regex reads it literally: "\\")
        $s3Path = $s3Path -replace "\\", "/"
        $s3Path = "/" + $s3Folder + $s3Path

        # Upload this directory's files to S3
        Write-S3Object -BucketName $s3Bucket -Folder $thisFolder -KeyPrefix $s3Path
    }

    # If subfolders exist in the current folder, then iterate through them too
    foreach ($i in $folder.SubFolders) {
        RecurseFolders $i.Path
    }
}

# Upload root directory files to S3
$s3Path = "/" + $s3Folder
Write-S3Object -BucketName $s3Bucket -Folder $BackupLocation -KeyPrefix $s3Path
# Upload subdirectories to S3
RecurseFolders $BackupLocation

I found a solution, in case someone needs it:

param($BackupFolder, $S3Bucket, $s3Folder, $Filter)
$ErrorActionPreference = "Stop"
$BackupFolder = 'F:\Backup'
$S3Bucket = 'mybucket'
$s3Folder = 's3myfolder'
$Filter = '*.zip'

# Get the AWS stuff
Import-Module -Name AWSPowerShell
# Get credentials from the persisted store
Initialize-AWSDefaults -ProfileName MyProfile -Region MyRegion

Get-ChildItem -Path $BackupFolder -Recurse -Filter $Filter |
ForEach-Object {
    $filename = $_.FullName
    # Strip the drive letter and colon, prepend the S3 folder,
    # and flip back-slashes to forward-slashes to form the S3 key
    $S3filename = $filename -replace "^[a-z]:\\", ""
    $S3filename = $s3Folder + '/' + $S3filename
    $S3filename = $S3filename.Replace("\", "/")
    if (Get-S3Object -BucketName $S3Bucket | Where-Object { $_.Key -eq $S3filename }) {
        "File $S3filename found"
    }
    else {
        Write-S3Object -BucketName $S3Bucket -Key $S3filename -File $filename
    }
}

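One caveat with the solution above: `Get-S3Object -BucketName $S3Bucket` inside the loop lists objects for every local file. An alternative, sketched here and untested, is to list the prefix once into a hashtable and check membership locally (variable names reuse the ones from the script above):

```powershell
# List the S3 folder once and index the keys for O(1) lookups
$existing = @{}
Get-S3Object -BucketName $S3Bucket -KeyPrefix $s3Folder |
    ForEach-Object { $existing[$_.Key] = $_.Size }

Get-ChildItem -Path $BackupFolder -Recurse -Filter $Filter |
ForEach-Object {
    # Build the S3 key the same way as in the main loop
    $key = ($s3Folder + '/' + ($_.FullName -replace "^[a-z]:\\", "")).Replace("\", "/")
    if ($existing.ContainsKey($key)) {
        "File $key found"
    }
    else {
        Write-S3Object -BucketName $S3Bucket -Key $key -File $_.FullName
    }
}
```

This trades one full listing of the prefix up front for skipping a bucket listing per file, which matters once the backup folder holds more than a handful of files.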