I have Tableau set to send data that hasn't been accessed in 30 days to a backup folder. What I would like to do is send this to an S3 bucket instead. Then, once it is in the S3 bucket, I'd like to keep only the 12 most recent files and delete anything older. I'm new to this, so I was hoping someone could help me with how to get the backups into the S3 bucket and enforce the 12-file limit. Here is my current script:
tabadmin backup F:\tableau\backups\tableau.tsbak -d
#begins the server backup
copy F:\windows\sys\logs.zip G:\tableau\Backups\logs.zip
#copies the log zip file from the working directory to the archive directory
$DateStamp = get-date -uformat "%Y-%m-%d"
#creates a variable containing the current date
Move-Item G:\tableau\backups\logs.zip G:\tableau\Log_Backups\logs-$DateStamp.zip
#moves the log zip file to the log backup folder and adds the datestamp in one step
#(Rename-Item can't move a file to a different directory, and after the move there is nothing left in the working directory to delete)
#Clean up old files- this is where I am curious about the s3
$limit = (Get-Date).AddDays(-30)
$Path = "G:\tableau\backups\"
#Delete files older than the $limit
Get-ChildItem -Path $Path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit -and $_.Name.EndsWith(".tsbak") } | Remove-Item -Force
#note: the cmdlet is Get-ChildItem (not Get-Children), the variable must match $Path as defined above, and the results have to be piped to Remove-Item for anything to actually be deleted
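For the S3 part, one approach (a sketch, not anything Tableau-specific) is a small Python script using boto3, the AWS SDK for Python, run after the backup step instead of the local cleanup above. The bucket name, prefix, and local path below are placeholders you would replace with your own; the retention helper is a pure function, so the keep-12 rule can be tested and adjusted without touching AWS at all.

```python
from pathlib import Path

def keys_to_prune(objects, keep=12):
    """objects: iterable of (key, last_modified) pairs.
    Returns the keys of everything older than the `keep` newest objects."""
    ordered = sorted(objects, key=lambda pair: pair[1], reverse=True)
    return [key for key, _ in ordered[keep:]]

def upload_and_prune(local_dir, bucket, prefix="tableau-backups/", keep=12):
    # boto3 is imported here so keys_to_prune stays usable without the AWS SDK
    import boto3
    s3 = boto3.client("s3")
    # upload every .tsbak file from the local backup folder
    for f in Path(local_dir).glob("*.tsbak"):
        s3.upload_file(str(f), bucket, prefix + f.name)
    # list what is under the prefix and delete all but the `keep` newest
    # (a single list_objects_v2 call returns up to 1000 keys, plenty for 12 files)
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    objects = [(o["Key"], o["LastModified"]) for o in resp.get("Contents", [])]
    for key in keys_to_prune(objects, keep):
        s3.delete_object(Bucket=bucket, Key=key)

# usage with placeholder names:
# upload_and_prune(r"G:\tableau\backups", "my-backup-bucket")
```

This assumes AWS credentials are already configured (for example via `aws configure` or an instance role); your PowerShell script could invoke it with a plain `python upload_and_prune.py` call after the backup finishes.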
$output = (tabadmin status) | out-string
#runs tabadmin status to get the server status, then converts the output to a string and stores it in a variable
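If you want the script to act on that captured status (for example, only ship backups to S3 when the server reports itself healthy), a tiny helper can check the string. The exact wording of `tabadmin status` output can vary by version, so treat the "RUNNING" token here as an assumption to verify against your own output.

```python
def server_is_running(status_text):
    """Rough check of `tabadmin status` output captured as a string.
    Assumes the output contains the word RUNNING when the server is up."""
    return "RUNNING" in status_text.upper()
```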