New to PS, rookie questions - need help with a transfer to an S3 bucket

I have Tableau set to send data that hasn't been accessed in 30 days to a backup folder. What I would like to do is send this to an S3 bucket instead, and once it is in the S3 bucket, keep only 12 files and delete any beyond that. I was hoping someone could help me with how to get the backups to the S3 bucket and how to keep only 12 files there. I am new to this.

tabadmin backup F:\tableau\backups\tableau.tsbak -d
#begins the server backup
copy F:\windows\sys\logs.zip G:\tableau\Backups\logs.zip
#copies the log zip file from the working directory to the archive directory
$DateStamp = Get-Date -UFormat "%Y-%m-%d"
#creates a variable containing the current date
Move-Item G:\tableau\backups\logs.zip G:\tableau\Log_Backups\logs-$DateStamp.zip
#moves the log zip file to the log archive and adds the datestamp
#(Rename-Item cannot move a file to a different folder; Move-Item both renames it
#and removes it from the working directory, so a separate del is not needed)
#Clean up old files - this is where I am curious about the S3 part
$limit = (Get-Date).AddDays(-30)
$Path = "G:\tableau\backups\"
#Delete .tsbak files older than $limit
Get-ChildItem -Path $Path -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit -and $_.Name.EndsWith(".tsbak") } |
    Remove-Item
$output = (tabadmin status) | Out-String
#runs tabadmin status to get the server status, then converts the output to a string and stores it in a variable
 

 

Just do a Get-ChildItem, select only the first 12, set those as an exclude list, and delete the rest.

Something like… (I don't have an S3 bucket set up, so this is untested)

$ExcludeList = (Get-ChildItem -Path $S3Bucket | Sort-Object LastWriteTime -Descending | Select-Object -First 12).Name
Remove-Item -Path "$S3Bucket\*" -Exclude $ExcludeList
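Note the above only works if the bucket is exposed as a path PowerShell can browse. Since S3 is not a regular file system, the same "keep the newest 12" idea with the AWS Tools for PowerShell module might look like this - an untested sketch, where the module name, bucket name, and configured credentials are all assumptions about your environment:

# Sketch only, untested: requires the AWS.Tools.S3 (or legacy AWSPowerShell)
# module and configured AWS credentials; the bucket name is a placeholder.
Import-Module AWS.Tools.S3

$Bucket = "your-bucket-name"

# List everything in the bucket, keep the 12 most recently modified objects,
# and remove the rest.
Get-S3Object -BucketName $Bucket |
    Sort-Object LastModified -Descending |
    Select-Object -Skip 12 |
    ForEach-Object { Remove-S3Object -BucketName $Bucket -Key $_.Key -Force }

Sorting before Select-Object matters: without it, "first 12" is whatever order the listing happens to return, not the newest backups.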

Thank you. As for my code, I am not using AWS Tools for PowerShell; I am using the AWS CLI, and this code has been failing.

 

aws s3 cp “G: (serverback up path\tab.server.tsbak)”+$DateStamp s3://bucketname/filename

 

It's just not recognizing my s3://…
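For reference, a form that should work from PowerShell: build the full local file name into a variable first instead of gluing quotes, parentheses, and +$DateStamp together on the command line, then pass the destination as one quoted string. The bucket name and local path below are placeholders, not your actual values:

# Sketch only: bucket name and local path are placeholders for your environment.
$DateStamp = Get-Date -UFormat "%Y-%m-%d"
$LocalFile = "G:\tableau\backups\tableau-$DateStamp.tsbak"   # assumed backup file name

# One quoted string per argument, so the whole URI reaches the CLI intact.
aws s3 cp $LocalFile "s3://your-bucket-name/backups/tableau-$DateStamp.tsbak"

This sidesteps most quoting surprises, because each argument is unambiguous before the CLI ever sees it.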

As for …

As far as my code I am not using AWS for Powershell.

… why not? That’s why they made it.

Also, you are asking for assistance on a PowerShell Q&A site for something that, by your own description, is not a PowerShell problem, since you say you are using the AWS CLI rather than PowerShell. Yet all of your posted code is PowerShell.

I personally don't use the CLI, never have. AWS Tools for PowerShell has always been my go-to, because PowerShell is what I use everywhere else. That said, the process would be the same in any language, using whatever syntax it allows. Looking at the AWS CLI docs, though, I don't see a way to do what I showed with PowerShell (selecting a file count). It does show how to use exclusions, but only by a name or name pattern.

A swag at this would be to use the CLI cp command in a loop to copy only 12 files to a separate temp folder, delete all files in the original folder, then move those 12 back to the original folder. It's a kludge, but, well, you know…
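Another way to combine the two tools: let PowerShell do the selecting and the CLI do the deleting. This is an untested sketch - the bucket name is a placeholder, and it assumes the default aws s3 ls output format (date, time, size, name):

# Sketch only, untested: bucket/prefix is a placeholder. Lists the bucket,
# sorts newest first, and removes everything after the first 12 objects.
$Bucket = "s3://your-bucket-name/backups/"

# aws s3 ls prints: <date> <time> <size> <name> - parse out the date and name.
aws s3 ls $Bucket |
    ForEach-Object {
        $parts = $_ -split '\s+', 4
        [pscustomobject]@{ Date = [datetime]"$($parts[0]) $($parts[1])"; Name = $parts[3] }
    } |
    Sort-Object Date -Descending |
    Select-Object -Skip 12 |
    ForEach-Object { aws s3 rm "$Bucket$($_.Name)" }

This avoids the copy-out/copy-back shuffle entirely: nothing is re-uploaded, only the stale objects are removed.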

Lastly, if all you want is 12 files in S3, why send all the files to S3?
Just grab the 12 files you want on your Windows client and copy them to S3 as normal.

[quote quote=136575]Lastly, if all you want is 12 files in S3 why send all the files to S3?

Just grab the files on your Windows client and copy to S3 as normal.[/quote]
THAT'S A BINGO. That's what I'm trying to do, and what I am using:

aws s3 cp "mydata" s3://mybucket/

I'm just curious where in the syntax I am getting this wrong. PowerShell keeps saying it's not recognizing the Tableau file on the E: drive. I was curious if there were any real-world examples of people using the CLI to get their documents to an S3 bucket.

That is because, if you need to pass special characters as part of a piece of information, they have to be escaped.

https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_special_characters?view=powershell-6
http://www.neolisk.com/techblog/powershell-specialcharactersandtokens

Looking at what you posted, I don’t see an E: drive specified, just F: and G:

In any case, this, as a string…

s3://mybucket/

… is not a Windows path at all. It is a URI that the AWS CLI itself interprets; PowerShell just needs to hand it over as a single, intact string.

So, pass it as a quoted string…

"s3://mybucket/"

… and on the source side, build the full file name into a variable first rather than gluing quotes, parentheses, and +$DateStamp together on the command line, so there is no ambiguity about where each argument begins and ends.

See these articles on escaping in PowerShell:

http://www.rlmueller.net/PowerShellEscape.htm
https://ss64.com/ps/syntax-esc.html
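To illustrate what those articles cover: the backtick is PowerShell's escape character, and forward slashes need no escaping at all inside a PowerShell string - a short sketch:

# The backtick is PowerShell's escape character; for example, to put a literal
# double quote inside a double-quoted string:
"He said `"hello`""          # -> He said "hello"

# Forward slashes need no escaping in a PowerShell string:
"s3://mybucket/"             # passed through exactly as written

So quoting the URI is enough; doubling the slashes would change the string the CLI receives.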

As for this…

Was curious if there were any real-world examples of people using the CLI to get their documents to the S3 bucket.

I have no idea what Tableau is (though I've heard it talked about), thus I've never used it or seen any write-up on it with PowerShell / CLI and AWS S3.