Questions about directories, folders, and automation

I have been tasked with writing a script that will copy files from a server to a network share. The files are updated, and we pull the new files weekly. The problem I'm having: they are all stored on X:. The path to the files would be x:\folder1\subFolder2-files, and there are several folders. I need the script to go into each folder, sort the files and take the newest one, then repeat the process in each subfolder in the directory, e.g.:

x:\folder1\subFolder2-files — sort files and copy
Sort files and copy
x:\Folder2\subfolder1\files etc…

I can get this to work if I list all the directories individually, but I cannot seem to figure out how to cycle through the folders on its own, repeating for every folder. I have written some very basic scripts, but this one seems beyond my abilities. Any help would be great, and help with an explanation so I can figure it out myself next time would be amazing!

p.s. This isn't my first stop. I have been searching for pre-made scripts and ideas, and have even watched all the PowerShell beginner courses trying to find a solution. I have spent A LOT of time trying to get it on my own. Thanks!

Hmmm … ok … you could start with something like this:

Get-ChildItem -Path C:\sample -Directory -Recurse |
ForEach-Object {
    Get-ChildItem -Path $_.FullName | Sort-Object | Select-Object -First 1
}

Just add this line in your “foreach” loop.

Get-ChildItem $_ | sort LastWriteTime | select -last 1
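Putting the two suggestions together (using the hypothetical C:\sample source path from above), a minimal sketch of the whole loop would look like this:

```powershell
# Walk every subdirectory under the source root (path is just an example)
Get-ChildItem -Path C:\sample -Directory -Recurse |
    ForEach-Object {
        # Inside each directory, sort its files by last write time
        # and keep only the newest one
        Get-ChildItem -Path $_.FullName -File |
            Sort-Object -Property LastWriteTime |
            Select-Object -Last 1
    }
```

Sorting ascending and taking `-Last 1` is equivalent to sorting `-Descending` and taking `-First 1`; either way you end up with the newest file per folder.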

Well, I did finally get it to work. Ben, I didn't see your suggestion until just now. Olaf, you put me on the right path. I knew it was something simple, I just couldn't seem to get it. But it's still not complete. Below is an example of the code I am now using. This will copy the files over, and just the new ones like I need. The next step (which I didn't include in my post by accident) is that the destination folder should continue the original file path, just without the X:. So the files should be put in the same directory structure, just on the network drive and not the source drive. I now don't know how to copy the directory structure to the destination folder for the files. Again, there are several paths, each with one file I am taking (there are several files in the folders that I will not be taking). I hope this makes sense and someone is able to help! I have been experimenting with variables and things, just haven't gotten anything to work out yet!

gci -Path X:\ -File -Recurse | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1 | ForEach-Object {
    Copy-Item $_.FullName -Destination "\\DestinationFolder"
}

If you need a particular destination path for your files you should create it before copying files there. Take a look at the cmdlets Split-Path and Join-Path.
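For example (the file path and share name here are made up), `Split-Path` extracts the directory portion of a path and `Join-Path` glues path segments together without you having to worry about stray backslashes:

```powershell
$file = 'X:\Folder1\subFolder2\report.xlsx'    # hypothetical source file

# Split-Path with no switch returns the parent directory
$parentDir = Split-Path -Path $file             # X:\Folder1\subFolder2

# Strip the drive prefix and rebuild the path under the share
$destRoot = '\\server\share'                    # hypothetical destination root
$destPath = Join-Path -Path $destRoot -ChildPath ($parentDir -replace '^X:\\', '')

# Create the destination directory before copying into it
if (-not (Test-Path -Path $destPath)) {
    New-Item -ItemType Directory -Path $destPath | Out-Null
}
```

With those paths, `$destPath` would come out as `\\server\share\Folder1\subFolder2`, ready for `Copy-Item`.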

Another idea: did you think about using robocopy? It sounds like it could be suitable for your task.
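As a rough sketch (the share name is a placeholder): robocopy can't pick just the newest file per folder, but it can mirror a whole tree while skipping files the destination already has up to date, e.g.:

```powershell
# /E  = copy subdirectories, including empty ones
# /XO = exclude older files (skip anything the destination already has newer or same)
robocopy X:\ \\server\share /E /XO
```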

Edit: … and btw: could you please format your code as code here in the forum? It's much easier to read that way.

Hey mate,

How do you determine what folder goes into what directory?
You might need to add some “if” statements.
But not knowing the logic makes it hard to send you something to test with.

Here is what I would do for the logical flow.

1) Define my Source and Destination root directories in variables 
2) Use what you already have above to get the Child directories from source
3) For each directory do the following
  a: Find the newest file by
     i. Getting all files in directory
    ii. Sort by LastWriteTime -Descending
   iii. Select the first object
  b: If a newest file is found
     i. Get the parent directory path of the newest file
    ii. In the parent directory path, replace the Source root with the Destination root
        (IE. $newestFile.Directory.FullName.replace("$sourceRoot", "$destRoot"))
   iii. Test the generated destination path to determine if it exists.  If not create it.
        (IE. Test-Path $destPath)
    iv. Copy newest file to destination path
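The steps above can be sketched in full like this (the root paths are placeholders you'd replace with your own; note the trailing backslashes so the string replace lines up cleanly):

```powershell
# 1) Define source and destination roots (placeholder paths)
$sourceRoot = 'X:\'
$destRoot   = '\\server\share\'

# 2) Get the child directories from the source
Get-ChildItem -Path $sourceRoot -Directory -Recurse | ForEach-Object {

    # 3a) Find the newest file in this directory
    $newestFile = Get-ChildItem -Path $_.FullName -File |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1

    # 3b) Only act if the directory actually contained a file
    if ($newestFile) {

        # i/ii. Rebuild the parent path under the destination root
        $destPath = $newestFile.Directory.FullName.Replace($sourceRoot, $destRoot)

        # iii. Create the destination directory if it doesn't exist yet
        if (-not (Test-Path -Path $destPath)) {
            New-Item -ItemType Directory -Path $destPath | Out-Null
        }

        # iv. Copy the newest file across
        Copy-Item -Path $newestFile.FullName -Destination $destPath
    }
}
```

The `if ($newestFile)` guard matters: directories with no files would otherwise produce empty paths and errors in the copy step.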

Thank you for the help. Sorry it took so long to respond, I have been away. I will play with this and let you guys know how/if I could get it to work.

Thanks again to everyone for all the help.