Copying files in a range

Hi, I'm trying to get this script to work. It searches for all .doc files inside multiple folders, copies them, and renames any duplicates.

The folder it searches is currently named ‘Folder1’. Inside that folder there are a lot of other folders named ‘account_1’ up to ‘account_800’. How can I edit the script so that it searches inside ‘Folder1’ but only in account_4 to account_30, for example? I don't want to search in 1, 2 and 3, or in anything after account_30.

I tried Select-Object -First $number, but that doesn't fully work: it counts the number of .doc files found, not the number of folders I want to search in.

This is what I've got so far.

$dest = "Destination of new folder with today's date"
$folderName = (Get-Date).ToString("dd-MM-yyyy")
New-Item -ItemType Directory -Path $dest -Name $folderName

function fcopy ($SourceDir, $DestinationDir) {

    Get-ChildItem $SourceDir -Filter "*.doc" -Recurse | Where-Object { $_.PSIsContainer -eq $false } | ForEach-Object {

        $SourceFile = $_.FullName
        $DestinationFile = $DestinationDir + $_.Name

        $i = 0
        while (Test-Path $DestinationFile) {
            $i += 1
            $DestinationFile = $DestinationDir + $_.BaseName + "--COPY--" + $i + $_.Extension
        }

        Copy-Item -Path $SourceFile -Destination $DestinationFile -Force
    }
}

fcopy -SourceDir "C:\Source\Directory\Folder1" -DestinationDir "C:\Users…\Desktop\Scripts$foldername"

You can specify a range in PowerShell using ..

So you could do something like this:

$range = 4..30
foreach ($n in $range) {
    fcopy -SourceDir "C:\Source\Directory\Folder1\Account_$n" -DestinationDir "C:\Users…\Desktop\Scripts\$foldername"
}

Thank you, I came across $range all over Google but couldn't get it to work, probably because of a typo somewhere that I couldn't find. Thank you for your solution.

@matt-bloomfield the script works flawlessly locally, but when I change -SourceDir and -DestinationDir to network locations it doesn't work. I don't get an error; it just searches endlessly. When -SourceDir is a local location and -DestinationDir a network location, it works. The address looks like this: \\abc\AAA-SomeThing\ABC\Accounts\Account_123. I tried it with and without a trailing backslash and with " and ', but none of that works. Am I missing something?

I can’t see a reason why that wouldn’t work. You will need double quotes " around the path to ensure the variables are expanded and the UNC paths should start with a double backslash \\.
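To illustrate the quoting difference (using your \\abc server path and a made-up $n variable), single quotes leave variables as literal text while double quotes expand them:

```powershell
$n = 123

# Single quotes: variables are NOT expanded - the literal text $n stays in the path
'\\abc\AAA-SomeThing\ABC\Accounts\Account_$n'   # -> \\abc\AAA-SomeThing\ABC\Accounts\Account_$n

# Double quotes: $n is expanded to its value
"\\abc\AAA-SomeThing\ABC\Accounts\Account_$n"   # -> \\abc\AAA-SomeThing\ABC\Accounts\Account_123
```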

Can you copy a file manually between the network locations? e.g.

Copy-Item \\abc\source\Account_123\testfile.txt \\destination\FolderAccount_123\

If that works with Copy-Item it’s probably a problem with your fCopy function. If it doesn’t work, then perhaps it’s a permissions problem.

I can copy manually, and with the Copy-Item command it's possible too. I had another script before this one and that one worked (without renaming duplicates and the range), so permissions aren't the problem. The only difference is that this script uses a function and the other one doesn't, so I guess it's something in fcopy…

Ok, I would replace the Copy-Item lines with Write-Host so you can see if the paths you think you’re generating are the paths you’re actually generating.

Write-Host "-Path $SourceFile -Destination $DestinationFile -Force"
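Another option worth knowing: Copy-Item supports the common -WhatIf parameter, which prints the operation it would perform without actually copying anything, so you can dry-run the whole function:

```powershell
# Shows what WOULD be copied, without touching the filesystem
Copy-Item -Path $SourceFile -Destination $DestinationFile -Force -WhatIf
```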

It doesn't show anything; when I point the search at the network path, PowerShell only shows that the directory is created. I tried a few adjustments in the script and still no luck. I have an older script that works, but without the range and renaming duplicates. I'm thinking of starting over with a new script in the hope that somehow it will work for some reason :slight_smile:

$number = 6
$dest = "path"
$files = Get-ChildItem -Path "path" -Filter "*.doc" -Recurse | Select-Object -First $number
$folderName = (Get-Date).ToString("dd-MM-yyyy")

New-Item -ItemType Directory -Path $dest -Name $folderName

foreach ($file in $files) {
    $file_path = Join-Path -Path $dest -ChildPath $file.Name

    $i = 1

    while (Test-Path -Path $file_path) {
        $file_path = Join-Path -Path $dest -ChildPath ($file.BaseName + "($i)" + $file.Extension)
        $i += 1
    }

    Copy-Item -Path $file.FullName -Destination $file_path
}

I know it's not an answer to your actual question, but I imagine it could be a solution to the underlying challenge.

I’d consider using a combination of Windows Shadow Copies and backup software to save several different versions of one file. There are tools able to store whole folder structures without wasting too much space. One of them would be HardlinkBackup.

Thanks, I will look into those links you provided. It's probably easiest to use those programs, but I can't let this go without getting it to work. I recently started with PowerShell, so I want to know why this isn't working, or I'll have sleepless nights about it :wink:

I have tested it with network locations and it works as expected for me. Obviously, I’m testing on a much smaller scale than your several hundred folders.

The only thing I've noticed, in both your original post and a subsequent reply (#7 in this thread), is that you've not got any slashes in the destination. It should look something like this:

$range = 1..2

foreach ($n in $range) {
    fcopy -SourceDir "\\myFileServer\Data\_z_Temp\old_account$n" -DestinationDir "\\myFileServer\Data\_z_Temp\new\$folderName\"
}

Hey Matt,

Thanks for testing. I think I've just found the ‘bug’. As I said before, the whole script works 100% locally on my PC: I search for “*.doc” and it also finds .docx files. When I switch all paths to the network locations, that filter stops working, but when I write docx instead of doc it surprisingly works. So now I'm trying to apply multiple filters, and then it should work the way I want.
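For anyone hitting the same thing: as far as I know, -Filter is handed to the underlying filesystem provider, whose wildcard matching on Windows PowerShell 5.1 also matches short 8.3 file names, which is why *.doc can catch .docx files too. A sketch of one way to match both extensions exactly, dropping -Filter and testing the extension yourself:

```powershell
# -File skips directories; the Where-Object does an exact extension match
Get-ChildItem $SourceDir -Recurse -File |
    Where-Object { $_.Extension -in '.doc', '.docx' } |
    ForEach-Object { $_.FullName }
```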

Good find :+1: - I don't remember seeing that mentioned before, but that's a nasty inconsistency. It looks like it's fixed in 7.x, though.

Very frustrating, but thanks for your help.