PowerShell 2 Get-ChildItem over network

Hi, just another noob here.

Background: when we have an incident, we need to collect certain log files from several servers.
They are stored in each server's log depository, which contains logs for a lot more than what we need.

I wrote and tested the script at home in the latest PS version and am having some trouble getting it to work over the network at work, and in PS2.

It seems I've gotten the paths to work, but it looks like it ignores my exclusions. I used -notin at home, but that didn't work in PS2.
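(The swap I ended up with, since -notin only exists in PS3 and later, was flipping the operands and using -notcontains; as far as I can tell the two test the same thing.)

# What I had at home (PS3 and later only):
#   Get-ChildItem $logFiles | Where-Object { $_.Name -notin $excludes }
# What should be the PS2-compatible equivalent, with the collection on the left
# ($logFiles and $excludes are set in the full script below):
Get-ChildItem $logFiles | Where-Object { $excludes -notcontains $_.Name }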

'Copying files from CR...'

$servers = (
    "Servername",
    "Servername2"
)

## Set destination folder, create it if it does not already exist
$destServer = "Destinationserver"
$destinationFolder = "\\" + $destServer + "\d$\Logs\"

if (!(Test-Path -Path $destinationFolder)) { New-Item $destinationFolder -Type Directory }

## Excluded directories, separated by commas
$excludes = "huge", "list", "ofExcluded", "folders"

foreach ($srcServer in $servers) {
    Write-Host "------------------ " $srcServer "--------------------" -ForegroundColor Yellow

    # Get items whose name is not in the excluded directories and copy them,
    # keeping the file structure; only copies .log files
    $logFiles = "\\" + $srcServer + "\d$\Logs\"
    Get-ChildItem $logFiles |
        Where-Object { $excludes -notcontains $_.Name } |
        Copy-Item -Destination $destinationFolder -Recurse -Force -Filter *.log

    Write-Host "File Copy complete"
}
	

Do you guys have any tips for me?

Why not use an approach that is version-independent?
Just exclude natively, up front. Meaning…

## Excluded directories, separated by commas
$excludes = 'huge', 'list', 'ofExcluded', 'folders'

Get-ChildItem -Path $logFiles -Exclude $excludes -Recurse
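
In your loop that would collapse to roughly this (an untested sketch against your paths; if the exclusions don't seem to apply, try appending * to the path so it reads \\server\d$\Logs\*):

foreach ($srcServer in $servers) {
    $logFiles = "\\" + $srcServer + "\d$\Logs\"

    # -Exclude filters on the item names directly, so the Where-Object step goes away
    Get-ChildItem -Path $logFiles -Exclude $excludes |
        Copy-Item -Destination $destinationFolder -Recurse -Force -Filter *.log
}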

Thanks! Because I didn’t know better.

It's what I managed to come up with to get the files to recreate their parent directories in the target folder.
I want this because the two servers are redundant, so the log files will have the same names.
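
To make the layout I'm after explicit (folder names made up): because the folder items themselves go down the pipeline, Copy-Item -Recurse recreates each folder under the destination rather than dumping everything flat:

# Made-up example of the layout I want:
#   \\Servername\d$\Logs\AppA\service.log  ->  \\Destinationserver\d$\Logs\AppA\service.log
#   \\Servername\d$\Logs\AppB\service.log  ->  \\Destinationserver\d$\Logs\AppB\service.log
# so two service.log files from different folders do not overwrite each other.
Get-ChildItem -Path $logFiles -Exclude $excludes |
    Copy-Item -Destination $destinationFolder -Recurse -Force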

I’ll try this tomorrow!

That did work more elegantly. I also discovered I had some formatting errors in my exclusions.
I still can't seem to get directories with spaces excluded. I've tried "'sample directory'" and "sample\sdirectory" with no luck.
I've also tried defining the exclusions as an array:

 
$excludes = @( "spaced directory ", "normal", "Wildcard-*" )
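
If the -Exclude wildcards keep fighting me on the spaces, I suppose I can fall back to a plain Where-Object name comparison, which treats a space like any other character (folder names here are made up, and this only checks each file's immediate parent folder):

# Fallback sketch: exact name comparison instead of -Exclude wildcards.
# -notcontains does a literal string compare, so spaces are not special.
# Note: this only looks at each file's immediate parent folder.
$excludes = @( "spaced directory", "normal" )

Get-ChildItem -Path $logfiles -Recurse -Filter "*.log" |
    Where-Object { $excludes -notcontains $_.Directory.Name }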

My script also works until it reaches a certain directory, where it just seems to stop.
That directory holds about 2,000 files recursively, at roughly 40 MB per file. If I Get-Item the directory, it works, but as soon as I try to pass that to Copy-Item or Sort-Object, the script stalls.

This runs fine:

get-childitem -path $logfiles -exclude $excludes -recurse -filter "*.log" 

This starts the script, but it just doesn't do anything. It seems that whatever I try to do with the retrieved items stalls the script:

get-childitem -path $logfiles -exclude $excludes -recurse -filter "*.log" | Sort-Object -Property LastWriteTime
#Or this
get-childitem -path $logfiles -exclude $excludes -recurse -filter "*.log" | Sort-Object -Property LastWriteTime | Select -First 1
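
If I understand Sort-Object right, it cannot emit anything until it has collected every item from the pipeline, so maybe it only looks hung rather than crashed. The next thing I plan to try is splitting the enumeration from the sort, to see which step is actually the slow one, roughly:

# Diagnostic sketch (not run yet): separate the enumeration from the sort.
$files = Get-ChildItem -Path $logfiles -Exclude $excludes -Recurse -Filter "*.log"
Write-Host ("Enumerated {0} files" -f @($files).Count)

# Sort the already-collected objects, so any delay here is the sort itself.
$first = $files | Sort-Object -Property LastWriteTime | Select-Object -First 1
Write-Host ("First by LastWriteTime: {0}" -f $first.FullName)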

I feel I shouldn't need to point to every log file specifically and get them one at a time.