This script works great on a few test folders, but it throws an out-of-memory exception if I run it on the full data set. I set the memory limit to the max and it still errors.
How do I either stream the results straight to the output file instead of holding everything in memory, OR limit the number of folders it runs on so I can run it in batches? For example, start at level 2 like it does, but only run through 5 folders at level 2 (I’ve sketched what I mean right after the script below). I can’t move the data around; it’s an Archive server.
$StartLevel = 2   # 0 = include base folder, 1 = sub-folders only, 2 = start at 2nd level
$Depth = 20       # How many levels deep to scan
$Path = "."       # starting path

$Folders = For ($i = $StartLevel; $i -le $Depth; $i++) {
    $Levels = "\*" * $i
    (Resolve-Path "$Path$Levels").ProviderPath | Get-Item | Where PsIsContainer |
        Select FullName
}

$Folders.FullName | %{
    Get-ChildItem -Path $_ -Recurse |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1 |
        Format-Table -Property LastWriteTime, FullName -AutoSize |
        Out-File -Append C:\Users\mmsho\Documents\Testoutput.txt -Encoding UTF8
}
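For the batching option, what I’m picturing is something like this (an untested sketch; Select-Object -First/-Skip is my guess at a way to slice the folder list), so each run only works through five of the level-2 folders:

$Folders.FullName | Select-Object -First 5 | %{   # next batch: -Skip 5 -First 5, and so on
    Get-ChildItem -Path $_ -Recurse |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1 |
        Format-Table -Property LastWriteTime, FullName -AutoSize |
        Out-File -Append C:\Users\mmsho\Documents\Testoutput.txt -Encoding UTF8
}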
Could you explain what that code is actually for? What is it you are trying to achieve? And it’s still not a good idea to run a format cmdlet and then pipe its output to Out-File.
If you get an error, you should post that error completely (formatted as code as well, please).
First, I apologize for being difficult. I did take your advice and set the code to pre-formatted in my last post, so I’m learning…slowly.
I have an Archive server with tons of data. I’m trying to run a script that starts two folders down, scans through the sub-folders, and brings back each path with its latest modified date. If the project folders are old enough, legally we can delete them, which is why I need to know.
Folder structure example:

Program 1
    Projects 1-1000
        Project 1
        Project 2
I need to start it at the Program level and have it run through Project 1, 2, etc.
I did try your suggestion to export to CSV in a test environment. When I use the code below and run it to CSV, I get two lines of data in the file. If I run the same code to text I get hundreds of lines (in the test environment, against the same folder as the CSV run). I don’t know why; I’d love to output to CSV.
However, when I run it to a text file on the live Archive server I get the following error.
Maybe I just need help running it to CSV and figuring out why I only get two lines of data with CSV and hundreds when I run to text.
Get-ChildItem : Exception of type 'System.OutOfMemoryException' was thrown.
At line:12 char:24
+ $Folders.fullname | %{ Get-ChildItem -path $_ -Recurse | Sort-Object ...
+                        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-ChildItem], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.GetChildItemCommand
$StartLevel = 2   # 0 = include base folder, 1 = sub-folders only, 2 = start at 2nd level
$Depth = 20       # How many levels deep to scan
$Path = "."       # starting path
$ExportFolder = "C:\Users\mmsho\Documents"

$Folders = For ($i = $StartLevel; $i -le $Depth; $i++) {
    $Levels = "\*" * $i
    (Resolve-Path "$Path$Levels").ProviderPath | Get-Item | Where PsIsContainer |
        Select FullName
}

$Folders.FullName | %{
    Get-ChildItem -Path $_ -Recurse | Export-Csv "$ExportFolder\testoutput.csv"
}
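One thing I wondered about (just a guess, untested): each pass through that loop calls Export-Csv without -Append, so every folder would overwrite the file written by the one before it, and the finished file would only hold the last folder’s rows. Something like this might keep them all (I added a Select-Object so the CSV only carries the two columns I care about):

$Folders.FullName | %{
    Get-ChildItem -Path $_ -Recurse |
        Select-Object FullName, LastWriteTime |
        Export-Csv -Append -NoTypeInformation "$ExportFolder\testoutput.csv"
}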
Ideally, what I’d have is a list of the Project folders, each with the latest modified date of any object in that folder or any sub-folder of that project.
For example, under Projects 1-1000 the final list would look like this:
Project 1 July 21, 2000
Project 2 May 22, 2017
Project 3 Dec. 3, 2005
etc.
The date being the latest modified date of anything in the folder (Project 1) or any of the many sub-folders within Project 1, and so on for Project 2.
That way our archiving team can delete all projects that are older than a date/year our attorney chooses. The attorney may say Projects 1 and 3 can be deleted based on their age.
There are thousands of projects, which is why they are grouped 1-1000, 1001-2000, etc.
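In case it helps, here’s the shape of what I’m imagining (an untested sketch; I’m assuming the Project folders sit exactly two levels below the starting path, so the \*\* may need another \* depending on the real layout, and the output file name is made up). Measure-Object is my guess at a way around the memory error, since unlike Sort-Object it shouldn’t have to hold every file object before it can hand back the maximum:

$Path = "."
$ExportFolder = "C:\Users\mmsho\Documents"

# Grab the folders two levels down, e.g. Program 1\Projects 1-1000\Project n
Get-Item -Path "$Path\*\*" | Where PsIsContainer | %{
    # Newest LastWriteTime of any file anywhere under this project;
    # Measure-Object streams, so nothing piles up in memory
    $newest = Get-ChildItem -Path $_.FullName -Recurse -File |
        Measure-Object -Property LastWriteTime -Maximum
    [pscustomobject]@{
        Project      = $_.FullName
        LastModified = $newest.Maximum   # empty if the project has no files
    } | Export-Csv -Append -NoTypeInformation "$ExportFolder\ProjectAges.csv"
}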
When you crosspost the same question to different forums at the same time, you should at least post links to the other threads along with your question, so that people willing to help you don’t end up doing work that has already been done elsewhere.