Return date of most recently modified file in each sub-dir

by Wickedemus at 2013-03-15 04:19:59

Hi

I am not a PowerShell coder, so I will be super appreciative if my question is answered fairly simply. I have a folder named Projects, in which there are many folders, each containing project files and subfolders. Once a project has been dormant for over a year, I need to archive that project folder and remove it from my system. I therefore need a script that runs through each project folder and its contents/subfolders, finds the date of the most recently modified file anywhere inside it, and returns just one result per main project folder. If the most recently modified file anywhere within a project folder is over a year old, I know the entire project folder can be archived, and I will do that manually.

Therefore, I need a command to issue in PowerShell to do the dirty work for me and return a list containing the filename, last modified date, and (for completeness' sake) the full path of the most recently modified file sitting anywhere within each project directory. It would also be great to get the list written to a text file, sorted by date (oldest to newest).

Thank you very much in advance for your help.
by poshoholic at 2013-03-15 06:10:04
Hi there,

You’ve come to the right place with your question. Fortunately, the answer to this is not too complicated, so it’s a good first PowerShell problem to start with.

First, I’m writing this post assuming that you are using PowerShell version 3. If you are using PowerShell version 2, then the response will be slightly different.

Suppose your projects folder is C:\Projects. You could get the output you need from PowerShell like this:
# Identify the folder you want to search
$projectsFolder = 'C:\Projects'
# Identify the location of the csv file that will be generated (this will be overwritten each time you run it)
$resultsPath = Join-Path -Path $projectsFolder -ChildPath DormantProjectReport.csv
# Get the subfolders of the folder being searched, including hidden folders (-Force) and find the most recently modified file
# Sort those files by last modification date in descending order, select the Name, LastWriteTime, and FullName properties and write them to the csv file
Get-ChildItem -LiteralPath $projectsFolder -Directory -Force | ForEach-Object {
    # Get all files under each subfolder, sorted by the last modification date in descending order, select the first one (the most recently modified file) and return it
    Get-ChildItem -LiteralPath $_.FullName -File -Recurse -Force | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
} | Sort-Object -Property LastWriteTime -Descending | Select-Object -Property Name,LastWriteTime,FullName | Export-Csv -LiteralPath $resultsPath

The steps here are:
1. Use Get-ChildItem to get the immediate subfolders of the project folder (the first Get-ChildItem call).
2. For each of those subfolders, identify the most recently modified file anywhere under that folder, whether hidden or not (the second Get-ChildItem call piped to Sort-Object and Select-Object).
3. Sort the collection of most recently modified files for all projects in descending order according to their last modification date and export the name, last write time, and full name (path) to a csv file.
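Since the point of the report is to spot projects that have been dormant for over a year, once the csv file exists you could also read it back in and keep just the projects that are ready to archive. Here's an untested sketch along those lines (it reuses the $resultsPath variable from above):
# Read the report back in and keep only the projects whose newest file is more than a year old
Import-Csv -LiteralPath $resultsPath | Where-Object { [datetime]::Parse($_.LastWriteTime) -lt (Get-Date).AddYears(-1) }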

That’s pretty much it. If you have any questions about any part of this that isn’t clear, please let us know.
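One more note: I mentioned above that the answer is slightly different on PowerShell version 2. The -Directory and -File switches on Get-ChildItem only arrived in version 3, so if you do end up staying on version 2 you would filter on the PSIsContainer property instead. Here's a rough, untested sketch of that variant (it uses the same $projectsFolder and $resultsPath variables as before):
# PowerShell v2 variant (untested sketch): filter on PSIsContainer because -Directory and -File are v3-only
Get-ChildItem -LiteralPath $projectsFolder -Force | Where-Object { $_.PSIsContainer } | ForEach-Object {
    # Get all files (not folders) under each project folder and return the most recently modified one
    Get-ChildItem -LiteralPath $_.FullName -Recurse -Force |
        Where-Object { -not $_.PSIsContainer } |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1
} | Sort-Object -Property LastWriteTime -Descending |
    Select-Object -Property Name,LastWriteTime,FullName |
    Export-Csv -Path $resultsPath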
by Wickedemus at 2013-03-22 00:31:53
Hey Kirk

Thanks for the quick reply - that was amazing! I was expecting an email notification of a reply, and never got one. So I decided to come and have a look see, and lo and behold my answer has been waiting here for a week! Some checks reveal that I am using PowerShell v2. I have now upgraded to v3, and will wait for the weekend to restart the server (which is required after upgrading to v3). I will test and let you know if there are any issues/mods needed.

You are a scholar and a gentleman, thanks for everything.
by Wickedemus at 2013-03-24 06:29:05
Hi Kirk

Your script works a treat, thank you. One thing that would be great to add is output, to another text file, of the directories and files whose path length exceeds the maximum limit. An example of the on-screen output when this happens while running the script is as follows:

Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than
260 characters, and the directory name must be less than 248 characters.
At line:1 char:80
+ Get-ChildItem -LiteralPath $projectsFolder -Directory -Force | ForEach-Object {G …
+ ~
+ CategoryInfo : ReadError: (D:\Projects\PA…LATERAL SUPPORT:String) [Get-ChildItem], PathTooLongException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand

Is this possible?

Thanks and regards
by poshoholic at 2013-04-07 09:16:43
Hi again,

Sorry for the slow reply, I missed seeing your last follow-up.

Identifying the folders that are too long in this case is actually pretty easy. You can wrap the Get-ChildItem pipeline in a try/catch block, and catch any exceptions of type [System.IO.PathTooLongException], recording them in a file or handling them as a special case in your catch statement. For example, here is one (still untested) strategy that could work for you:
# Identify the folder you want to search
$projectsFolder = 'C:\Projects'
# Identify the location of the csv file that will be generated (this will be overwritten each time you run it)
$resultsPath = Join-Path -Path $projectsFolder -ChildPath DormantProjectReport.csv
# Identify the file that will be used to track paths that are too long
$longPathLog = Join-Path -Path $projectsFolder -ChildPath LongPaths.txt
# Create/overwrite the long path log file
New-Item -Path $longPathLog -ItemType File -Force > $null
# Get the subfolders of the folder being searched, including hidden folders (-Force) and find the most recently modified file
# Sort those files by last modification date in descending order, select the Name, LastWriteTime, and FullName properties and write them to the csv file
Get-ChildItem -LiteralPath $projectsFolder -Directory -Force | ForEach-Object {
    # Identify the current project folder path
    $currentProjectFolder = $_.FullName
    try {
        # Get all files under each subfolder, sorted by the last modification date in descending order, select the first one (the most recently modified file) and return its name, lastwritetime and fullname properties
        Get-ChildItem -LiteralPath $currentProjectFolder -File -Recurse -Force -ErrorAction Stop | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1 -Property Name,LastWriteTime,FullName
    } catch [System.IO.PathTooLongException] {
        # Record the path that was too long in the long path log file
        $_.TargetObject | Out-File -Append -LiteralPath $longPathLog
        # Write a message to the screen so that it is visible when these long paths are found
        Write-Warning "The following path was too long and could not be processed: $($_.TargetObject)."
    }
} | Sort-Object -Property LastWriteTime -Descending | Export-Csv -LiteralPath $resultsPath

If you want to process them even if the paths are too long, you can use PSDrives to create new drives that point to somewhere deeper in the long path using a custom share (shares are necessary to work around the long path length issue) so that you can still process them with less than 260 character paths, removing the drives you create afterwards. That is more work, involving using WMI to create the shares, New-PSDrive to create a drive mapping to the UNC paths, Remove-PSDrive to remove the paths once done, and WMI to remove the shares when finished. That’s a longer exercise though and I don’t have time to dig into that at the moment. If you need that as the ultimate solution, reply back again and I’ll see if I can help build that out later.
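In case it helps as a starting point, here is a rough, untested sketch of what that workaround might look like for a single long folder. The $deepFolder path, the share name, and the X: drive letter are just placeholders you would substitute with your own values:
# Untested sketch: share a folder deeper in the long path and map a drive to it so paths become short again
$deepFolder = 'D:\Projects\SomeVeryLongProjectFolder'   # placeholder: the deep folder to share
$shareName  = 'LongPathTemp'                            # placeholder: temporary share name
# Create the share via WMI (type 0 = disk drive share; requires admin rights)
$null = ([wmiclass]'Win32_Share').Create($deepFolder, $shareName, 0)
# Map a PowerShell drive to the UNC path of the new share
New-PSDrive -Name X -PSProvider FileSystem -Root "\\$env:COMPUTERNAME\$shareName" > $null
# Process the files through the short X:\ path, e.g. find the most recently modified file
Get-ChildItem -LiteralPath X:\ -File -Recurse -Force |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -First 1 -Property Name,LastWriteTime,FullName
# Clean up: remove the drive and then the share
Remove-PSDrive -Name X
$null = (Get-WmiObject -Class Win32_Share -Filter "Name='$shareName'").Delete()
Keep in mind that the FullName reported this way will be relative to the temporary drive rather than the original path, so you may want to translate it back when reading the results.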