One of our NAS shares has grown by 4 TB in 2016, and we would like to see where those files are located to find out where the growth came from. So I would like to scan for files created or edited in 2016 and then sort the results by folder size. We have TreeSize Professional, but I don't see how to do this with that product, so I was hoping someone has an idea or a script that does this.
I suppose you could certainly use PowerShell for this: Get-ChildItem with -Recurse, piped to Where-Object to filter by date, piped to Sort-Object to sort by size.
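A minimal sketch of that pipeline, assuming the share is mounted at D:\ and that "created/edited in 2016" means LastWriteTime (adjust the path and dates to fit):

```powershell
# List files last modified during 2016, largest first.
# -File requires PowerShell 3.0+; on older versions filter with
# Where-Object { -not $_.PSIsContainer } instead.
Get-ChildItem -Path 'D:\' -Recurse -File -ErrorAction SilentlyContinue |
    Where-Object { $_.LastWriteTime -ge [datetime]'2016-01-01' -and
                   $_.LastWriteTime -lt [datetime]'2017-01-01' } |
    Sort-Object -Property Length -Descending |
    Select-Object -Property FullName, Length, LastWriteTime
```

As noted above, this sorts individual files, not folders.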
This isn’t really the place to have someone write out a full script for you, but this should really just be a one-liner. It’s going to be long-running, though - that’s a lot of files to go through. A lot, a lot. And it may eat up a lot of memory compiling the final list, depending on how many files meet your filter criteria. All that said, if you have a start on something and need help with a specific bit, let me know!
But this would be FILES, not FOLDERS. Folders do not have a size. The size of a folder is the sum of its files’ sizes. Windows doesn’t track folder size.
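So to rank folders, you have to sum the matching files' sizes per parent directory yourself. A sketch, with the same assumed D:\ path and 2016 date range:

```powershell
# Sum the sizes of 2016-modified files per containing folder,
# then sort the folders by that total, biggest first.
Get-ChildItem -Path 'D:\' -Recurse -File -ErrorAction SilentlyContinue |
    Where-Object { $_.LastWriteTime.Year -eq 2016 } |
    Group-Object -Property DirectoryName |
    Select-Object -Property @{Name='Folder';Expression={$_.Name}},
                            @{Name='SizeMB';Expression={
                                [math]::Round(($_.Group |
                                    Measure-Object -Property Length -Sum).Sum / 1MB, 2)}} |
    Sort-Object -Property SizeMB -Descending
```

Note this only charges files to the folder they sit in directly; rolling subfolder totals up into their parents would need an extra accumulation pass.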
Thanks, I wasn’t sure if PowerShell could get me this information.
Kelly, to speed this up noticeably you could misuse robocopy. It’s much faster than PowerShell.
$Source = 'D:\'
$Destination = 'C:\_temp' # the directory must exist, but nothing will be copied
$FilePattern = '*.*'
$StartDate = '20160101' # date format YYYYMMDD
$EndDate = '20161231' # date format YYYYMMDD
$collection = robocopy $Source $Destination $FilePattern /MAXAGE:$StartDate /MINAGE:$EndDate /L /BYTES /NJH /NJS /NDL /TS /NC /S /Z /R:0 /W:0
$Output = foreach ($item in $collection) {
    if ($item -match '^\s+(\d+)\s+(\d{4}/\d{2}/\d{2}\s+\d{2}:\d{2}:\d{2})\s+(.+)$') {
        [PSCustomObject]@{
            Size     = [long]$Matches[1]
            DateTime = [DateTime]$Matches[2]
            FullName = $Matches[3]
        }
    }
}
$Output
Now you have the data you need and can use it as you like. For example like this:
$Output | Select-Object -Property Size, DateTime, @{Name='Path';Expression={Split-Path -Path $_.FullName -Parent}}, @{Name='BaseName';Expression={Split-Path -Path $_.FullName -Leaf}}
Now you could group the result, calculate the folder sizes, or whatever else you need.
You can even extend the data collection with special conditions, like a maximum or minimum file size, or anything else robocopy has an option for.
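For instance, a per-folder total. This continues from the $Output produced by the robocopy snippet above (the Size and FullName property names come from that snippet):

```powershell
# Group the parsed robocopy lines by parent folder and sum the sizes.
$Output |
    Group-Object -Property { Split-Path -Path $_.FullName -Parent } |
    Select-Object -Property @{Name='Folder';Expression={$_.Name}},
                            @{Name='SizeGB';Expression={
                                [math]::Round(($_.Group |
                                    Measure-Object -Property Size -Sum).Sum / 1GB, 2)}} |
    Sort-Object -Property SizeGB -Descending
```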
You might need to change the -match pattern to fit your language settings; I created it on a German Windows 7. If you need help with this, just ask here.
I did a blog post about using the quickio.net library.
It was for a different reason (a client had a cryptolocker-type event), but we basically scanned millions of files and it was pretty quick.
You could probably use the script I provide in the post and use where/sort etc. or customize it to your needs.
Personally I don’t like the robocopy route since I ended up with issues regarding special characters in the robocopy output.
But YMMV.
You can find the post here:
"Personally I don't like the robocopy route since I ended up with issues regarding special characters in the robocopy output."
Hmm ... I tested it on my client. It's an English Windows 7 Enterprise with a German MUI, and I didn't have any problems with German umlauts. ;-)
We used a Windows 2012 R2 machine: English system/display language, Swedish keyboard/date settings.
It didn’t matter whether it was any of the Swedish characters or e.g. the Danish/Norwegian “Ø”.
Anyway, we didn’t have to mess with regex until the very last filtering of the extensions, so that was a big plus.