Beginner help for a deletion script

Hi there, I am a non-coder/scripter and have been tasked with writing a deletion script that deletes files from a shared network drive, specified by file path, and collects a log of successes/failures.

Limitations:

  1. I have no super-user access to all of our network drive locations, so I will be getting end users to run individual copies of the script in their domain of access (e.g. Z:\Finance, Z:\Operations etc.)
  2. Staff use a mix of Win 10 and 11.
  3. Files mostly live on network drives that could be mapped to any letter on the individual user's computer.

I need to:

  1. Add the user's particular set of files to a template copy of the script (anywhere between a dozen and thousands),
  2. Share it with them over Teams,
  3. Talk them through executing it; and
  4. Have them send me the log output confirming status of deletions.

I’ve tried writing this myself and just failed again and again. I asked Copilot to help a brother out and it provided some impressive-looking, non-functional code. At my skill level right now, I can’t even begin to identify what’s wrong with it.

What I would like is to be able to paste the following into a script that will capture the output:

Remove-Item -Path D:\test\passports.xlsx -Force
Remove-Item -Path D:\test\medical certificate.pdf -Force
Remove-Item -Path D:\test\sensitive personal information.msg -Force
Remove-Item -Path D:\test\interview notes.docx -Force
Remove-Item -Path D:\test\contract.docx -Force

[some code here that catches success/failure status and writes it to a text file to be saved on their local drive]

I can already hear some questions as to why it would be done this way. I am open to hearing better but here is my rationale for some of the limitations up front:

  • The range of staff I will be asking to run the script is varied and very non-technical. If they could just execute the script without any extra steps, this would be ideal. I don’t want to talk them through saving a CSV of file paths anywhere or making sure the output folder exists.
  • My organisation will not give me access to all the network share domains (maybe they even can’t - it’s managed by a third party). I am expected to go business unit by business unit to get this done.
  • I can’t use any scripting to identify files for deletion. Identification still requires the old-fashioned version of fuzzy logic (human eyeballs) to confirm deletion is OK, so the list needs to be produced first and injected into the script.

I would very much appreciate some ideas to get me going with this; I’m totally stuck. I can delete files en masse, but I cannot log having done it, particularly when it fails due to admin rights or because the specified path and file don’t exist.

Thank you!

Can you expand a bit more on how the targeted files are to be identified? Is the individual user providing that input? Are they to target some folders by name, or are there certain file names they’re looking for?

Thanks grey0ut. Files are identified a couple of different ways: by filename for the most part, but we also have a scanning solution that scans file contents for certain keywords and regex patterns.
High-confidence results are accepted; middle-confidence results are reviewed manually.

Our scanning solution is presenting a lot of false positives at the moment, so the manual review is still key.

Ok, so how does that look for the script then? Is each user’s computer going to have an input file the script uses to get its file list from? Are the users going to be expected to select the files themselves via the script?

I have an idea for one way you could handle logging. I’ll edit this comment when I get to a computer.
EDIT:
OK, assuming $FilesToDelete is an array of strings, with each string representing a file to delete. Let’s use your example:

$FilesToDelete = @(
    "D:\test\passports.xlsx",
    "D:\test\medical certificate.pdf",
    "D:\test\sensitive personal information.msg",
    "D:\test\interview notes.docx",
    "D:\test\contract.docx"
)

Then for your loop do something like this:

$Results = Foreach ($File in $FilesToDelete) {
    try {
        # -ErrorAction Stop makes Remove-Item throw a catchable error on failure
        Remove-Item -Path $File -Force -ErrorAction Stop
        $Removed = $true
    } catch {
        # Inside a catch block, $_ is the error record that was just thrown
        $Failure = $_.Exception.Message
        $Removed = $false
    }
    # One result object per file; the loop collects them all into $Results
    [PSCustomObject]@{
        File    = $File
        Removed = $Removed
        Error   = if (-not($Removed)) {$Failure}
    }
}

$Results | Export-Csv -Path "C:\Path\To\Where\You\Want\Logs\deletion-log.csv" -NoTypeInformation

I didn’t test this, so grain of salt and all that, but the idea is to use a try/catch block to ‘catch’ any errors. Whether it succeeds or fails, the loop spits out a PSCustomObject that we’re capturing in the $Results array so we can output it later.
The “Removed” property will be a simple boolean, so you’ll know whether the file was successfully removed or not. If it fails, the third property, “Error”, should contain whatever the error text was (admin rights, file not found etc).
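Piping $Results to Format-Table would give you something like this (illustrative, not output from an actual run; the “Cannot find path” text is what Remove-Item reports for a missing file):

File                         Removed Error
----                         ------- -----
D:\test\passports.xlsx          True
D:\test\interview notes.docx    False Cannot find path 'D:\test\interview notes.docx' because it does not exist.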
I hope this helps in some way.


I really wanted to avoid an input file, so I could just send them one file to execute. So, like the example I posted, I’d literally have all files marked for deletion as line items in the script, although I am concerned about how well that works if there are thousands of them. I have done as many as 12,000 using a BAT file (literally just del "C:\something.txt", one line per file) and it just smashed through them, so I’m hoping PS would be more or less the same. If that is too inefficient for the script then an input file it is. I’m kind of aware my needs, as I state them, don’t lend themselves to best practice.
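One single-file approach I’m wondering about (a rough, untested sketch; the paths are just my earlier examples) is pasting the whole list into a here-string and letting the script split it into the array your loop expects:

# Untested sketch: paste one path per line between the @' and '@ markers.
# -split turns the pasted block into an array of strings, and
# Where-Object drops any blank lines left over from the paste.
$FilesToDelete = @'
D:\test\passports.xlsx
D:\test\medical certificate.pdf
D:\test\sensitive personal information.msg
D:\test\interview notes.docx
D:\test\contract.docx
'@ -split "`r?`n" | Where-Object { $_.Trim() }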

After trying to solve this a few different ways, I really want to have minimal requirements on the user. Over Teams chats I have watched people struggle to create a temp folder on their C drive to store a log and input file. I don’t want to disparage anyone’s abilities (I am here seeking help after all), but I would like to circumvent the different skill levels of the people involved.

If my own skills were better I would try an approach such as having the input list in a SharePoint library, sharing access with the user, and having the output log emailed to me (this last I’ve seen code for in other languages).
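For the emailing piece, I gather the PowerShell version would look something like the sketch below (untested; the relay smtp.example.org and the addresses are placeholders I’ve made up, and I understand Send-MailMessage is deprecated but still ships with Windows PowerShell 5.1):

# Untested sketch: the relay and addresses below are placeholders
Send-MailMessage -SmtpServer "smtp.example.org" `
    -From "$env:USERNAME@example.org" `
    -To "me@example.org" `
    -Subject "Deletion log from $env:COMPUTERNAME" `
    -Body "Log attached." `
    -Attachments "C:\temp\deletion-log.csv"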

Right now I would settle for the simpler approach of executing a pretty blunt deletion and asking the user to email me the log.

I’m afraid I’m all too aware of what’s possible without having the skills to execute :disappointed:

Why would you send them anything to run? If you have the list of files to delete, they should have no involvement in, or even knowledge of, the script running. You can set up a login script that references a specific list of files to delete. Another option is to run the script through scheduled tasks. You don’t even have to target specific users; you can just set it to run as “Users” and any user on any given computer will run their own instance of the script. Honestly, I think this is a bad idea altogether and I would be pushing for a centralized single point to perform the file deletions. You said you don’t have ‘super admin’, but assuming these shares are stored on servers, there are many different ways you can accomplish your task. Just my 2 cents
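To make the scheduled-task route concrete, registering one could look roughly like this (untested sketch; the task name and script path are placeholders, and registering a task for BUILTIN\Users normally needs admin rights on the machine, so it may hit the same permission wall):

# Untested sketch: run the deletion script at logon as any member of Users
$Action    = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -File "C:\Scripts\Delete-Files.ps1"'
$Trigger   = New-ScheduledTaskTrigger -AtLogOn
$Principal = New-ScheduledTaskPrincipal -GroupId "BUILTIN\Users"
Register-ScheduledTask -TaskName "FileCleanup" -Action $Action -Trigger $Trigger -Principal $Principal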


Can’t disagree, but I’m not part of our ICT department; I can only delete what I personally have permissions to delete. If I need to delete 1000 files from networkdrive:\Operations and I’m not part of that team, I can’t even open that file path, let alone RWXD.

We have certainly pushed for a single centralised point to perform these actions but our SteerCo has declined to sponsor that at this point. It’s a large organisation with a lot of politics and territorial disputes :slight_smile:

There have been negotiations going on with ICT for a vendor solution; it’s been a year and we don’t appear to be close to that. Down below the rarefied air of the decision makers, we are nonetheless expected to carry on, produce results, and report on progress. What we’re hitting with this solution is low-hanging fruit, and there’s plenty of it.

This is not the final state we’re developing with deletion scripts; this is just keeping things warm until stakeholder consensus lets us move forward with a better solution… probably sometime after the second coming, if I reflect on the inertia in our organisation :man_shrugging:

If you’re able to simplify or speed up the work some of your colleagues have to do, you will save your company a lot of money. :point_up:t3: How about talking to the responsible people to get the permissions and resources you need to accomplish this task? :man_shrugging:t3:

Just my 2 cents :wink: :love_you_gesture:t3:

Thank you @grey0ut this worked a treat!

Even better, the code is clear and easy to understand :heart:, I am learning!

I added two more lines to the script: the first line creates the output destination folder if it does not exist (I think the file path or folder name is a bit different in Windows 10) and the last line opens that folder when complete.

This worked fine here on my local computer, any reason why this could be problematic out in the wild?

[IO.Directory]::CreateDirectory("C:\temp")    # does nothing if the folder already exists

$FilesToDelete = @(
    "D:\test\passports.xlsx",
    "D:\test\medical certificate.pdf",
    "D:\test\sensitive personal information.msg",
    "D:\test\interview notes.docx",
    "D:\test\contract.docx"
)

$Results = Foreach ($File in $FilesToDelete) {
    try {
        Remove-Item -Path $File -Force -ErrorAction Stop
        $Removed = $true
    } catch {
        $Failure = $_.Exception.Message
        $Removed = $false
    }
    [PSCustomObject]@{
        File    = $File
        Removed = $Removed
        Error   = if (-not($Removed)) {$Failure}
    }
}

$Results | Export-Csv -Path "C:\temp\deletion-log.csv" -NoTypeInformation

ii C:\temp

Once again many thanks!

The additions look like they should work. I would avoid using aliases like “ii” in production code because it’s difficult to read. This one is a good example, because I had no idea what it was; I popped open PowerShell and typed Get-Alias 'ii' to see that it’s an alias for Invoke-Item.
The only issue I can see is if for some reason the original creation of the “C:\Temp” folder fails, the logs will fail to export there, the script execution will complete, and the logs will be lost forever.
You might consider using a folder that’s likely already present, depending on your environment. The Desktop or Documents folder maybe?

$DestinationDir = Join-Path -Path $Home -ChildPath "Documents"

Then for your export line

$LogFile = Join-Path -Path $DestinationDir -ChildPath "deletion-log.csv"
$Results | Export-Csv -Path $LogFile -NoTypeInformation

If you take a look at the automatic variable $HOME on your machine you’ll see that it’s the home folder for the current user. If your user is running the script, then it’ll be their home folder and they’ll for sure have rights to it.
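For example, on a hypothetical machine for a user named jsmith:

PS> $HOME
C:\Users\jsmith
PS> Join-Path -Path $HOME -ChildPath "Documents"
C:\Users\jsmith\Documents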
