Question

We had some bad practices when setting up folder redirection on our user shares. I now have hundreds of user folders which contain a standard 'Downloads' folder for our Win7 users. Asking (begging, really) our users to purge their old downloaded files has not been successful. A casual estimate shows around 100 GB of data that needs to go, and we could use that space back. Much of the sprawl is due to where Chrome has been dumping temporary data; there are a lot of people with 1-20 copies of the same file, very large / old data, etc. Quotas will be implemented in our next roll-out to help prevent this kind of thing, but they are not viable to implement right now.

I need to go through every user's Downloads folder and purge anything that is older than / hasn't been modified in 30 days. I've seen lots of PowerShell examples for deleting files, but none of them seem to address the structure I'm working with. I'm new to PowerShell, so please pardon my ignorance.

What I need to do is start at D:\userdata\, then navigate a structured series of sub-folders, [site name]\[user name], to get to Downloads, then start purging. There are only 14 sites, but lots of users. For example, if John Doe was in Houston, his path to purge would be D:\userdata\Houston\jdoe\Downloads\.

I also need to make sure this script won't go any deeper in the structure than that; it should not touch D:\userdata\Houston\jdoe\My Documents\Finance\Downloads\, for example. For this reason I can't simply script deleting the contents of any folder named Downloads.

My thought is there may be some way to limit the subfolder depth of the search, or a way to match a wildcard path along the lines of D:\userdata\*\*\Downloads\, or something to that effect.
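Pseudo-code of roughly what I'm picturing, though it's untested and I don't know whether wildcards can actually be used this way:

# Untested guess: match only [site]\[user]\Downloads via a wildcard path,
# then purge files in each match that haven't been modified in 30 days
Get-Item -Path "D:\userdata\*\*\Downloads" | ForEach-Object {
    Get-ChildItem -Path $_.FullName -Recurse -File |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
        Remove-Item -Force
}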

I've toyed with the idea of just doing a logon script that hits this location as mapped to the user. Our boot storms are hard enough on our SAN though (our environment is fully virtualized and centralized), so I'd really like to make this a task I can run from the file server in an after-hours window. I'd be executing as the local SYSTEM account, from Server 2008 R2, to get around all the NTFS permission lockdowns we have in place.
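For reference, my rough plan is to register it as a scheduled task on the file server, something along these lines (the task name, script path, and schedule here are just placeholders):

# Placeholders: task name, script path, and schedule are examples only
schtasks /Create /TN "Purge-UserDownloads" /RU SYSTEM `
    /SC WEEKLY /D SUN /ST 02:00 `
    /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File D:\scripts\Purge-Downloads.ps1"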

Solution

Assuming you don't have a list of sites or users to work from, just enumerate the directories under d:\userdata to get the sites, then the directories under each of those to get the path for each user.

$root = "d:\userdata";
$sites = Get-ChildItem -Path $root -directory;
foreach ($site in $sites) {
    $siteusers = get-childitem -path $site.FullName -Directory;
    foreach ($siteuser in $siteusers) {
        $downloaddir = Join-Path -Path $siteuser.FullName -ChildPath "Downloads";
        Get-ChildItem -Path $downloaddir -Recurse -File | Where-Object{$_.LastWriteTime -lt $(get-date).AddDays(-30)} | Remove-Item -force;
    }
}

Obviously test this against non-production data first.
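For a low-risk first pass, you can swap -Force for -WhatIf on the Remove-Item at the end of the pipeline; it will report what would be deleted without touching anything. The inner pipeline would then look like:

# Dry run: reports each file that would be removed, deletes nothing
Get-ChildItem -Path $downloaddir -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -WhatIf;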


If you already have a list of sites (by directory name), it looks pretty much the same:

$sites = @("Site1","Site2");
$root = "d:\userdata";
foreach ($site in $sites) {
    $SitePath = Join-Path -Path $root -ChildPath $site;
    $siteusers = get-childitem -path $SitePath -Directory;
    foreach ($siteuser in $siteusers) {
        $downloaddir = Join-Path -Path $siteuser.FullName -ChildPath "Downloads";
        Get-ChildItem -Path $downloaddir -Recurse -File | Where-Object{$_.LastWriteTime -lt $(get-date).AddDays(-30)} | Remove-Item -force;
    }
}

You could then run multiple instances of this, each with a different list of site names in $sites.

Another option there would be to collect the site names via get-childitem (into an array/collection), then split it up to have one instance of the script process the even-numbered array elements, and another process the odd ones.
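A rough sketch of that split (the variable names are just illustrative; each half would then be fed into $sites for a separate copy of the script above):

$root = "d:\userdata";
$allSites = @(Get-ChildItem -Path $root -Directory | Select-Object -ExpandProperty Name);

# Split the site names by array index parity
$evenSites = @();
$oddSites  = @();
for ($i = 0; $i -lt $allSites.Count; $i++) {
    if ($i % 2 -eq 0) { $evenSites += $allSites[$i] }
    else              { $oddSites  += $allSites[$i] }
}

# Run one instance of the script with $sites = $evenSites and another with $sites = $oddSites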

OTHER TIPS

My initial thought would be to run Get-ChildItem -Directory to get a general listing of folders (pretty quick results even with thousands of folders due to provider filtering), then run that through a Where-Object that uses a regex to make sure you have the right folder, something like ?{$_.FullName -match "^D:\\userdata\\[^\\]+?\\[^\\]+?\\Downloads$"}, and pipe that into a ForEach that looks for the old files. So, I guess something like:

GCI D:\UserData -Recurse -Directory | ?{$_.FullName -match "^D:\\userdata\\[^\\]+?\\[^\\]+?\\Downloads$"}|%{
    GCI $_.FullName -Recurse -File | ?{$_.LastWriteTime -le (Get-Date).AddDays(-30)}|Remove-Item
}

Never mind my answer, use alroc's, it's better. Leaving mine as an option and reference for future readers.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow