Emptying the SharePoint Recycle Bin
09-12-2019
Question
Our SharePoint 2007 installation has an exceptionally large first-stage Recycle Bin. A while back I wrote a PowerShell script to delete old items; it's posted below, just for fun.
Update 2:
We have 42,500,000+ records in the RecycleBin table of our content database!!
I discovered that BDLC jobs scheduled before I joined the team had pumped so much data into the RecycleBin table that even SharePoint can no longer manage it without timing out. Even our scheduled script can only delete 1,000 records every 30 minutes. Do the math, then feel sad for me. With this many records, you can't even enforce quotas now without SharePoint timing out. It looks like we'll have to freeze the deletion of records in BDLC.
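(To save you the math: 42,500,000 records ÷ 1,000 per run = 42,500 runs, and at one run every 30 minutes that is roughly 1,275,000 minutes, or about two and a half years of non-stop deleting.)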
Update 1: I ran my PowerShell script to delete each item while running SQL Profiler to see what was executing; proc_DeleteRecycleBinItem touches multiple tables, so we'll stick with the scheduled PowerShell script that runs and reports every N minutes.
We have 38,500,000+ records in the RecycleBin table of our content database!!
I believe it has grown this large because we use BDLC from Layer2 to sync data from other systems, and it recycles the records it deletes. It has gotten so big that even the native SharePoint timer job can't keep it under control and times out...
I know I'll be shot by the SharePoint mafia, but... has anyone ever actually deleted rows from the RecycleBin table in a SharePoint content database? Ours is 15,430,792 KB (14.7 GB).
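Before touching anything, you can at least watch the number grow: a read-only count against the content database doesn't modify it. A minimal sketch, assuming a hypothetical SQL server named SQL01 and a content database named WSS_Content; adjust both to your farm:
# Read-only row count of the RecycleBin table (hypothetical server/database names)
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=SQL01;Database=WSS_Content;Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
# NOLOCK so the count doesn't block SharePoint's own access to a huge table
$cmd.CommandText = "SELECT COUNT_BIG(*) FROM dbo.RecycleBin WITH (NOLOCK)"
Write-Host ("RecycleBin rows: " + $cmd.ExecuteScalar())
$conn.Close()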
I know Microsoft will drop support if you modify the content database. Please don't post that as the answer; of course it isn't best practice. The question is obvious: has anyone actually tried it?...
I'm just looking for a faster way to empty the Recycle Bin. You can see my script below, so clearly I've been trying to build some maintenance around this. The bin has grown so large that the script takes forever to run, simply because there is so much data.
Our user data is actually 203,790,168 KB (194.3 GB), so I also wrote a PS script to find large files so I can manage that size as well. That one is posted below too, contributed back to the ether.
Also, we have BDLC (a 3rd-party tool from Layer2) syncing data to SQL. I believe this job deletes large amounts of data on a regular basis, which is what made the RecycleBin table huge; a sketch of the mechanism follows.
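The distinction matters because, in the object model, a plain Delete() bypasses the recycle bin while Recycle() feeds it, and the latter is presumably what BDLC does. A toy illustration (the list name and item index are hypothetical):
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$site = New-Object Microsoft.SharePoint.SPSite("http://my-site.gov")
$web = $site.OpenWeb()
$item = $web.Lists["SomeList"].Items[0]  # hypothetical list and item
$item.Recycle()   # sends the item to the first-stage recycle bin
# $item.Delete()  # would remove it outright, bypassing the recycle bin
$web.Dispose()
$site.Dispose()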
I guess the answer to my own question is probably... more or less... schedule a task to run the maintenance script I've already written... hmm...
function RemoveOldItems-SPFirstStageRecycleBin([string]$url, [int]$rowlimit, [int]$days)
{
    $siteCollection = New-Object Microsoft.SharePoint.SPSite($url)
    # Query the first-stage recycle bin, oldest deletions first
    $recycleQuery = New-Object Microsoft.SharePoint.SPRecycleBinQuery
    $recycleQuery.ItemState = [Microsoft.SharePoint.SPRecycleBinItemState]::FirstStageRecycleBin
    $recycleQuery.OrderBy = [Microsoft.SharePoint.SPRecycleBinOrderBy]::DeletedDate
    $recycleQuery.RowLimit = $rowlimit
    $recycledItems = $siteCollection.GetRecycleBinItems($recycleQuery)
    $count = $recycledItems.Count
    # Walk backwards so deletions don't shift the indexes of items not yet visited
    for ($i = $count - 1; $i -ge 0; $i--) {
        $age = ((Get-Date) - $recycledItems[$i].DeletedDate).Days
        if ($age -gt $days) {
            $g = New-Object System.Guid($recycledItems[$i].ID)
            $recycledItems.Delete($g)
        }
    }
    $siteCollection.Dispose()
}
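A sketch of how this might be invoked (the thresholds are illustrative; the assembly load is only needed when running outside a SharePoint-aware console):
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
# Purge recycled items older than 30 days, scanning 1,000 at a time
RemoveOldItems-SPFirstStageRecycleBin -url "http://my-site.gov" -rowlimit 1000 -days 30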
function Get-DocInventory() {
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    foreach ($spService in $farm.Services) {
        # Only SPWebService instances host content web applications
        if (!($spService -is [Microsoft.SharePoint.Administration.SPWebService])) { continue }
        foreach ($webApp in $spService.WebApplications) {
            # Skip the Central Administration web application
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication]) { continue }
            foreach ($site in $webApp.Sites) {
                foreach ($web in $site.AllWebs) {
                    foreach ($list in $web.Lists) {
                        if ($list.BaseType -ne "DocumentLibrary") { continue }
                        foreach ($item in $list.Items) {
                            $data = @{
                                "Web Application" = $webApp.ToString()
                                "Site" = $site.Url
                                "Web" = $web.Url
                                "List" = $list.Title
                                "Item ID" = $item.ID
                                "Item URL" = $item.Url
                                "Item Title" = $item.Title
                                "Item Created" = $item["Created"]
                                "Item Modified" = $item["Modified"]
                                "Size (kb)" = $item.File.Length/1KB
                                "Size (gb)" = $item.File.Length/1GB
                            }
                            Write-Host $item.Url -ForegroundColor DarkGray
                            # Only report files larger than 100 MB
                            if ($item.File.Length -gt 100MB) {
                                Write-Host ($site.Url + $item.Url) -ForegroundColor Red
                                New-Object PSObject -Property $data
                            }
                        }
                    }
                    $web.Dispose()
                }
                $site.Dispose()
            }
        }
    }
}
#Get-DocInventory | Out-GridView
Get-DocInventory | Export-Csv -NoTypeInformation -Path D:\Logs\inventory.csv
$jobName = "Get Large Lists"
function Get-LargeLists() {
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
#create stop watch
[System.Diagnostics.Stopwatch] $sw;
$sw = New-Object System.Diagnostics.Stopwatch
$sw.Start()
$lists = @()
$reportSiteUrl = "http://my-site.gov/sites/applications/reporting/"
$reportListUrl = $reportSiteUrl + "Lists/Large%20Lists/"
$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    foreach ($spService in $farm.Services) {
        if (!($spService -is [Microsoft.SharePoint.Administration.SPWebService])) { continue }
        foreach ($webApp in $spService.WebApplications) {
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication]) { continue }
            foreach ($site in $webApp.Sites) {
                foreach ($web in $site.AllWebs) {
                    foreach ($list in $web.Lists) {
                        # Only include lists with 100 items or more
                        if ($list.ItemCount -le 99) { continue }
                        # Collect the list's key properties into a report object
                        $o = New-Object Object
                        Add-Member -InputObject $o -MemberType NoteProperty -Name SiteCollectionUrl -Value $list.ParentWeb.Site.RootWeb.Url
                        Add-Member -InputObject $o -MemberType NoteProperty -Name ListURL -Value ($list.ParentWeb.Url + "/" + $list.RootFolder.Url)
                        Add-Member -InputObject $o -MemberType NoteProperty -Name Title -Value $list.Title
                        Add-Member -InputObject $o -MemberType NoteProperty -Name Description -Value $list.Description
                        Add-Member -InputObject $o -MemberType NoteProperty -Name ItemCount -Value $list.ItemCount
                        # Add the object to the $lists array
                        $lists += $o
                    }
                    $web.Dispose()
                }
                $site.Dispose()
            }
        }
    }
    # Export the array to CSV
    $lists | Export-Csv D:\Logs\large_lists.csv -NoTypeInformation -Force
    # Connect to the reporting site and list
    $s = New-Object Microsoft.SharePoint.SPSite($reportSiteUrl)
    $w = $s.OpenWeb()
    $l = $w.GetList($reportListUrl)
    # Clear the SharePoint list before re-populating it
    $query = New-Object Microsoft.SharePoint.SPQuery
    $query.ViewAttributes = "Scope='Recursive'"
    $query.Query = ""
    $items = $l.GetItems($query)
    $items | % { $l.GetItemById($_.Id).Delete() }
    # Write the report rows to the SharePoint list
    $lists | ForEach-Object {
        $item = $l.Items.Add()
        $item["Title"] = $_.Title
        $item["SiteCollectionUrl"] = $_.SiteCollectionUrl
        $u = New-Object Microsoft.SharePoint.SPFieldUrlValue
        $u.Description = "Link"
        $u.Url = $_.ListURL
        $item["URL"] = $u
        $item["Description"] = $_.Description
        $item["Count"] = $_.ItemCount
        $item.Update()
    }
    $w.Dispose()
    $s.Dispose()
    # Stop the timer and log the run
    $sw.Stop()
    # TotalSeconds rather than Seconds, so runs longer than a minute report correctly
    C:\_scripts\sp_log.ps1 -jobName $jobName -message "Reported large lists on SharePoint Farm." -type "Success" -duration $sw.Elapsed.TotalSeconds -linkTitle "Link" -linkUrl "http://my-site.gov/sites/applications/reporting/Lists/Large%20Lists/"
}
# Catch exceptions and log them; $sw is function-scoped and won't exist here if the function threw, so guard it
trap [Exception]{
    $duration = 0
    if ($sw) { $duration = $sw.Elapsed.TotalSeconds }
    C:\_scripts\sp_log.ps1 -jobName $jobName -message $_.Exception.Message -type "Error" -duration $duration -linkTitle "Link" -linkUrl "http://my-site.gov/sites/applications/reporting/Lists/Large%20Lists/"
}
Get-LargeLists
Solution
I ran my PowerShell script to delete each item while running SQL Profiler to see what was executing; proc_DeleteRecycleBinItem touches multiple tables, so we'll stick with the scheduled PowerShell script that runs and reports every N minutes.
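A minimal sketch of that schedule, assuming the cleanup function above is saved in a hypothetical wrapper script at C:\_scripts\clean_recyclebin.ps1 and should run every 30 minutes:
# Register a task that runs the cleanup every 30 minutes under the SYSTEM account
schtasks.exe /Create /TN "SP Recycle Bin Cleanup" /TR "powershell.exe -File C:\_scripts\clean_recyclebin.ps1" /SC MINUTE /MO 30 /RU SYSTEM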