Question

We will be going through a migration shortly, and I need to identify any files/pages that have a title/page name over 200 characters long.

I don't have access to PowerShell, so I'm looking for any alternatives.

No correct solution

Other tips

This is a pretty basic script, but it should get you started. Comments are inline, but feel free to ask any questions:

#Specify creds or use current creds
#$password = Read-Host -Prompt "Password:" -AsSecureString
#$creds = New-Object System.Management.Automation.PSCredential ("domain\user", $password)
#$ws = New-WebServiceProxy -uri "http://sharepointurl.com/_vti_bin/lists.asmx?WSDL" -Credential $creds
#use Windows creds
$ws = New-WebServiceProxy -uri "http://sharepoint-test.com/sites/a-site/_vti_bin/lists.asmx?WSDL" -UseDefaultCredential
#Get a list
$listGuid = $ws.GetList("Documents")
#Build your query
$xml = New-Object System.Xml.XmlDocument
$query = $xml.CreateElement("Query")
#if you need a CAML query add it here:
#$query.InnerXML ="
#   <Where>
#       <Eq>
#           <FieldRef Name='ContentType' />
#           <Value Type='Text'>Document</Value>
#       </Eq>
#   </Where>"
$viewFields = $xml.CreateElement("ViewFields")
$queryOptions = $xml.CreateElement("QueryOptions")
#modify scope if required (this will recurse folders)
$queryOptions.InnerXml = "<ViewAttributes Scope='RecursiveAll' IncludeRootFolder='False' />"
#Row limit. For large lists you will want to page through the results (see the paging sketch below the script). I would normally use a row limit of about 2000, but test your own.
$rows = 10
try {
    #call get items WS here
    $items = $ws.GetListItems($listGuid.name, $null, $query, $viewFields, $rows, $queryOptions, $null)
}
catch {
    $ex = $_.Exception
    #Pull the SharePoint error string out of the SOAP fault detail
    $detail = $ex.InnerException.Detail.errorstring | Select-Object -ExpandProperty '#text'
    Write-Host $detail
}
#Only loop if the call above succeeded
if ($items) {
    foreach ($i in $items.data.row) {
        #This is the file name and its length. For CSV export and the 200-character filter, see the sketch below.
        $i.ows_LinkFilename + "`t" + $i.ows_LinkFilename.Length
    }
}
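
As a follow-up to the row-limit and CSV comments above, here is a minimal sketch of how the same GetListItems call could be paged through a large library and filtered down to names longer than 200 characters. It reuses the $ws, $listGuid, $xml, $query and $viewFields objects from the script above; the 2000 row limit and the C:\temp\longnames.csv output path are placeholder assumptions, so adjust them for your environment.

#A paging-and-filter sketch, assuming the objects defined in the script above
$rowLimit = 2000
$results = @()
$position = $null
do {
    #Rebuild QueryOptions each pass so the paging cookie from the previous page is included
    $pagedOptions = $xml.CreateElement("QueryOptions")
    $pagedOptions.InnerXml = "<ViewAttributes Scope='RecursiveAll' IncludeRootFolder='False' />"
    if ($position) {
        $paging = $xml.CreateElement("Paging")
        $paging.SetAttribute("ListItemCollectionPositionNext", $position)
        [void]$pagedOptions.AppendChild($paging)
    }
    $page = $ws.GetListItems($listGuid.name, $null, $query, $viewFields, $rowLimit, $pagedOptions, $null)
    foreach ($row in $page.data.row) {
        #Keep only items whose file name is longer than 200 characters
        if ($row.ows_LinkFilename.Length -gt 200) {
            $results += [pscustomobject]@{
                FileName = $row.ows_LinkFilename
                Length   = $row.ows_LinkFilename.Length
            }
        }
    }
    #The web service returns a paging cookie until the last page has been read
    $position = $page.data.ListItemCollectionPositionNext
} while ($position)
#Export whatever was found for the migration team
$results | Export-Csv -Path "C:\temp\longnames.csv" -NoTypeInformation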
Licensed under: CC-BY-SA with attribution