Gradual Site Collection Deletion

I had a mission to achieve overnight: move 28 site collections into a new managed path, and rename the 28 associated content databases. Pretty straightforward to do, and to script in advance to run in stages (a sketch of the key cmdlets follows the list):

  1. Backup site collections
  2. Delete site collections
  3. Dismount content databases
  4. Rename content databases and shuffle around underlying storage including FILESTREAM RBS location
  5. Create the new managed path; reset IIS
  6. Remount 28 content DBs
  7. Delete the old managed path
  8. Restore the 28 site collections (this is where tinkling glass is heard, followed by painful silence)
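Here’s a minimal sketch of those stages for a single site collection; the server, path, and database names are placeholders, and the real script looped over all 28:

Backup-SPSite "http://sharepoint/div/clm/int/A/" -Path "E:\Backups\A.bak"
Remove-SPSite "http://sharepoint/div/clm/int/A/" -Confirm:$false -GradualDelete
Dismount-SPContentDatabase "Content_mydivision_a"
# (rename the database and shuffle the underlying storage in SQL Server here)
New-SPManagedPath "newpath" -WebApplication "http://sharepoint/"
iisreset
Mount-SPContentDatabase "Content_mydivision_a" -DatabaseServer "SQL01" -WebApplication "http://sharepoint/"
Remove-SPManagedPath "div" -WebApplication "http://sharepoint/" -Confirm:$false
Restore-SPSite "http://sharepoint/newpath/clm/int/A/" -Path "E:\Backups\A.bak" -ContentDatabase "Content_mydivision_a"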

After enough jolt cola to get that turtle to beat the hare fair and square, I got to the bottom of the problem. First, the problem: the site collections could not be restored into their original database, as each site collection purportedly already existed there, even though it was deleted.

By midnight I gave up, and did the 28 Restore-SPSite operations into random content databases, tossing organization and structure to the winds (temporarily), knowing I had gads of available storage, and knowing that once I got to the bottom of the issue, a simple set of Move-SPSite commands would set things right. No, I don’t like randomness, but I also like a few winks of sleep and happy users…

Now the cause. Since SharePoint 2010 SP1 (and continuing into SharePoint 2013), SharePoint has the ability to recover deleted site collections (not through the UI, but only through PowerShell). I used the -GradualDelete option, thinking I would be nice and gentle with a production farm. Here’s a sample of my delete commands, where I also disable prompting:

Remove-SPSite "http://sharepoint/div/clm/int/A/" -Confirm:$false -GradualDelete

Here’s the kicker: after the delete, the site collection is indeed still there. It sticks around for the duration of the Recycle Bin retention period (default 30 days). There’s one good way to see; let’s dive into the forbidden content database and have a peek:

SELECT [DeletionTime]
      ,[Id]
      ,[SiteId]
      ,[InDeletion]
      ,[Restorable]
FROM [Content_mydivision_a].[dbo].[SiteDeletion]
WHERE (Restorable = 1)

Restorable=1 indicates this site collection could be restored.

The solution? Well, it’s not the Recycle Bin job; that has no effect on this. There is a Gradual Delete job at the web application level, but that won’t help us either, at least not just yet. First you have to use the Remove-SPDeletedSite cmdlet to remove each site permanently. Here’s the syntax:

Remove-SPDeletedSite -Identity f5f7639d-536f-4f76-8f94-57834d177a99 -Confirm:$false

Ah, you don’t know your Site Collection GUIDs by heart? Well, me neither; I prefer a more useful allocation of brain cells. So here’s the command that will give you the Site Collection GUIDs that have been (partially) deleted:

Get-SPDeletedSite -WebApplication "http://sharepoint/"

So, you’ve got your partially deleted GUIDs, and you diligently did a Remove-SPDeletedSite for each, but Restore-SPSite still will not work. Now’s the time to run the handy-dandy Gradual Delete timer job for your web application, in Central Admin under Monitoring. The first thing you might notice is that the job takes a bit of time to run. That’s good; it’s doing something for you, and actually triggering the content DB stored procedure called proc_DeleteSiteCoreAsync, which deletes in batches.
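You can also trigger the job from PowerShell instead of Central Admin. A small sketch, assuming the job’s internal name is job-site-deletion (the Gradual Site Delete job):

$wa = Get-SPWebApplication "http://sharepoint/"
$job = Get-SPTimerJob -WebApplication $wa | Where-Object { $_.Name -eq "job-site-deletion" }
Start-SPTimerJob $job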

Here’s how to wipe out all these mildly annoying site collections from your recycle bin for a web application:

get-spdeletedsite -webapplication "http://my-srv-sp10/" | Remove-SPDeletedSite

At this point your Restore-SPSite will work to your target content database, and if you played SharePoint Roulette like me and restored to a random location, a Move-SPSite will make fast work of putting things where they should be.
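For reference, a Move-SPSite sketch; the URL and database name here are examples:

Move-SPSite "http://sharepoint/newpath/clm/int/A/" -DestinationDatabase "Content_mydivision_a"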

More information on the Gradual Deletion timer job can be found in a TechNet article by Bill Baer.

Copying folder hierarchy with date filter and elimination of empty folders

Ever need to copy a tree of folders? I had to do this only for files older than a specified date. Plus, empty folders were not welcome. How to go about this?

We’ll approach this in four easy steps. First we’ll set the source, destination and threshold date, followed by recreating the empty folder structure on the target:

$sourceLocation = "\\ny-srv-fs3\Reserve"
$DestLocation = "D:\plautj\ACTTMP"
$thresh = Get-Date "December 31, 2006"
xcopy $sourceLocation $DestLocation /T

Next we’ll get the full set of files and folders:

$q = Get-ChildItem $sourceLocation  -Recurse

We will now copy all the files older than the threshold:

foreach ($qItem in $q)
{
    if (!$qItem.PSIsContainer)
    {
        if ($qItem.LastWriteTime -lt $thresh)
        {
            $DestItemLoc = $qItem.FullName.Replace($sourceLocation, $DestLocation)
            Copy-Item $qItem.FullName $DestItemLoc
        }
    }
}

Lastly, let’s delete empty folders. The key is specifying the AllDirectories search option; otherwise it will delete folders that are devoid of immediate files but which have files in subfolders:

$a = Get-ChildItem $DestLocation -Recurse | Where-Object {$_.PSIsContainer -eq $True}
$a | Where-Object {$_.GetFiles("*", [System.IO.SearchOption]::AllDirectories).Count -lt 1} |
    Select-Object FullName | ForEach-Object {Remove-Item $_.FullName -Recurse}

Extracting to a CSV the list of files in folders

It’s easy to extract the set of files in all folders, here’s how using PowerShell:

$Files = Get-ChildItem "C:\ChooseAnyFolder" -Recurse
$Files | Select-Object Name | ConvertTo-Csv -NoTypeInformation | Out-File "C:\temp\reportFile.csv"

Feel free to select your choice of fields within the pipeline.
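For example, a variant that captures a few more fields; pick whichever properties you need:

$Files | Select-Object Name, Length, LastWriteTime, FullName | ConvertTo-Csv -NoTypeInformation | Out-File "C:\temp\reportFile.csv"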

Uploading pictures into Active Directory remotely via PowerShell

One can upload pictures into Active Directory remotely via PowerShell. A few requirements:
  1. Install RSAT; that’s adding the Feature in Server Manager
  2. Have Active Directory Web Services running on a domain controller
  3. Have sufficient access; you must be a member of either the Organization Management or Recipient Management role group to upload the pictures.
  4. Have a set of photos, preferably named for the user accounts, as JPGs, 96×96 and less than 10 KB each

Here’s what’s needed in PowerShell to get started using AD CmdLets:

if ((Get-Module -Name ActiveDirectory -ErrorAction SilentlyContinue) -eq $null) {
    Import-Module ActiveDirectory
}

Here’s how to see the existing picture for a given user:

$user = Get-ADUser [user] -Properties thumbnailphoto
$user.thumbnailphoto.Length
$user.thumbnailphoto | Set-Content "C:\temp\Jtest3a.jpg" -Encoding byte -Force

Testing in Outlook: Pictures are cached in Outlook for the duration of the session. To check whether the picture is available, one has to totally exit Outlook.

Let’s get all the files to process:

$LocalFiles = Get-ChildItem -Path $WorkingPath -Filter $TypeFilter | Where-Object {!$_.PSIsContainer}

To upload the picture, get the user:

$User = Get-ADUser -Filter {SamAccountName -eq $Name}

Let’s get the photo, which has already been resized to 96×96 and kept under 10 KB, and apply it:

$Photo = [byte[]](Get-Content "$WorkingPath\$File" -Encoding byte)
Set-ADUser $Name -Replace @{thumbnailPhoto=$Photo}
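Tying the fragments together, a sketch of the full loop, assuming $WorkingPath and $TypeFilter are set and each file is named for its user account:

foreach ($File in $LocalFiles)
{
    # Derive the account name from the file name, e.g. jsmith.jpg -> jsmith
    $Name = [System.IO.Path]::GetFileNameWithoutExtension($File.Name)
    $User = Get-ADUser -Filter {SamAccountName -eq $Name}
    if ($User -ne $null)
    {
        $Photo = [byte[]](Get-Content $File.FullName -Encoding byte)
        Set-ADUser $Name -Replace @{thumbnailPhoto=$Photo}
    }
}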

Granting myself site collection admin access

Sometimes one needs to grant oneself site collection admin access. Here are two kinds of site collection admin assignments:

$meStr = "domain\administrator";
$w = Get-SPWeb "http://subdomain.domain.com/sites/Projects"
$me = $w.EnsureUser($meStr)
$me.IsSiteAdmin
$me.IsSiteAdmin = $true
$me.Update()

Or set the primary or secondary owner:

Get-SPSite -Limit All | ?{$_.url -notlike "*/Departments/"} | %{Set-SPSite $_ -OwnerAlias "<domain\user>" -SecondaryOwnerAlias "<domain\user>"}

Don’t forget to pipe your SPContentDatabases, and even all SPDatabases, into the Add-SPShellAdmin cmdlet to ensure you are granted DB access.
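A sketch of that, reusing the same account as above:

Get-SPDatabase | ForEach-Object { Add-SPShellAdmin -Database $_ -UserName "domain\administrator" }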

Report on RBS Configuration by Content Database

Here’s a simple script that will report on the RBS configuration across all your content DBs. It’s useful when you want to lower the minimum blob threshold size for your largest DBs. Just remember to call Migrate() from PowerShell to force the movement of documents into or out of RBS, and remember that call can take a while to run.

Get-SPContentDatabase | ForEach-Object {
    $_;
    try {
        $rbs = $_.RemoteBlobStorageSettings;
        Write-Host "Provider Name=$($rbs.GetProviderNames())";
        Write-Host "Enabled=$($rbs.Enabled)";
        Write-Host "Min Blob Size=$($rbs.MinimumBlobStorageSize)"
    }
    catch { Write-Host -ForegroundColor red "RBS not installed on this database!`n" }
    finally { Write-Host "------------------------------------------------------------------`n" }
}
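And a sketch of lowering the threshold on a single database and then migrating; the database name and the 1 MB threshold are just examples:

$db = Get-SPContentDatabase "Content_mydivision_a"
$rbs = $db.RemoteBlobStorageSettings
$rbs.MinimumBlobStorageSize = 1048576   # bytes; documents under 1 MB stay in SQL
$db.Update()
$rbs.Migrate()   # moves documents in or out of RBS; can run a long time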

Content Type Summary Report

Sometimes I get challenged with questions as to which fields are used in which Content Types. All too often I need to quickly know the internal names of fields used in Content Types. I wrote a script that generates a CSV that can easily be pivoted in Excel for answering such questions. I’m a huge fan of using a Content Type Syndication Hub; with all the Content Types in one location, this report becomes very useful.

$rootwebname = "http://sharepoint"
$rootweb = Get-SPWeb $rootwebname
$MyCTSummaryCSV = "L:\CTSummary.CSV"
Add-Content $MyCTSummaryCSV "CT Name,CT Group,Parent CT,CT Read-Only,CT Hidden,Field Internal Name,Field Title,Field Type,ShowInDisplayForm,ShowInEditForm,ShowInNewForm"
$CTs = $rootweb.ContentTypes
for ($i = 0; $i -lt $CTs.Count; $i++)
{
    $CT = $CTs[$i];
    $CTName = $CT.Name;
    $Fields = $CT.Fields;
    for ($j = 0; $j -lt $Fields.Count; $j++)
    {
        $Field = $Fields[$j];
        $OutStr = "$($CTName),$($CT.Group),$($CT.Parent.Name),$($CT.ReadOnly),$($CT.Hidden),$($Field.StaticName),$($Field.Title),$($Field.Type),$($Field.ShowInDisplayForm),$($Field.ShowInEditForm),$($Field.ShowInNewForm)"
        Write-Host "." -NoNewline
        Add-Content $MyCTSummaryCSV $OutStr
    }
}

It’s easy to then import this into an Excel file and Pivot away.

Let’s take it one step further and report on which Content Types and fields are in use within each Content Type enabled library in every web of a Site Collection:

$rootwebname = "http://sharepointdev/div/inv"
$rootweb = Get-SPWeb $rootwebname
$MyCTSummaryCSV = "C:\Users\plautj\Documents\PowerShell\INVCTSummary.CSV"
Add-Content $MyCTSummaryCSV "web,lib,CT Name,CT Group,Parent CT,CT Read-Only,CT Hidden,Field Internal Name,Field Title,Field Type,ShowInDisplayForm,ShowInEditForm,ShowInNewForm"
$site = Get-SPSite $rootwebname
$webs = $site | Get-SPWeb -Limit all
foreach ($web in $webs)
{
    $libs = $web.Lists;
    foreach ($lib in $libs)
    {
        if ($lib.ContentTypesEnabled)
        {
            $CTs = $lib.ContentTypes;
            for ($i = 0; $i -lt $CTs.Count; $i++)
            {
                $CT = $CTs[$i];
                $CTName = $CT.Name;
                $Fields = $CT.Fields;
                for ($j = 0; $j -lt $Fields.Count; $j++)
                {
                    $Field = $Fields[$j];
                    $OutStr = "$($web.Title),$($lib.Title),$($CTName),$($CT.Group),$($CT.Parent.Name),$($CT.ReadOnly),$($CT.Hidden),$($Field.StaticName),$($Field.Title),$($Field.Type),$($Field.ShowInDisplayForm),$($Field.ShowInEditForm),$($Field.ShowInNewForm)"
                    Write-Host "." -NoNewline
                    Add-Content $MyCTSummaryCSV $OutStr
                }
            }
        }
    }
}

Set Portal link for navigating above a Site Collection

One of the annoyances in SharePoint is that there is no easy way for end users to navigate outside the site collection. Creating a Portal Connection is one easy way; however, configuring it is manual. Here’s a way to automate this connection.

First, we get the Web Application object, then use it to iterate through all the Site Collection objects. The -Limit All ensures we process all of them; otherwise it defaults to the top 20. Then we set the portal name and link. Note that no Update() is required if you use the setter methods below; if you instead update the properties directly, an object Update() is required.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# Script sets the Portal Connection for each site collection - Joel Plaut
$WebApp = Get-SPWebApplication "http://sharepoint";
$SA = $WebApp | Get-SPSite -Limit all
foreach ($MySite in $SA)
{
    Write-Host "Fixing Portal for $($MySite.Url)" -ForegroundColor darkred
    $MySite.set_PortalName("Home")
    $MySite.set_PortalUrl("http://sharepoint")
}

Automated Consolidation of Application Pools

SharePoint leverages IIS and runs within Application Pools. One should recognize up front that there are two distinct categories of Application Pools used in SharePoint: Web Application pools and Service Application pools.

Overview

Application Pools consume an estimated 80-100 MB of RAM each, and possibly a lot more, depending on usage. These appear as w3wp.exe processes in Task Manager. When you have a number of w3wp processes running, it can be hard to tell them apart; which one serves a given web or service application? Here’s a way to get the PID (Process ID) for each worker process, along with its user-friendly name, so you can correlate each w3wp process:

c:
cd C:\Windows\System32\inetsrv
.\appcmd.exe list wp
A nice listing of Web Application pools is generated by this single command. It ensures the fields are not truncated, and is extensible to allow display of any properties/columns you wish:

Get-SPWebApplication | Select-Object DisplayName, Url, ApplicationPool | Format-Table -AutoSize | Out-String -Width 2000
Note the cmdlet Get-SPWebApplication. For Service Application pools, the cmdlet is Get-SPServiceApplicationPool, as in:

Get-SPServiceApplicationPool | Select-Object Id, Name, DisplayName, ProcessAccountName
Within IIS, the Service Application Pools are identified by GUID.  Their mapping can be explored individually by examining the application pool binding, but this is a bit laborious.
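As a shortcut, you can list the IIS application pools (including the GUID-named ones) and match them by hand against the Ids returned by Get-SPServiceApplicationPool. A sketch, assuming the WebAdministration module is available:

Import-Module WebAdministration
Get-ChildItem IIS:\AppPools | Select-Object Name, State, @{n="Identity";e={$_.processModel.userName}}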
Removing a Service Application Pool
To remove a Service Application Pool by name, you can use:
Remove-SPServiceApplicationPool -Identity "Your Orphaned SharePoint Service Application Pool Name"
My own preference is to first consolidate the application pools, then quiesce the unwanted application pools in IIS, and only once things are truly running smoothly remove them. It is important to do the removing and adding in PowerShell and not directly in IIS; this ensures that the correct IIS configuration gets propagated to all current and future WFEs (Web Front Ends).
Consolidating Web Application Pools programmatically
Consolidating Web Application pools is quite easy. If you do not have security-driven segregation of Application Pools, you can consider doing so. Note I have received conflicting advice on doing this: Todd Klindt, whom I hold in the highest regard, recommends considering consolidation; Microsoft advises quite a low maximum number of Application Pools, yet their support staff have advised segregation.
First let's grab an existing Application Pool, and simply assign it to the target Web Application:
$sourceWebAppPool = (Get-SPWebApplication <URL of a webapp whose application pool you want to use>).ApplicationPool
$webApp = Get-SPWebApplication <URL of the web application you want to change>
$webApp.ApplicationPool = $sourceWebAppPool
$webApp.ProvisionGlobally()
$webApp.Update()
iisreset

Lather, rinse, and repeat for each of your Web Apps to be consolidated…

Note that there is no SharePoint cmdlet for creating an Application Pool. You can use the IIS cmdlet, but I am not convinced this is a safe method: right away I can see the service account is an IIS service identity reference, and the resulting application pool has a different type and cannot be assigned to a web application directly. Here’s the cmdlet for reference:

Import-Module WebAdministration
$appPool = New-WebAppPool "My new App Pool"
If you need to segregate previously consolidated web application pools, the following round-about procedure is safe and works (a sketch follows the list):

  1. Create a brand new temporary Web Application and associated pool
  2. Reassign your target web app’s pool, as described above
  3. Destroy the temporary Web Application

The sequence is key.  If you destroy the temporary web application, the associated pool is destroyed with it, because there are no other associated applications.  In contrast, once you assign a second web application to this new application pool, the application pool will not be destroyed when the temporary web application is removed.
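A sketch of that procedure; the name, port, account, and URLs are placeholders:

# 1. Create a temporary Web Application with a brand new pool
$tmp = New-SPWebApplication -Name "TempSegregation" -Port 9999 -ApplicationPool "SegregatedAppPool" -ApplicationPoolAccount (Get-SPManagedAccount "domain\spservice")

# 2. Point the target web application at the new pool
$webApp = Get-SPWebApplication "http://sharepoint"
$webApp.ApplicationPool = $tmp.ApplicationPool
$webApp.ProvisionGlobally()
$webApp.Update()

# 3. Destroy the temporary Web Application; the pool survives because the target web app now uses it
Remove-SPWebApplication $tmp -Confirm:$false -DeleteIISSite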

Consolidating Service Application Pools programmatically

On some farms I’ve inherited, there is a profusion of unnecessary Service Application Pools.  Note there are reasons to isolate distinct service application pools, generally around security and multi-tenancy.

I wanted to consolidate my Service Application pools, but in a safe and programmatic manner. While you can change the account associated with a service application, there is the risk that the new service account won’t quite have the necessary access; and while SharePoint automatically grants access to the new service account, it won’t remove access from the original one. One also needs to make sure the new service account is configured as a managed account. Lastly, SharePoint doesn’t support managed accounts for all service applications. Exceptions include the User Profile Service ADSync account, unattended user accounts (Excel, Visio, PerformancePoint), search crawl accounts (Foundation, Enterprise, and FAST), and the two Object Cache Portal accounts (which are configured for each Web Application for performance reasons).

Not every Service Application runs under its own Application Pool; some run under the Farm’s pool, and hence can’t be directly reassigned. Farm-wide service applications have one and only one instance in the farm. So the Security Token Service Application, Usage Application (WSS_UsageApplication), State Service, and Application Registry Service Application don’t run under their own Application Pools, and their cmdlets are simpler. While one can create multiple Session State Service Applications and corresponding databases, there’s only one State Service Application. The script below lists these by name in an array and makes sure not to even try to remap them. For good measure I include the FAST Content Service Application in this category; note yours may be named differently.

Next, I wanted to start with a clean Service Application Pool whose name clearly denotes the desired service account.

$MyRefAppPool = New-SPServiceApplicationPool -Name "SPService Service Application Pool" -Account "YourDomain\spservice"
$SPaps = Get-SPServiceApplication
for ($i = 0; $i -lt $SPaps.Count; $i++)
{
    $SPap = $SPaps[$i];
    # Some service applications run at the farm level and don't have selectable application pools, hence it is wiser to filter these out up front.
    if (@("SecurityTokenServiceApplication","Application Registry Service ","FASTContent","State Service","WSS_UsageApplication","") -notcontains $SPap.DisplayName)
    {
        try
        {
            $testAppPool = $SPap.get_ApplicationPool();
            # Don't mix & match application pools and accounts; best is to consolidate along the lines of existing process accounts, to avoid permissions issues
            if ($testAppPool.ProcessAccountName -eq $MyRefAppPool.ProcessAccountName)
            {
                Write-Host "Processing $($SPap.Name) because it has the target process account: $($MyRefAppPool.ProcessAccountName)"
                $SPap.set_ApplicationPool($MyRefAppPool)
                $SPap.Update()  # Update() is actually required; you will notice a processing delay during the update
            }
            else
            {
                Write-Host "Skipping $($SPap.Name) because it has a different process account: $($SPap.get_ApplicationPool().ProcessAccountName)"
            }
        }
        catch
        {
            $testAppPool = $null;
            Write-Host "Skipping $($SPap.Name) because it had an error"
        }
    }
}
An IISReset at this point is advisable. Lastly, you can go into IIS after running this, view Application Pools, and stop the GUID-named Application Pools with zero associated applications. Another IISReset is advisable to ensure you are recovering your RAM.

For a more general overview of Application Pool configuration, please see TechNet.