Working with Web Parts in PowerShell

Working with web parts programmatically across pages is awkward but possible in PowerShell. Let’s start by generating a report of the web parts. This iterates through non-admin web apps, site collections, and pages:

 
$oContentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
[Microsoft.SharePoint.Administration.SPWebApplicationCollection]$waCollection = $oContentService.WebApplications
$log = ".\results.txt"         # output file name and path
$pagepath = "/default.aspx"    # change page name or page path here
"Site URL; WebPart Title; WebPart ID" | Out-File $log
$waCollection1 = $waCollection | Where-Object {$_.IsAdministrationWebApplication -eq $FALSE}
foreach ($wa in $waCollection1)
{
    foreach ($obj in $wa.Sites)
    {
        $siteURL = $obj.URL
        Write-Host "Processing site: " , $siteURL
        $site = New-Object Microsoft.SharePoint.SPSite($siteURL)
        $pageURL = $siteURL + $pagepath
        $web = $site.OpenWeb()
        $webpartmanager = $web.GetLimitedWebPartManager($pageURL, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
        foreach ($webpart in $webpartmanager.WebParts)
        {
            $siteURL + "; " + $webpart.Title + "; " + $webpart.ID | Out-File $log -Append
        }
        $web.Dispose()
        $site.Dispose()
    }
}

As an example, we can remove a web part programmatically by specifying the site collection and the web part's GUID:

$siteURL = "ht tp://sharePoint/sites/specialsite";  # first constant: site URL
$webpartId = "<guid of your choice for webpart>;   # second argument:  webpart GUID
$pagepath =  "/default.aspx"        # change page name or page path here
$pageURL = $siteURL + $pagepath
write-host "Processing site: ", $siteURL
Write-host "Processing page: " , $pageURL
write-host "Processing webpart ID: " , $webpartID
$site=new-object Microsoft.SharePoint.SPSite($siteURL)
$web=$site.Openweb()
$webpartmanager=$web.GetLimitedWebPartManager($pageURL, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
$webpartmanager.DeleteWebPart($webpartmanager.Webparts[$webpartId])
$web.Update()
$web.Dispose()
write-host "Finished."

Fixing taxonomy terms after a Metalogix migration

When migrating from SharePoint 2010 to 2013 using the Metalogix Content Matrix 7.0.0.1 migration tool, I encountered a taxonomy issue: the individual term from the SP2010 source was not applied to the SP2013 destination item and associated with the correct term set and term.
The destination taxonomy field appeared empty in every sense. It was not displayed when viewing the item, nor in the list view.
However, one remnant of the term did appear in a surprising place. Within the read-only property bag of the item (item.Xml) there was a field whose property name was a GUID with the dashes removed, and whose value was somewhat usable.
The property name in this case was "a466a2acd62b498291c78829c2bb5fe3".
item.GetFormattedValue("a466a2acd62b498291c78829c2bb5fe3") gets the value, as does item["a466a2acd62b498291c78829c2bb5fe3"].
This is the GUID of the target field associated with the term set.
To make this more generic, I set $targetField to the field name we are looking for and derive the GUID this way:

[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)

Then I stripped off the dashes to get it into the format that appears in the property bag:

$FieldGUID = $taxonomyField.Id.ToString().Replace("-","")
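
Equivalently, the "N" format specifier produces the dash-free form directly:

$FieldGUID = $taxonomyField.Id.ToString("N")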

Then I derive the value for the property:

$FQterm = $item[$fieldGUID]

The value appears in the internal format, so I strip off any text appearing after a pipe:

$FQterm = $FQterm.Substring(0, $FQterm.IndexOf("|"))

To match to the appropriate term, we need to handle the situation where a text value can appear under multiple parents. So we walk the hierarchy to find the single correct match. First we split the term into colon delimited segments:

$termHierarchy = $FQterm.Split(":")

Then we check the parent term to narrow down to the one appropriate matching term. This assumes checking two levels finds the match; it can be extended to match up to seven levels deep if needed:

# $TC is assumed to hold the TermCollection of candidate terms matching the leaf label
if ($TC.Count -gt 1)
{
    $termIdx = -1
    if ($termHierarchy.Count -ge 2)
    {
        $ParentTerm = $termHierarchy[$termHierarchy.Count - 2]
    }
    else
    {
        $ParentTerm = $null
    }
    for ($ti = 0; $ti -lt $TC.Count; $ti++)   # loop to ensure the parent is the right one
    {
        if ($TC[$ti].Parent.GetDefaultLabel(1033) -eq $ParentTerm)
        {
            $termIdx = $ti
        }
    }
}
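
Once we have the index of the correct term, the final step is writing it back to the item. A minimal sketch, assuming $TC holds the candidate TermCollection and $item and $taxonomyField are as above:

if ($termIdx -ge 0)
{
    $taxonomyField.SetFieldValue($item, $TC[$termIdx])   # stamps the label, term GUID and lookup value
    $item.SystemUpdate()   # persist without creating a new version
}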

Programmatically Configuring Metadata Navigation

Metadata navigation offers a clever way to navigate documents within a library. It cuts across folders, allows drill-in for taxonomies as configured hierarchies, and supports combinations of fields as key filters. One can configure it manually for a library, but how can one do this programmatically? Below are the steps:

# get the SPWeb:
$web = Get-SPWeb "http://WhateverTheSPWebSiteIs"
# get the library:
$JPLib = $web.Lists.TryGetList("WhateverTheListIsCalled")
# Here's the XML object we are coding against:
$listNavSettings = [Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationSettings]::GetMetadataNavigationSettings($JPLib)
# You can output the XML settings and easily see its configuration each step along the way with this:
$listnavSettings.SettingsXml
# Here's how to clear both Configured Hierarchies, and Key Filters:
$listNavSettings.ClearConfiguredHierarchies()
$listNavSettings.ClearConfiguredKeyFilters()
[Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationSettings]::SetMetadataNavigationSettings($JPLib, $listNavSettings, $true)
# Let's get ready for a Content Type hierarchy:
$ctHierarchy = [Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationHierarchy]::CreateContentTypeHierarchy()
$listNavSettings.AddConfiguredHierarchy($ctHierarchy)
# Add a configured hierarchy for a field; note AddConfiguredHierarchy expects a MetadataNavigationHierarchy wrapping the SPField, not the bare field:
$listNavSettings.AddConfiguredHierarchy((New-Object Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationHierarchy($JPLib.Fields["Field Name"])))
# Add a Content Type key filter; I chose this on purpose, as using "Content Type" will not work: the field to use here is "Content Type ID":
$listNavSettings.AddConfiguredKeyFilter((New-Object Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationKeyFilter($JPLib.Fields["Content Type ID"])))
# Now the party ends happily with an update; note no $list.update() or $web.update() is needed:
[Microsoft.Office.DocumentManagement.MetadataNavigation.MetadataNavigationSettings]::SetMetadataNavigationSettings($JPLib, $listNavSettings, $true)

Checking for a specific permission for a specific user or group in SharePoint

While the UI allows one to easily check permissions for a given user, how can one do that iteratively?

Here’s the heart of the magic:

# first grab the principal (here, a SharePoint group):
$user = $TargetWeb.Groups[$GroupToAdd]
# now let's get the role assignments for that principal on the folder:
$RA = $folder.RoleAssignments.GetAssignmentByPrincipal($user)
# the role definition bindings are what we test against:
$RoleDefBindings = $RA.RoleDefinitionBindings
# now let's grab the role definition for the Contribute permission level in this SPWeb:
$roledef = $TargetWeb.RoleDefinitions["Contribute"]
# lastly, we can check whether the role bindings for this principal on this folder contain the Contribute role definition:
if ($RoleDefBindings.Contains($roledef)) {...}
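
One caveat from experience: GetAssignmentByPrincipal throws when the principal has no role assignment on the object at all, so a defensive version of the check looks something like this sketch:

try
{
    $RA = $folder.RoleAssignments.GetAssignmentByPrincipal($user)
    $hasContribute = $RA.RoleDefinitionBindings.Contains($TargetWeb.RoleDefinitions["Contribute"])
}
catch
{
    $hasContribute = $false   # no assignment at all for this principal on this folder
}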

Some useful routines first. Note I like to predefine a “Write” permission that allows creation and editing but not deletion:

function PermRole([string] $RoleChar)
{
    switch ($RoleChar)
    {
        "R" {$res = "Read"}
        "C" {$res = "Contribute"}
        "W" {$res = "Contribute wo delete"}
        "D" {$res = "Manage Hierarchy"}   # aka design, for setting permissions
        default {$res = $null}
    }
    return $res
}
# Routine for adding permission, based on passing in a character for the role definition to be granted:
function AddPerm ([string] $RoleChar, [string] $RoleGroup)
{ # $f1 and $TargetWeb are implied and not passed as parms for efficiency!
    if ((!$RoleChar) -or (!$RoleGroup))
    {
        return  # race to be efficient on a null op
    }
    $RoleValue = PermRole $RoleChar
    if (!$RoleValue)
    {
        Write-Host -ForegroundColor DarkRed "ok, expected Role, but got none, for $($RoleChar)"
        return
    }
    try
    {
        # CONTROVERSIAL!
        if ($RoleChar -eq "W")   # wipes out reads etc.
        {
            RemovePerm $RoleGroup
        }
        try
        {
            $user = $TargetWeb.EnsureUser($RoleGroup)
        }
        catch   # if the above fails, it's likely not a user but a group; let's retry as a group
        {
            $user = $TargetWeb.Groups[$RoleGroup]
        }
        $roledef = $TargetWeb.RoleDefinitions[$RoleValue]
        $roleass = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
        $roleass.RoleDefinitionBindings.Add($roledef)
        $f1.RoleAssignments.Add($roleass)   # $f1 is the folder item; this routine is folder-specific
    }
    catch
    {
        Write-Host -ForegroundColor DarkRed "ERR: Can't Assign $($RoleGroup)"
    }
}
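
AddPerm calls a RemovePerm helper that isn't shown here. A minimal sketch of what it needs to do, assuming the same implied $f1 and $TargetWeb conventions:

function RemovePerm ([string] $RoleGroup)
{
    try
    {
        $user = $TargetWeb.Groups[$RoleGroup]
        if (!$user) { $user = $TargetWeb.EnsureUser($RoleGroup) }
        $f1.RoleAssignments.Remove($user)   # drops every role assignment this principal holds on the folder
    }
    catch
    {
        Write-Host -ForegroundColor DarkRed "ERR: Can't remove $($RoleGroup)"
    }
}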

Let’s first establish the libraries to look at across all webs and site collections:

$libsArrStr = "Library name 1|Library name 2"
$LibsArr = $libsArrStr.Split("|")
$GroupToAdd = "Department Contributors"
$Site = "http://SharePoint/sites/SiteOfInterest"
$TargetWeb = $web = Get-SPWeb $Site
Write-Host "==> working in $($web.url)"
for ($j = 0; $j -lt $LibsArr.Count; $j++)
{
    $libStr = $LibsArr[$j]
    $list = $web.Lists.TryGetList($libStr)
    if ($list -eq $null)
    {
        Write-Host -ForegroundColor DarkRed "List not found"
    }
    else
    {
        for ($fi = 0; $fi -lt $list.Folders.Count; $fi++)
        {
            $f1 = $list.Folders[$fi]   # the folder as an SPListItem
            $f = $f1.Folder            # the underlying SPFolder
            Write-Host -f Green "The library $($libStr) exists in the site $($web.url), about to set folder perms"
            try
            {
                # the rule is: if this field has data, make the group a Contributor
                $f1.ResetRoleInheritance()   # badda-bing, security is inherited
                $isWritable = ($f.Item["TargetMetadata"] -ne $null)
                if (!$isWritable)
                {
                    # null op, already inherited
                }
                else   # let's see whether to break perms, based on whether the group already has Contribute
                {
                    # if the group has Contribute rights already, there's no need to break inheritance
                    $user = $TargetWeb.Groups[$GroupToAdd]
                    $RA = $f1.RoleAssignments.GetAssignmentByPrincipal($user)
                    $RoleDefBindings = $RA.RoleDefinitionBindings
                    $roledef = $TargetWeb.RoleDefinitions["Contribute"]
                    if ($RoleDefBindings.Contains($roledef))   # group is already a Contributor, do nothing
                    {
                    }
                    else
                    {
                        $f1.BreakRoleInheritance($true)   # minimalist approach
                        AddPerm "C" $GroupToAdd
                    }
                }
            }
            catch
            {
                Write-Host "problems setting perms"
            }
        } # folder processing loop $fi
    } # list found
} # library loop $j

Fixing checked out files

I ran into a challenge this evening with a monstrously large library: 5,000+ folders holding 47,000+ files. What to do? First, things don't work correctly until the list view threshold is temporarily lifted (see the sketch after the script below). Once that is done, we can iterate through the files, take over the checked-out ones, and force the check-in. Here's how:

$root = Get-SPWeb "http://sharepoint/sites/site"
$lib = $root.Lists["LibraryName"]
$x = $lib.CheckedOutFiles
$count = $x.Count
# iterate backwards, since each check-in removes the file from the collection
for ($i = $count - 1; $i -ge 0; $i--)
{
    $CheckedItem = $x[$i]
    $CheckedItem.TakeOverCheckOut()
    $libitem = $lib.GetItemById($CheckedItem.ListItemId)
    $libitem.File.CheckIn("")
    Write-Host -NoNewline "."
}
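
As for temporarily lifting the list view threshold mentioned above, one way is at the web application level; a sketch (the URL is a placeholder, and remember to restore the saved value when done):

$wa = Get-SPWebApplication "http://sharepoint"
$saved = $wa.MaxItemsPerThrottledOperation
$wa.MaxItemsPerThrottledOperation = 100000   # lift the threshold
$wa.Update()
# ... run the check-in loop above ...
$wa.MaxItemsPerThrottledOperation = $saved   # restore the original limit
$wa.Update()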

The SharePoint internals behind an Append text field

In configuring a SharePoint field, there's an option to "Append Changes to Existing Text" for multi-line text fields. This setting requires that versioning be enabled. Let's delve into why versioning is required for this feature to work.

In a nutshell, SharePoint puts the latest value within the property bag, which you can see if you grab the SPItem object and dump the read-only SPItem.Xml. First let's examine it in PowerShell.

# let's get the web:
$web = Get-SPWeb "http://SharePoint/sites/Site/web"
# let's get the list:
$list = $web.Lists["Name of List"]
# let's get the SPItem:
$item = $list.GetItemById(4567)
# let's set the internal field name for later reference:
$FName = "Internal Field Name"

A more elegant way is to address the item by URL directly:

$url = "http://SharePoint/sites/Site/web/Listname/4567_.000"
$item = $web.GetListItem($url)

Then we can peek into the property bag:

$item.xml

It is worth noting there are times when SharePoint does not put the latest value in the property bag; in fact, the property is absent altogether, although the versions continue to capture the appended text. One theory is that this occurs with lists upgraded from SP2007. If you experience this behavior, read below for how to access all of the appended text.

But where are the previous appends? How can we see them? The trick is to walk through the versions, grabbing the version SPItem, and then grab the field value.

foreach ($v in $item.Versions)
{
    $v[$FName]
}

In C# we extract in a function such as this:

public static string GetVersionedMultiLineTextAsPlainText(int ID, string field, SPList list)
{
    SPListItem item = list.GetItemById(ID);
    StringBuilder sb = new StringBuilder();
    foreach (SPListItemVersion version in item.Web.Lists[item.ParentList.ID].Items[item.UniqueId].Versions)
    {
        SPFieldMultiLineText CommentsField = version.Fields.GetFieldByInternalName(field) as SPFieldMultiLineText;
        if (CommentsField != null)
        {
            string comment = CommentsField.GetFieldValueAsText(version[field]);
            if (comment != null && comment.Trim() != string.Empty)
            {
                sb.Append(Environment.NewLine);  // separate the entries
                sb.Append(version.CreatedBy.User.Name).Append(" (");
                sb.Append(version.Created.ToString("MM/dd/yyyy hh:mm tt"));
                sb.Append(") ");
                sb.Append(comment);
            }
        }
    }
    return sb.ToString();
}

Smoothly activating a Feature across SharePoint Site Collections

It’s useful to be able to activate features across site collections. This two-line script grabs the desired feature, then grabs the collection of site collections and pipes them into the feature activation cmdlet, suppressing errors (such as the error raised when activating a feature that is already active).

$defaultOpenBehaviorFeatureId = $(Get-SPFeature -limit all | where {$_.displayname -eq "OpenInClient"}).Id
Get-SPSite -limit ALL | foreach { enable-SPFeature $defaultOpenBehaviorFeatureId -url $_.URL -ErrorAction SilentlyContinue }

Here is the same approach with a sample filter:

$defaultOpenBehaviorFeatureId = $(Get-SPFeature -limit all | where {$_.displayname -eq "OpenInClient"}).Id
Get-SPSite -limit all | where {$_.url -like "http://sharepointdev/ThisPath/*"} | foreach { enable-SPFeature $defaultOpenBehaviorFeatureId -url $_.URL -ErrorAction SilentlyContinue }

Programmatically Targeting Audiences for Audience Enabled Libraries in SharePoint

Overview

Targeting an audience for a given document is a great capability within SharePoint. There's a simple Document Library setting to enable Audience Targeting.

This enables a dedicated Audience Targeting field to be configured per document. I customized a Content Query Web Part (CQWP) that honors audience targeting while displaying the fields in a grid view, enabling a targeted and custom portal home page. While the CQWP is harder to work with than a Data View Web Part, the CQWP is the primary way to leverage audience targeting.

Reporting

I prefer to first output a CSV of all documents, with one column for Audience. Note I also output the absolute URL. Here's a function to do it; note the use of Start-SPAssignment and Stop-SPAssignment for proper memory management:

$LineHeader = '"Site","Lib","URL","ID","Versions","Name","Title","Created","Modified By","Modified","Audiences"'
$LineHeader | Out-File -FilePath $ReportFile   # $ReportFile is assumed to be defined globally
function generate-VerReport ($WebUrl, $ListName)
{
    $ac = Start-SPAssignment
    $web = $ac | Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    Write-Host "+" -NoNewline   # each plus is a full pass through all docs in a library
    $xSit = $WebUrl
    $xLis = $ListName
    $sep = '","'
    # go through each item in the list
    $Items = $list.Items
    $ItemCount = $Items.Count
    for ($i = 0; $i -lt $ItemCount; $i++)
    {
        $item = $Items[$i]
        $xURL = $item["EncodedAbsUrl"]
        $RawAud = $item["Target Audiences"]
        $xNam = $item['Name']
        $xTit = $item['Title']
        $xCre = $item['Created']
        $xEdi = $item["Editor"]
        $xMod = $item["Modified"]
        $xID = $item.ID
        $xVer = $item.Versions.Count
        $Line1 = '"' + $xSit + $sep + $xLis + $sep + $xURL + $sep + $xID + $sep + $xVer + $sep + $xNam + $sep + $xTit + $sep + $xCre + $sep + $xEdi + $sep + $xMod + $sep + $RawAud + '"'
        $Line1 | Out-File -FilePath $ReportFile -Append
    }
    $web.Dispose()
    $ac | Stop-SPAssignment
}

Here’s the function call:

generate-VerReport -WebUrl $web.Url -ListName $JPLib.Title
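
To run the report across all site collections and webs, a minimal driver sketch (the document-library filter and the loops are my own scaffolding; adjust to taste):

Get-SPSite -Limit All | ForEach-Object {
    foreach ($web in $_.AllWebs)
    {
        foreach ($JPLib in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" }))
        {
            generate-VerReport -WebUrl $web.Url -ListName $JPLib.Title
        }
        $web.Dispose()
    }
    $_.Dispose()
}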

The output enables an end user to modify the CSV to specify the audience per document.

Audience Shuffling

I created a script to apply the Audiences specified per document in the control spreadsheet. The Audiences are expected in semicolon-separated format (multiple values are possible). There is no error checking on the spelling of Audiences. It turns out document Audiences in SharePoint are weakly typed; the stored value is simply a string. The string contains four semicolons; in this script we only apply SharePoint groups, which appear comma-separated after the four semicolons. If an AD group is used, it appears nestled between the two pairs of semicolons, as in this example:
;;CN=GroupName Membership,OU=Security Groups,OU=Information Technology,DC=MyDomain,DC=com;;
Under the covers, SharePoint accepts not just AD groups and SharePoint groups, but also UPS (User Profile Service) Audiences, specified as GUIDs. Pairs of semicolons are used as the delimiters between the three parts (beats me, but this is the first time I've seen this convention). Here's how it is structured:
[Guids, comma separated];;[AD group LDAP paths separated by line breaks];;[SharePoint security group names, comma separated]
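
For example, a value that targets only two (hypothetical) SharePoint groups, with no UPS audiences or AD groups, looks like this:

;;;;Department Contributors,Department Visitors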

Here's the script to apply the Audiences. It assumes all entries are SharePoint groups, but it is easily extensible to support AD groups if desired. Note it does not check the validity of the SharePoint group names. Note also the "Action" column, which determines whether to "D"elete or "A"ssign audiences:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
Start-SPAssignment -Global
$ShuffleSource = "C:\Users\plautj\Documents\PowerShell\AudienceShuffle.txt"
Write-Host "script starting $(get-date)"
$ShuffArr = Import-Csv -Path $ShuffleSource -Delimiter "`t"
$ShuffCount = $ShuffArr.Count
for ($i = 0; $i -lt $ShuffCount; $i++)
{
    $Row = $ShuffArr[$i]
    if (($Row.Action -ne $null) -and ($Row.Action.Length -ne 0))
    {
        $docurl = $Row.URL
        $site = New-Object Microsoft.SharePoint.SPSite($docurl)
        $web = $site.OpenWeb()
        $item = $web.GetListItem($docurl)
        $list = $item.ParentList
        if ($Row.Action -eq "D")
        {
            $item["Target Audiences"] = $null
            $item.SystemUpdate()
        }
        elseif ($Row.Action -eq "A")
        {
            # the actual Target Audiences property has four semicolons, followed by comma-delimited SharePoint groups;
            # an AD group would appear as ;;CN=ADW Membership,OU=Security Groups,OU=Information Technology,DC=DOMAIN,DC=com;;
            if ($Row.Audiences.Length -gt 0)   # ignore if empty
            {
                $AudBuilder = ";;;;"
                $AudArr = $Row.Audiences.Split(";")
                for ($ai = 0; $ai -lt $AudArr.Count; $ai++)
                {
                    if ($ai -gt 0)
                    {
                        $AudBuilder = $AudBuilder + ","
                    }
                    $AudBuilder = $AudBuilder + $AudArr[$ai]
                }
                $item["Target Audiences"] = $AudBuilder
                $item.SystemUpdate()
            } # if Audiences is not empty
        } # Add action
        $web.Dispose()
        $site.Dispose()
    } # action exists
} # loop of rows
Write-Host "script finishing $(get-date)"
Stop-SPAssignment -Global

Inconsistent SharePoint timestamps with WebDAV

There are situations where moving documents using Explorer mode retains the correct "Modified" timestamp in the browser, yet shows an updated timestamp in Explorer mode.

This is because WebDAV shows the date associated with the file (SPFile) rather than the date associated with the SPItem; Explorer mode can update the modified date on the SPFile.

When you probe further, the item has the correct timestamp (item["Modified"]) in the local timezone, while the SPFile has a property called vti_timelastmodified holding the GMT timestamp.

Note the file has a property called vti_nexttolasttimemodified as well.

Both item.File.Properties["vti_timelastmodified"] and item["Modified"] are the same type (DateTime), so I can compare them, which is precisely what I did in a custom PowerShell report looking for such timestamp divergence. The report flags files whose offset differs from the expected 4 or 5 hours (depending on the time of year, the expected offset for my ET timezone is either 4 or 5 hours). I compare only down to the minute, not the second, on purpose, so I don't flag false positives.
Here is how to fix a single instance of this issue:

$docurl = "http ://sharepoint/site/list/TimestampTest/test2.docx"
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$list = $item.ParentList
[System.DateTime]$date = $item["Modified"]
$user = New-Object microsoft.SharePoint.SPFieldUserValue($web, $item["Editor"])
$item["Modified"] = $date;
$item["Editor"] = $user;
$item.Update()
try { $item.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (1) could not delete old version of $($item['Name'])"}

Here's a function that reports on all URLs suffering from this issue. Note the use of date comparison: a timestamp off by exactly 4 or 5 hours, with matching days and minutes, is treated as the expected timezone offset, and anything else is reported:

function Reset-Dates ($WebUrl, $ListName)
{
    # get web and list objects
    $web = Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    if ($ReportFile -eq $null) {$ReportFile = "C:\report.csv"}
    $sep = ","   # field delimiter; defined globally in the original script
    $xSit = $WebUrl
    $xLis = $ListName
    # go through each item in the list
    $list.Items | ForEach-Object {
        $item = $_
        $fd = $item.File.Properties["vti_timelastmodified"]   # file timestamp (GMT)
        $id = $item["Modified"]                               # item timestamp (local)
        $dd = ($id - $fd)
        # a delta of exactly -4 or -5 hours (with zero days and minutes) is the expected timezone offset
        $hoursMatch = (($dd.Hours -eq -4) -or ($dd.Hours -eq -5))
        $daysMatch = ($dd.Days -eq 0)
        $minutesMatch = ($dd.Minutes -eq 0)
        if ($hoursMatch -and $daysMatch -and $minutesMatch)
        {
            Write-Host '.' -NoNewline
        }
        else
        {
            $xURL = $item.Url
            $xNam = $item['Name']
            $xMod = $item["Modified"]
            $xfMod = $item.File.Properties["vti_timelastmodified"]
            try {$xEdi = $item["ows_Modified_x0020_By"].Replace("DOMAIN",$null)} catch {$xEdi = $item["ows_Modified_x0020_By"]}
            try {$xAut = $item["ows_Created_x0020_By"].Replace("DOMAIN",$null)} catch {$xAut = $item["ows_Created_x0020_By"]}
            $Line1 = $xURL+$sep+$xSit+$sep+$xLis+$sep+$xNam+$sep+$xAut+$sep+$xEdi+$sep+$xMod+$sep+$xfMod+$sep+$dd.Days+$sep+$dd.Hours+$sep+$dd.Minutes+$sep+'1'
            $Line1 | Out-File -FilePath $ReportFile -Append
            $Line1 = $null
        }
    }
    $web.Dispose()
}

Tuning SharePoint Search Ranking in the object model

SharePoint Search results are returned in order of relevancy, which is determined by a ranking model. There are a number of ranking models cooked into SharePoint 2010. These can be refined to a limited extent, with a bit of insight, to better serve users.

To see the models and their definition, let’s query the SharePoint Search application DB:

SELECT * FROM [Search_Service_Application_DB].[dbo].[MSSRankingModels]

The resultset contains the models: each row has the model's GUID, whether it is the default, and the underlying XML that specifies the model. The model name appears at the beginning of the XML.

Using PowerShell, we can get the array of ranking models; PowerShell is also the only supported approach for manipulating the models, changing the default, and creating new ranking models. Here's how to get the models:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel

Now we can assign the ranking model array to a variable and index into it:

$A = Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel

Or we can grab the one ranking model we like by using its GUID, which we have to predetermine; that's easy, as it's returned by the query above and is unchanging. For new models, we get to specify the GUID as well.

Once you know your rank model GUID, you can switch to it by getting it, and setting it as default:

$r = Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel 8f6fd0bc-06f9-43cf-bbab-08c377e083f4
$r.MakeDefault()

To create a custom rank model, first identify the Managed Properties, by PID. The name is part of the XML, but it is the PID that drives the ranking. Here’s how to get all the Managed Properties and their PIDs:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchMetadataManagedProperty
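
To find the PID of a single property, filter by name; for example, for the MyCompany managed property used below:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchMetadataManagedProperty | Where-Object { $_.Name -eq "MyCompany" } | Select-Object Name, PID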

Now we create a new ranking model called MyRank2; note I want the LastModifiedTime property to carry significant weight:

Get-SPEnterpriseSearchServiceApplication | New-SPEnterpriseSearchRankingModel –rankingmodelxml "<?xml version='1.0'?><rankingModel name='MyRank2' id='8447b4bc-3582-45c5-9cb8-ba2a319d850e' description='CustomJoelRank2' xmlns='http://schemas.microsoft.com/office/2009/rankingModel'>
<queryDependentFeatures>
<queryDependentFeature name='Body' pid='1' weight='0.00125145559138435' lengthNormalization='0.0474870346616999'/>
<queryDependentFeature name='LastModifiedTime' pid='4' weight='3.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Title' pid='2' weight='1.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Author' pid='3' weight='0.410225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='DisplayName' pid='56' weight='0.570071355441683' lengthNormalization='0.552529462971364'/>
<queryDependentFeature name='ExtractedTitle' pid='302' weight='1.67377875011698' lengthNormalization='0.600572652201123'/>
<queryDependentFeature name='SocialTag' pid='264' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
<queryDependentFeature name='QLogClickedText' pid='100' weight='1.87179361911171' lengthNormalization='3.31081658691434'/>
<queryDependentFeature name='AnchorText' pid='10' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
</queryDependentFeatures>
<queryIndependentFeatures>
<queryIndependentFeature name='ClickDistance' pid='96' default='5' weight='1.86902034145632'>
<transformInvRational k='0.0900786349287429'/>
</queryIndependentFeature>
<queryIndependentFeature name='URLDepth' pid='303' default='3' weight='1.68597497899313'>
<transformInvRational k='0.0515178916330992'/>
</queryIndependentFeature>
<queryIndependentFeature name='Lastclick' pid='341' default='0' weight='0.219043069749249'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<languageFeature name='Language' pid='5' default='1' weight='-0.56841237556044'/>
</queryIndependentFeatures>
</rankingModel>"

There are two parts to the model: the query-dependent section, associated with the actual query and its metadata, and the query-independent part, which ranks based on the number of slashes in the URL (URLDepth), click frequency, and so on.

As soon as a model is made the default, you can see the effect of the new ranking model in search results.

Here's how to change this model; note I add a new field called MyCompany and boost its relevance:

<?xml version="1.0" encoding="utf-8"?>
Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel 8447b4bc-3582-45c5-9cb8-ba2a319d850e | Set-SPEnterpriseSearchRankingModel –rankingmodelxml "<?xml version='1.0'?><rankingModel name='CustomJoelRank2' id='8447b4bc-3582-45c5-9cb8-ba2a319d850e' description='MyRank2' xmlns='http://schemas.microsoft.com/office/2009/rankingModel'>
<queryDependentFeatures>
<queryDependentFeature name='Body' pid='1' weight='0.00125145559138435' lengthNormalization='0.0474870346616999'/>
<queryDependentFeature name='MyCompany' pid='414' weight='3.610225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='Title' pid='2' weight='0.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Author' pid='3' weight='0.410225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='DisplayName' pid='56' weight='0.570071355441683' lengthNormalization='0.552529462971364'/>
<queryDependentFeature name='ExtractedTitle' pid='302' weight='1.67377875011698' lengthNormalization='0.600572652201123'/>
<queryDependentFeature name='SocialTag' pid='264' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
<queryDependentFeature name='QLogClickedText' pid='100' weight='1.87179361911171' lengthNormalization='3.31081658691434'/>
<queryDependentFeature name='AnchorText' pid='10' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
</queryDependentFeatures>
<queryIndependentFeatures>
<queryIndependentFeature name='ClickDistance' pid='96' default='5' weight='1.86902034145632'>
<transformInvRational k='0.0900786349287429'/>
</queryIndependentFeature>
<queryIndependentFeature name='URLDepth' pid='303' default='3' weight='1.68597497899313'>
<transformInvRational k='0.0515178916330992'/>
</queryIndependentFeature>
<queryIndependentFeature name='Lastclick' pid='341' default='0' weight='0.219043069749249'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<queryIndependentFeature name='CustomJoelModified' pid='445' default='1' weight='2.56841237556044'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<languageFeature name='Language' pid='5' default='1' weight='1.5'/>
</queryIndependentFeatures>
</rankingModel>"

I admittedly did not have success ranking by how recently a document was updated, known as "freshness". SP2010 has only a very limited ability to customize ranking, its transforms are obscure and not really usable for this, and a simple freshness ranking seems infuriatingly out of reach.

However, SP2013 supports freshness explicitly. While the default SharePoint 2013 ranking model doesn't boost the rank of search results based on their freshness, we can achieve this by adding a tuning of the static rank that combines information from the LastModifiedTime managed property with the DateTimeUtcNow query property, using the freshness transform function. Transform functions are used to customize ranking in SP2013, and the freshness transform is the only transform usable for this feature, because it converts the age of the item from an internal representation into days. Even before moving to SP2013, we can configure an SP2013 farm to crawl production SP2010 and return results tuned in this way, and then use the SP2013 search results to serve any client we choose to point at SP2013, including a simple search site in SP2013.