Features in SharePoint and best practices on how to use them

Scripting SharePoint logging to ULS and Event Log

It’s easy to dump output to a text file from a script, but for enterprise-class logging the two standards are the Event Log and ULS (Unified Logging Service). Let’s start with ULS.

Below, in PowerShell, I grab a reference to the SPDiagnosticsService, define an SPDiagnosticsCategory, then call the WriteTrace() method:

$diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$diacategory = new-object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory",[Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
$diagSrc.WriteTrace(98765, $diacategory, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, "Write your log description here" )

ULS is a good, centralized standard, but let’s move on to writing into the Event Log, which becomes even more useful because we are first going to create a custom application log:

New-EventLog -LogName MyCustomScripts -Source scripts

The first challenge is that if the log already exists, New-EventLog throws an error, even if you try encapsulating it in a try/catch. The trick is to leverage the Get-EventLog cmdlet.

First, to see what exists, format as a list:

Get-EventLog -list
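
Putting the two together, here’s a minimal sketch of that guard (my own, not from the original post): only call New-EventLog when the log isn’t already registered on the machine.

# sketch: guard New-EventLog so it only runs when the log is missing
$logExists = Get-EventLog -List | Where-Object { $_.Log -eq "MyCustomScripts" }
if (!$logExists)
{
New-EventLog -LogName MyCustomScripts -Source Scripts
}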

You now have your very own event log and can write into it with your own event IDs, messages, and severity levels. Here are two working examples:

Write-EventLog -LogName MyCustomScripts -Source Scripts -Message "trying 4142 it works ... COOL!" -EventId 4142 -EntryType information
Write-EventLog -LogName MyCustomScripts  -Source Scripts -Message "trying 4942 as an error" -EventId 4942 -EntryType error

Now let’s simplify for re-use and consistency. Let’s declare some basics at the top of all scripts:

$Eventing = $true;  #determine if any events are written to the event log
$LogName = "JoelScripts"
$SourceName = "Scripts"
$ScriptID = 3; # unique number per script

Here’s a one-line function to make life simpler for our scripts:

function Write-MyLog([int] $EventID, [string] $Description, [system.Diagnostics.EventLogEntryType] $Severity)
{
if ($Eventing)
{
Write-EventLog -LogName $LogName -Source $SourceName -Message $Description -EventId $EventID -EntryType $Severity
}
}

Now let’s add a line at the start and end of the scripts to trigger an information event on what’s running. Note that references to $MyInvocation contain information about the currently running script:

Write-MyLog -Description "Start of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information
Write-MyLog -Description "End of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information

Lastly, here’s a sample message logged as a warning, followed by one logged as an error:

Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Warning
Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Error

A nice way to write events is to use the “Source” to map back to the script name or some other useful value for filtering. However, sources need to be pre-defined. Here’s how to define a source:

New-EventLog -LogName $LogName -Source "X"

The challenge is when to create this source. I find it’s best to declare the source only if it does not exist:

try
{
$Sources = Get-EventLog -LogName $logname | Select-Object Source -Unique
$found = $false;
foreach ($OneSource in $Sources)
{
if ($OneSource.Source -eq $SourceName)
{
$found=$true;
}
}	
}
catch
{
Write-Host "cannot find logfile, so we are in deep trouble"
}
if (!$found)
{
New-EventLog -LogName $LogName -Source $SourceName
Write-Host "Created new Source $($SourceName) in log name $($LogName)"
}

Propagating legacy Document IDs into SharePoint

Document IDs in SharePoint

Document IDs are generated automatically by SharePoint when configured, but what if you have your own legacy Document IDs that you want to use? There is a way to push them in.

The first trick is recognizing there are two internal fields created when Document IDs are enabled.
ows__dlc_DocId : The actual Document ID
ows__dlc_DocIdUrl : The special Redir.ASPX based URL with the document ID within it

Let’s examine the latter field in a bit more detail. It contains a URL referencing “DocIdRedir.aspx”, a ghosted ASPX page that redirects either via a local service lookup or, as a fallback, a search-based lookup. Here’s what it looks like:

http://server/_layouts/DocIdRedir.aspx?ID=DOC-1000-3109, DOC-1000-3109

Note the comma, then the repeated Document ID.
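
As a quick hedged illustration (mine, not part of the migration script below; $web and $item stand in for the target SPWeb and SPListItem), here is how the two fields fit together, given the URL format above:

# illustrative sketch only: compose both Document ID fields by hand
$newDocId = "DOC-1000-3109"   #a hypothetical legacy Document ID
$item["ows__dlc_DocId"] = $newDocId
$item["ows__dlc_DocIdUrl"] = "$($web.Url)/_layouts/DocIdRedir.aspx?ID=$newDocId, $newDocId"
$item.SystemUpdate()          #write the change without creating a new version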

Now imagine if there’s a field in our document library that has the legacy Document ID we want to use.

The script below goes through and systematically replaces the existing Document ID with the legacy one we want to use.

[system.reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint") 
Write-Host ("$(get-date) Running script to assign DocIDs")
$SiteFilter = "*"
$siteUrl = "http://SharePoint" #use your own Web App URL
$ThisLibOnly = $null; #allows for filtered selection of libraries
Write-Host ("$(get-date) Assigning DocIDs from Legacy IDs")
$webApp=Get-SPWebApplication $SiteURL
$Sites = $webApp | Get-SPSite -Limit all
foreach ($site in $Sites)
{
$mylogfile="C:\Temp\output.log"
write-host $site.Url
Write-Host -foregroundcolor darkblue "$($site.id) - $($site.Url) - $($site.contentdatabase.id) - $($site.contentdatabase.name)"   
$WebScope = Start-SPAssignment
foreach ( $TargetWeb in $site.AllWebs)
{
Write-Host("Working in site: " + $TargetWeb)
$ListOfLists = @();
# Loop through all doc libs
$lists=$TargetWeb.lists;
$listsCount=$lists.count
for ($ii=0; $ii -lt $listsCount; $ii++)
{
$JPlib=$lists[$ii];
if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
{
# forget the rest and return to top
Write-Host -foregroundcolor darkred "fast test skipping Library: $($JPlib)";   
}
elseif ($JPLib.Title -Match "Photo|Image|SitesAssets|CustomizedsReports|Templates|Pages|Picture|cache|style")
{
# forget the rest and return to top
Write-Host -foregroundcolor darkred "fast test skipping Library because it mentions $Matches: $($JPlib)";   
}
elseif (!$JPlib.contenttypesenabled)
{
continue; # no content types, no interest
}
else
{  
Write-Host -foregroundcolor green "Processing Library: $($JPlib)";   
$ListOfLists += $JPlib.title;
}
}
foreach ($CurrentLib in $ListofLists)
{
$JPlib=$TargetWeb.lists[$CurrentLib];
$JPlib.title
if ($JPlib -eq $null)
{
Write-Host "COULD NOT GET LIB $($CurrentLib)"
continue;
}
Write-Host -foregroundcolor green "Processing Library: $($JPlib) in $($TargetWeb)";   
if (($ThisLibOnly -eq $null) -or
($JPlib.title -eq $ThisLibOnly)) 
{
$JPItems=$JPlib.Items;
$JPCount=$JPItems.get_count();
if ($JPCount -eq 0) {continue} #can't do much on a library with no items!
for ($i=0; $i -lt $JPCount; $i++)  #Do not use ItemCount, that one includes folders!
{	
$JPItem=$JPItems[$i];
$SourceValue=$JPItem["LegacyDocID"];
if (($SourceValue -eq $null) -or ($SourceValue.length -le 0))
{
write-host "-" -nonewline
continue; #nothing to assign
}
elseif ($JPItem["ows__dlc_DocId"] -ne $SourceValue) #avoid reassigning same value
{
Write-Host "Old DocID=$($JPItem['ows__dlc_DocId']),LegacyDocID=$($SourceValue)"
$oldDocIDValue=$JPItem["ows__dlc_DocId"]
$JPItem["ows__dlc_DocId"] = $SourceValue;
if ($JPItem["ows__dlc_DocIdUrl"].length -gt 1)
{
$JPItem["ows__dlc_DocIdUrl"]= $JPItem["ows__dlc_DocIdUrl"].replace($oldDocIDValue,$SourceValue);
}
$JPItem.systemupdate() #without generating version
Write-Host "$($i): $($JPItem.url)"
}
else
{  #special!  $JPItem["ows__dlc_DocIdUrl"]=$null;
Write-Host "DOcID Match! $($JPItem.url)"
Write-Host "Old DocID=$($JPItem['ows__dlc_DocId']),LegacyDocID=$($SourceValue)"
}
}		
}
Write-Host "+" -NoNewline
}# Lib loop
$JPlib.Update();
$JPlib=$null;
try
{
$TargetWeb.dispose()
} catch{}
$TargetWeb=$null;
} #all webs
Stop-SPAssignment $WebScope
$site.Dispose()
$site=$null;
} # all sites
Write-Host "SCRIPT COMPLETE $(get-date)"

Generating automatic emails with embedded reports and link to uploaded CSV

Generating automatic emails with embedded content

There’s often a need to automate the generation of reports on SharePoint. I typically need to generate such a report based on:
1. Content using search: acquired via a Search Query
2. Content via manual crawl: walking through libraries, even sites, site collections and farms
3. Based on scripted actions taken, such as moving or copying documents on an automated basis.

Other examples I’ve done include reports on checked-out documents, delivered to the people who have had documents checked out for 30+ days.

There are a few parts to the script below:
1. Acquire the dataset for the report. In this example I walk through all SPItems in a given list.
2. Define how the email gets sent, and to whom, leveraging HTML email
3. Generate the HTML for this report
4. Create a temporary CSV of this report, and upload it to a SharePoint library
5. Send the email, ensuring it contains the summary report and a link to the uploaded CSV

To store the dataset, I use the system.Data.DataTable object, and build from there. It’s easy to copy a DataTable, or drop columns after the HTML is constructed.

I use the ConvertTo-Html cmdlet to convert the DataTable, explicitly referencing the fields and the order I’d like in the HTML table. However, I have to fix up the escaped markup within it; for that, I pipe it through a function called Convert-HTMLEscape.

I prefer to tune up the DataTable and drop any unnecessary fields, as ConvertTo-CSV does not allow fields to be specified. I also specify -NoTypeInformation so the first row is the actual field names, allowing it to open correctly in Excel.
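
As a small sketch of that cleanup (illustrative only; the output path is a placeholder), work on a copy of the DataTable, remove the columns you don’t want, and export with -NoTypeInformation:

# sketch: trim a copy of the DataTable, then export a clean CSV
$csvCopy = $table.Copy()            #keep the original intact for the HTML step
$csvCopy.Columns.Remove("Link")     #links don't read well in a CSV
$csvCopy.Rows | Export-Csv -Path "C:\Temp\Report.csv" -NoTypeInformation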

Note how the URLs are prettied up on the fly, so they read nicely in the HTML table.

Claims Authentication has a different account format, which I clean up in a separate Function Strip-ClaimsHeader.

Whether a file is checked out, the type of checkout, and who checked it out are some optional attributes I capture in the DataTable. This table is easily extended to support whichever metadata is desired.
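
For example, extending the table with one more metadata column is just another DataColumn plus one assignment in the item loop (sketch only; ContentTypeName is a made-up column name):

# sketch: add an extra column to the report table
$colExtra = New-Object system.Data.DataColumn ContentTypeName,([string])
$table.columns.add($colExtra)
# ...and inside the item loop:
# $row.ContentTypeName = $SourceItem.ContentType.Name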

Here’s the script:

if(!(Get-PSSnapin Microsoft.SharePoint.PowerShell -ea 0)) 
{ 
Write-Progress -Activity "Loading Modules" -Status "Loading Microsoft.SharePoint.PowerShell" 
Add-PSSnapin Microsoft.SharePoint.PowerShell 
} 
$ToRecipients = "joelplaut@MyDomain.com"
$CCRecipients = $ToRecipients; #set to any audience you want, comma separated
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
function Save-File([string] $Dir, [string] $PartialName, [system.Data.DataTable] $MyTable)
{
#writes the file to a temporary disk-based location.  I add a timestamp to the name to ensure uniqueness.  $PartialName is a text description used to start the filename
# export-csv does all the heavy lifting.  Note the attribute "-NoTypeInformation" is needed so there isn't a header on top of the field headers
$timestamp = Get-Date
$timestampstr = $timestamp.Month.ToString("00") + "_" + $timestamp.Day.ToString() + "_" + $timestamp.Year.ToString("0000") + " " + $timestamp.Hour.ToString("00") + "_" + $timestamp.Minute.ToString("00") + "_" + $timestamp.Second.ToString("00") 
$FileName = $Dir + '\' + $PartialName+'_'+$timestampstr+'.CSV';
$MyTable.rows | Export-Csv -Path $FileName -NoTypeInformation
return $FileName;
}
#writes the file to the Reporting Library within SharePoint; adds a timestamp to the name to ensure uniqueness
function Upload-File([string] $FilePath)
{
$Files = $RepList.rootfolder.Files
$DocLibName = $RepList.Title; #this is not necessarily accurate 
#Get File Name from Path
$FileName = $FilePath.Substring($FilePath.LastIndexOf("\")+1)
#delete the File from library, if already exist.  This should never occur
try
{
if($Files.Item($DocLibName +"/" + $FileName))
{
$Files.delete($DocLibName +"/" + $FileName)
}
} catch {}
#Get the File
$File= Get-ChildItem $FilePath
#Add File to the collection
$x=$Files.Add($DocLibName +"/" + $FileName,$File.OpenRead(),$false)
$ReportFilename = $RepList.ParentWeb.Url + '/'+ $DocLibName + '/' + $Filename;
return $reportFileName;
}
# If we are running in a Claims Authentication environment, we can strip off the claims tags
Function Strip-ClaimsHeader ($s)
{
if ($s.IndexOf('#') -gt 0)  #handle stripping claims tags off name
{
return $s.Substring($s.IndexOf("#")+1)
}
else
{
return $s
}
}
#used for HTML processing and fixup; this is a bit dense, but it supports the PowerShell pipeline, and restores XML tags
Function Convert-HTMLEscape {
<#
convert &lt; and &gt; to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $text) {
if ($item -match "&lt;") {
<#
replace codes with actual symbols. This line is a shortcut to do two replacements
with one line of code. The code in the first set of parentheses revised text with "<". This
normally gets written to the pipeline. By wrapping it in parentheses it tells PowerShell to treat it
as an object so I can then call the Replace() method again and add the >.
#>
(($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function
$env = "Prod"  # I like setting flags for changing environment settings
if ($env -eq "Prod")
{
$RefListLocation = "http://sharepoint/sites/mySiteCollection/Subsite"
$TempLoc = "C:\TempLocation"  # disk location to temporarily store the file
}
$table = New-Object system.Data.DataTable "JoelFancyReport"  #this is an internal name, that is not very important to set
#let's define a field (column) for every possible field; even those we don't yet use in a given report
$col1 = New-Object system.Data.DataColumn Link,([string])
$col2 = New-Object system.Data.DataColumn BusinessProcess,([string])
$col3 = New-Object system.Data.DataColumn DocType,([string])
$col4 = New-Object system.Data.DataColumn Modified,([string])
$col5 = New-Object system.Data.DataColumn Title,([string])
$col6 = New-Object system.Data.DataColumn FileName,([string])
$col7 = New-Object system.Data.DataColumn Editor,([string])
$col8 = New-Object system.Data.DataColumn CheckedOut,([string])
$col9 = New-Object system.Data.DataColumn CheckedOutBy,([string])
$col10= New-Object system.Data.DataColumn FileFolder,([string])
$col11= New-Object system.Data.DataColumn FileLink,([string])
$col12= New-Object system.Data.DataColumn ReportAction,([string])
# Just add the columns to the table
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
$table.columns.add($col10)
$table.columns.add($col11)
$table.columns.add($col12)
#we can create multiple tables with the same layout easily. 
# Copy is shown here for reference, and is not used in this simple example
$TableCopy = $table.Copy() 
#loop through whatever SPItems need to be looped
# here is where the basic logic gets put, before generating the actual report
# this is just a placeholder loop of items in a list, but this could be anything, including SPQuery results
$JPWeb = get-spweb $RefListLocation
$JPList  = $JPWeb.lists["MyList"]
$RepList = $JPWeb.lists["Reports"]  #assumption: name of the reporting library that Upload-File writes to; adjust to your own
$JPItems = $JPList.items;
$JPItemCount = $JPItems.count;
for ($k=0; $k -lt $JPItemCount; $k++)
{
$SourceItem = $JPItems[$k];
$ReportAction = "";  #reset per item
if ($SourceItem.File.CheckOutStatus -ne "None")
{
$ReportAction += ", CheckedOut "+$SourceItem.File.CheckOutStatus+" to:"+$SourceItem.File.CheckedOutBy.displayname;
$CheckedOut = "Yes"
$CheckedOutBy = $SourceItem.File.CheckedOutBy.displayname;
}
else
{
$CheckedOut = "No"
$CheckedOutBy = $null;
}
#let's create a new row	
$row = $Table.NewRow()
$FileLink =   $($JPWeb.Url)+'/'+$($SourceItem.url) ;
$row.FileLink = $FileLink;
$row.Link = "<a href="+'"'+ $($FileLink) +'">'+ " Link " + "</a>";
$FolderURL = $FileLink.substring(0,$FileLink.LastIndexOf('/'));
$row.FileFolder = $FolderURL;
$row.Title = $SourceItem.Title;
$row.FileName= $SourceItem.Name;
$row.Editor = strip-claimsHeader $SourceItem["Editor"];
$row.Modified= $SourceItem["Modified"];
$row.BusinessProcess = $SourceItem["BusinessProcess"] #sample field; add as many as you'd like
$row.DocType = $SourceItem["DocType"]
$row.checkedOut = $CheckedOut;
$row.checkedOutBy = $CheckedOutBy;
$row.ReportAction = $ReportAction;
$Table.Rows.Add($row)
}
$RunStatus = "All the processing completed $(get-date)"
$Header = "Weekly report for my friends"
$FileDescription = "WeeklyReport"  #used in the temp CSV filename and the report link text
$emailTitle = "Weekly Report email $($RunStatus)"
# this is a nice table style, collapsed, colored header.  	
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
# Let's create the HTML table, specifying only the columns from the table that we'd like to see.  Any other columns won't appear in this report 
# By piping it to the Convert-HTMLEscape function, it fixes up the HTML    
$MyOutput = $table| ConvertTo-Html Link, BusinessProcess, DocType, Editor, Modified, FileName, ReportAction -head $a -body "<H1>$($table.Rows.Count) $($Header)</H1>"  | Convert-HTMLEscape
# if we need to reuse table, just make a copy of it first.  
# While Convert-HTML offered the luxury of column selection, the Convert-CSV converts every darn column, so let's remove the extra ones now.
$table.Columns.Remove("Link") #how to remove a Column. links don't look too pretty within a CSV, so for an example I remove it here.
$f2 = save-file $TempLoc ($FileDescription) $Table  #Saves the temp file to disk, driven out of the $Table
$Report2URL = upload-file $f2
$ReportSummary = "Rolled Report Available at: "+"<a href="+'"' +$Report2URL +'">'+ $FileDescription+" Custom Report " + "</a>"+"<p></p>";
#debug technique; it's easy to route the HTML to a file for examination, see the commented out line below:
#$MyOutput > C:\A.html 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = $emailTitle;
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.MyDomain.com"
$SmtpClient.host = $SmtpServer
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $ReportSummary + ($myoutput | Out-String)  #link to the uploaded CSV, followed by the HTML report
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)

Customized scheduled SharePoint Search Alerts in HTML

Occasionally I get requests for customized notification about documents within a SharePoint farm. Regular Alerts and SharePoint Search Alerts work great out of the box, but sometimes users want something more such as:
– Complex criteria
– Custom sort sequence
– Custom metadata
– Customized notification frequency
– Custom message text, or subject line
– Refined layout

The solution I’ve used is to script a search query in PowerShell, load the results into a formatted HTML table, and schedule the script at the desired frequency. I’ll outline the framework below, which is easily adapted and extended. Note I wrote this for FAST, but using the SharePoint Search Query classes you can achieve similar results in regular SharePoint search.
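
For the scheduling part (not covered in the original write-up), one common approach is a Windows scheduled task that runs the script under an account with the needed permissions; the task name and script path below are placeholders:

# sketch: run the alert script every morning at 6:00
schtasks /Create /SC DAILY /ST 06:00 /TN "SharePointSearchAlerts" /TR "powershell.exe -File C:\Scripts\SearchAlert.ps1"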

First, let’s establish some basics about search and notification. For this solution, I only want to return documents that are up to two or three days old, so I grab the date, subtract two days, and put it into a format we can use later in the query:

$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD

Let’s now set up the search query. I chose to use FQL (FAST Query Language), but you can use Keyword syntax. Note SQL Search is deprecated. I chose 50 results, but you can choose whatever amount you prefer:

$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default

Now let’s make sure the query returns the fields you want. These must be configured as Managed Properties, followed by a Full Crawl.

$vc.SelectProperties.Add("CompanyName")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("CompanyClaimNumber")
$vc.SelectProperties.Add("CompanyPolicyNumber")
$vc.SelectProperties.Add("CompanyModified")
$vc.SelectProperties.Add("CompanyEffectiveYear")

Now let’s piece together the FQL itself. Note two strings ($q1 and $q2) are used to construct the query, which lands in $BigQ along with the date we formatted earlier. We’re looking for documents newer than two days ago, where a particular field (CompanyClaimDocumentType) equals a specific value (“Specific Value”). Then we execute the FQL:

$q1='and(filter(CompanyModified:range(datetime("'
$q2='"), max, from="GT")), filter(CompanyClaimDocumentType:equals("Specific Value")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()

Now let’s convert the search results into a DataTable, to make it easy to shape into an HTML table for the outbound email alert notification. We’ll define the columns and load the values. One nice touch is to shape the link column so a hyperlink is embedded in the table, for easy user access to the documents. I also structure a special link using the DMF:// protocol supported by MacroView DMF:

$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
$row = $table.NewRow()
$row.Link = "<a href="+'"'+ $rows[$i].URL +'">'+ " Link " + "</a>";
#$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
$row.Title = $rows[$i].Title;
$row.InsuredName= $rows[$i].CompanyName;
$row.ClaimNumber= $rows[$i].CompanyClaimNumber;
$row.PolicyNumber= $rows[$i].CompanyPolicyNumber;
$row.EffectiveYear= $rows[$i].CompanyEffectiveYear;
$row.FileName= $rows[$i].FileName;
$row.Modified= $rows[$i].CompanyModified.substring(0,10);
$table.Rows.Add($row)
}

Now we want to shape this into a table in the outbound email. To do that we’ll use the handy ConvertTo-Html cmdlet. Just pass in the column names you want to appear; above I map more than we use below. I shape the borders and colors for a really professional look. However, we have a sticky problem: we don’t want it to mangle the embedded HTML, such as the angle brackets around the a href for links. To solve that, I pipe the stream into a routine called Convert-HTMLEscape that I will outline shortly. Note the nice H1 header, and how we handle a singular/plural number of rows in the subject line and at the top of the email:

$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "<H1>Recent Custom Reports</H1>" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table| ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Property Adjuster Report$($Plural)</H1>"  | Convert-HTMLEscape

Here’s the Convert-HTMLEscape function. It’s a little dense, but it works within the pipeline and does what we need: it converts the escaped equivalents of the common HTML characters back to the characters themselves, basically undoing the escaping done by ConvertTo-Html:

Function Convert-HTMLEscape {
# convert &lt; and &gt; to < and >.  It is assumed that these will be in pairs
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $text) {
if ($item -match "&lt;") {
<#
replace codes with actual symbols. This line is a shortcut to do multiple replacements
with one line of code.
#>
(($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function

Now for the easy part: generating the email. Note the message sets IsBodyHtml to $true and goes out via SMTP:

#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = "Daily Report of updated custom Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.MyDomain.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)

Now let’s put it all together. Note I always set an $env for the environment, to make it easy to test in Dev, before deploying in Production.

# SharePoint Search Alerts
Clear-Host
Write-Host "Start LC Alerts" -ForegroundColor darkblue
Add-PsSnapin Microsoft.SharePoint.PowerShell -erroraction silentlycontinue
$ToRecipients = "MyADGroup@MyDomain.com"
$CCRecipients = "joelplaut@MyDomain.com,Bozo@MyDomain.com"
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
$env="Prod"
if ($env -eq "Dev")
{
$webappurl = "http://devFarm/" # Replace with URL of web application that you wish to query
$filterpath = "http://devFarm/insureds"
}
else
{
$webappurl = "http://SharePoint/" # Replace with URL of web application that you wish to query
$filterpath = "http://SharePoint/insureds" #path for exercising previews
}
Function Convert-HTMLEscape {
<#
convert &lt; and &gt; to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $text) {
if ($item -match "&lt;") {
<#
replace codes with actual symbols
This line is a shortcut to do multiple replacements
with one line of code.
#>
(($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function
$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD
$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default
$vc.SelectProperties.Add("CompanyName")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("CompanyClaimNumber")
$vc.SelectProperties.Add("CompanyPolicyNumber")
$vc.SelectProperties.Add("CompanyModified")
$vc.SelectProperties.Add("CompanyEffectiveYear")
#Query / Result
$q1='and(filter(CompanyModified:range(datetime("'
$q2='"), max, from="GT")), filter(CompanyClaimDocumentType:equals("Property Adjuster Reports")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()
#$results
$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
$row = $table.NewRow()
$row.Link = "<a href="+'"'+ $rows[$i].URL +'">'+ " Link " + "</a>";
#$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
$row.Title = $rows[$i].Title;
$row.InsuredName= $rows[$i].CompanyName;
$row.ClaimNumber= $rows[$i].CompanyClaimNumber;
$row.PolicyNumber= $rows[$i].CompanyPolicyNumber;
$row.EffectiveYear= $rows[$i].CompanyEffectiveYear;
$row.FileName= $rows[$i].FileName;
$row.Modified= $rows[$i].CompanyModified.substring(0,10);
$table.Rows.Add($row)
}
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
#Filename removed at My's suggestion
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "<H1>Recent Custom Reports</H1>" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table| ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Custom Report$($Plural)</H1>"  | Convert-HTMLEscape
#$MyOutput > C:\A.html #debug technique
#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = "Daily Report of updated Property Adjuster Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.MyDomain.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)
} #don't bother, no new hits

Structure Document IDs to be unique across Site Collections

Configure Document IDs to be unique across the farm

Document IDs are only guaranteed unique within a single site collection. SharePoint tries to ensure uniqueness by putting a random prefix in front of each Document ID, set at the Site Collection level. However, you can easily rationalize these and make the Document IDs a bit easier to read. The script below assigns a prefix and sets up SharePoint to reissue Document IDs. Note the actual regeneration of Document IDs waits until the Document ID Assignment timer job runs; this job can take a long time, depending on the number of items in your Site Collections.

[system.reflection.assembly]::LoadWithPartialName("Microsoft.Office.DocumentManagement") 
$siteUrl = "http://sharepoint/ManagedPath"  #this is the header prefixing my Site Collections
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"  #Here are the individual Site Collections
$LoopStringArr = $LoopString.Split(",")
foreach ($letter in $LoopStringArr)
{
$SiteName=$siteurl+$letter
write-host $SiteName
$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
[Microsoft.Office.DocumentManagement.DocumentID]::EnableAssignment($Site,$false)   #First disable, then enable DocID assignment
[Microsoft.Office.DocumentManagement.DocumentID]::EnableAssignment($Site,$true)
$rootweb=$site.rootweb
$rootweb.properties["docid_msft_hier_siteprefix"]="Ins$letter"  # This is the property holding the Document ID Prefix which we use to ensure uniqueness
$rootweb.properties.Update()
$rootweb.Update()
[Microsoft.Office.DocumentManagement.DocumentID]::EnableAssignment($Site,$true,$true,$true)  # now we can force all Document IDs to be reissued
}
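
If you don’t want to wait for the schedule, you can also nudge the assignment job yourself. Here’s a hedged sketch (the web application URL is an assumption, and the job name varies by version, so this matches on a pattern rather than assuming an exact name):

# sketch: kick off the Document ID assignment timer job for a web application
$webApp = Get-SPWebApplication "http://sharepoint"   #assumed Web App URL; adjust to yours
Get-SPTimerJob -WebApplication $webApp | Where-Object { $_.Name -like "*DocId*" } | ForEach-Object { Start-SPTimerJob $_ }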

For more information on the methods used above, please reference MSDN

Trusted MySite Host Location

How often can one make a change in Development that blows up Production?  Within the User Profile Service Application one can configure the MySite Host Location.  However, watch out: I made a change in a Dev environment that propagated and affected all Production users.

In Dev, I set the Trusted Host Location without setting a target audience.  This broadcast to all farms (including Production) that this location should be the default.  Users attempting to go to their MySite were instantly redirected to Dev!  Of course DNS and Production didn’t offer a clue.  Removing the Trusted Host location in Dev eliminated the problem.

In a related annoyance, the My Site Host location in Setup My Sites defaults back to the Default Zone for the MySite web app.  With AAM set up correctly, it will change any entry back to the default zone.  So to change the MySite Host Location, it seems I’ll need to rebuild the web app with the desired default MySite URL.  Not too convenient, since I have quite a few Web App custom settings, such as maximum file upload size and the Super User publishing cache account definitions.