Limiting Search Crawling to a subsite

I had an interesting challenge.  I was asked to limit Search Crawling to a single subsite.  The underlying issue was that a great deal of security in this farm was implemented via Audiences, which are not a secure method of locking down content: Audiences control which documents and items are shown to users, but don’t prevent a user from actually accessing them.  Search Content Sources expect nice, simple Web Application URLs to crawl.  So how best to restrict crawling to a subsite?

The simple answer is to set up the Content Source to crawl the whole Web Application, but set up Crawl Rules to exclude everything else.  Only two rules are needed:

  1. Include: List the site to include, such as “http://sharepoint/sites/site1/site2/*.*”
    Note the *.* at the end, to ensure all sub-content is crawled.  Being the first crawl rule, this takes precedence over the next. Don’t forget the *.*
    Testing a crawl rule with just a * will appear to capture all content, but at crawl time only a *.* will capture content with a file extension.
  2. Exclude: List everything else: http://*.*
    This will exclude anything not captured in the first rule.

One caveat: if you have a content source that includes people (sps3://sharepoint), be sure to use a wildcard on the protocol of the exclusion as well.
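These rules can also be created in PowerShell. Here is a minimal sketch using the Search Administration cmdlets; the site URL is a placeholder for your farm, and you should verify the rule order in Central Admin afterwards:

```powershell
# Sketch: create the include/exclude crawl rules from PowerShell.
$ssa = Get-SPEnterpriseSearchServiceApplication

# Rule 1: include the subsite and everything beneath it (note the *.*)
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa `
    -Path "http://sharepoint/sites/site1/site2/*.*" -Type InclusionRule

# Rule 2: exclude everything else; the wildcard protocol also covers sps3://
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa `
    -Path "*://*.*" -Type ExclusionRule
```

Rule order matters: the inclusion must sit above the exclusion for the subsite to survive the crawl.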

Voila!

Quickly finding the SharePoint Search index

It is useful to be able to quickly locate the SharePoint 2013 search index on a farm.

For example, you may want to exclude it from antivirus scans.  It's also handy when checking security, monitoring disk usage, or relocating the index.
Here’s how:

This set of commands will give you details on the search topology:

$ssa = Get-SPServiceApplication -Name "Search Service Application"
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
Get-SPEnterpriseSearchComponent -SearchTopology $active

To simply locate the search index on the file system, use these commands:

$ssi = Get-SPEnterpriseSearchServiceInstance
$ssi.Components
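On a larger farm, $ssi.Components returns one entry per component. Here is a small sketch to narrow the output to just the index components and their on-disk locations; the IndexLocation property name is as observed on SP2013, so verify it on your build:

```powershell
$ssi = Get-SPEnterpriseSearchServiceInstance
# Show only index components, with the folder each one uses
$ssi.Components | Where-Object { $_.Name -like "IndexComponent*" } |
    Select-Object Name, IndexLocation
```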

Running SharePoint 2013 search within a limited RAM footprint

SharePoint 2013 search is very powerful, but if you have limited server resources, it can easily get the better of your environment.  I’ve seen a small SharePoint 2013 environment go unstable: w3wp processes crashing, ULS logs filling with low-memory errors, the search index going “Degraded” during a crawl, end-user searches returning correlation errors, and even sites and Central Admin returning 500 errors, all for want of a few more GB of RAM.  An IIS reset gets the server responsive again, and an index reset will get SharePoint crawling again, but short of tossing in precious RAM chips, what’s a caring administrator to do?  Let’s first see how to determine whether your search index is degraded:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchStatus
Name State Description
---- ----- -----------
IndexComponent1 Degraded
Cell:IndexComponent1-SPb5b3474c2cdcI.0.0 Degraded
Partition:0 Degraded
AdminComponent1 Active
QueryProcessingComponent1 Active
ContentProcessingComponent1 Active
AnalyticsProcessingComponent1 Active
CrawlComponent0 Active

In the example above, note the Index component is degraded.  In Central Admin, simply do an Index Reset to get things back on their feet, and restart the World Wide Web Publishing Service to kick-start IIS and its app pools.  The command below lowers the priority of Search, so it doesn’t overwhelm our under-resourced farm:

set-SPEnterpriseSearchService -PerformanceLevel Reduced

Next, let’s limit the RAM utilized by the NodeRunner processes, which host the search components.  Their configuration file lives here, on C: or perhaps a different drive letter on your system:

C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0

Open the file noderunner.exe.config in a text editor (Notepad is fine, especially if your farm is wheezing from being RAM-challenged).  Change the memoryLimitMegabytes value from 0 to 180. Note I would not run with less than 180MB per NodeRunner, as I’ve seen search components fail to start as a result:

<nodeRunnerSettings memoryLimitMegabytes="180" />

Try another crawl; the RAM usage should now be much more stable.
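If you prefer to script the edit, here's a hedged sketch; it assumes the default install path and that briefly restarting the SharePoint Search Host Controller on this server is acceptable:

```powershell
# Assumes the default install path; adjust the drive letter as needed.
$config = "C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0\noderunner.exe.config"
[xml]$xml = Get-Content $config
# 0 means unlimited; cap each NodeRunner at 180MB (I would not go lower)
$xml.configuration.nodeRunnerSettings.memoryLimitMegabytes = "180"
$xml.Save($config)
# The new limit takes effect once the host controller restarts
Restart-Service SPSearchHostController
```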

Here’s how to tell where your index is located on disk:

$ssi = Get-SPEnterpriseSearchServiceInstance
$ssi.Components

Here’s how to get the topology and the Index component:

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$iComponent = $active | Get-SPEnterpriseSearchComponent -Identity IndexComponent1

Report on all Search Site references across SharePoint Site Collections

I got an interesting request recently: find all search centers configured across all Site Collections. I thought I would share the very simple script to do this:

Get-SPSite -Limit All | % {
    Write-Host "$($_.Url),$($_.RootWeb.AllProperties["SRCH_ENH_FTR_URL"])"
}
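If a file is more useful than console output, the same loop can emit a proper CSV. A sketch (the output path is a placeholder, and the [PSCustomObject] syntax needs PowerShell 3.0+):

```powershell
# Same report, written to CSV instead of the console
Get-SPSite -Limit All | ForEach-Object {
    [PSCustomObject]@{
        Url          = $_.Url
        SearchCenter = $_.RootWeb.AllProperties["SRCH_ENH_FTR_URL"]
    }
} | Export-Csv -Path "C:\temp\SearchCenters.csv" -NoTypeInformation
```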

You can set the search dropdown mode property using this assignment:

$web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD_DefaultContextual"

Here are the possible SRCH_SITE_DROPDOWN_MODE values and what they mean:

Site Collection Search Dropdown Mode                            Property Value                   Search Results URL
Do not show scopes dropdown, default to contextual scope        HideScopeDD_DefaultContextual    Y
Do not show scopes dropdown, default to target results page     HideScopeDD                      N
Show scopes dropdown                                            ShowDD                           Y
Show, and default to ‘s’ URL parameter                          ShowDD_DefaultURL                Y
Show, and default to contextual scope                           ShowDD_DefaultContextual         Y
Show, do not include contextual scopes                          ShowDD_NoContextual              N
Show, do not include contextual scopes, default to ‘s’ URL      ShowDD_NoContextual_DefaultURL   N

Here’s the full PowerShell script to set these values (note the property name SRCH_TRAGET_RESULTS_PAGE; the misspelling is SharePoint’s own):

$web = Get-SPWeb http://sharepoint/managedpath/site
$web.AllProperties["SRCH_ENH_FTR_URL"] = "/search/"
$web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD_DefaultContextual"
$web.AllProperties["SRCH_TRAGET_RESULTS_PAGE"] = "/_layouts/OSSSearchResults.aspx"
$web.Update()

A link straight to a SharePoint document’s metadata

Often users want a link directly to a document’s metadata. That’s easily done using this format:
http://SharePoint/sites/SiteCol/DemoMajorV2/Forms/DispForm.aspx?ID=[x]

Here’s a sample link to a document’s metadata properties; just add the ID:
http://SharePoint/sites/SiteCol/DemoMajorV2/Forms/DispForm.aspx?ID=285

To walk one through, I took a random document:
http://SharePoint/sites/SiteCol/DemoMajorV2/TestDoc.docx

Found its ID in the browser by adding the ID column to a View:
http://SharePoint/sites/SiteCol/DemoMajorV2/Forms/My%20Documents.aspx

Then took the format:
http://SharePoint/sites/SiteCol/DemoMajorV2/Forms/DispForm.aspx?ID=[x]

and added the number to it:
http://SharePoint/sites/SiteCol/DemoMajorV2/Forms/DispForm.aspx?ID=285

That same format can be used within the search XSL to add a reference to view the document’s metadata in search results. Here’s the XSL to paste into the XSL field in Core Search Results:

<div class="srch-Title3">
<xsl:variable name="itemid" select="ItemID"/>
<xsl:choose>
<xsl:when test="contentclass[. = 'STS_ListItem_DocumentLibrary']">
<xsl:choose>
<xsl:when test="contains(basic4,'http')">
<xsl:variable name="library" select="substring-after(substring-after(url,basic4),'/')" />
<xsl:variable name="displayUrl" select="concat(basic4, '/', substring-before($library,'/'),'/Forms/DispForm.aspx?ID=',$itemid)" />
<a href="{$displayUrl}">
Show properties
</a>
</xsl:when>
<xsl:otherwise>
<xsl:variable name="DocLib" select="substring-after(substring-after(url,sitename),'/')" />
<xsl:variable name="MetaDataPath" select="concat(sitename, '/', substring-before($DocLib,'/'),'/Forms/DispForm.aspx?ID=',$itemid)" />
<a href="{$MetaDataPath}">
Show properties
</a>
</xsl:otherwise>
</xsl:choose>
<a href="{sitename}">
Show library
</a>
<br></br>
</xsl:when>
<xsl:otherwise>
</xsl:otherwise>
</xsl:choose>
</div>

Tuning SharePoint Search Ranking

SharePoint Search results are returned in order of relevancy, which is determined by a ranking model. There are a number of ranking models cooked into SharePoint 2010. These can be refined to a limited extent, with a bit of insight, to better serve users.

To see the models and their definition, let’s query the SharePoint Search application DB:

SELECT * FROM [Search_Service_Application_DB].[dbo].[MSSRankingModels]

The resultset contains the models: the GUID, whether each is the default, and the underlying XML that specifies the model. The model name is at the beginning of the XML.

Using PowerShell, we can get the array of ranking models; PowerShell is also the only supported approach for manipulating the models, changing the default, and creating new ranking models. Here’s how to get the models:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel

Now we can assign the ranking model array to a variable and index into it:

$A = Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel

Or we can grab the one ranking model we like by its GUID, which we have to predetermine; that’s easy, as it’s returned by the query above and is unchanging. For new models, we get to specify the GUID as well.

Once you know your rank model GUID, you can switch to it by getting it, and setting it as default:

$r = Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel 8f6fd0bc-06f9-43cf-bbab-08c377e083f4
$r.MakeDefault()
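To confirm which model is now the default, you can list them all. A sketch; the IsDefault flag is exposed on each ranking model object, as observed on SP2010:

```powershell
# List every ranking model with its GUID, flagging the current default
Get-SPEnterpriseSearchServiceApplication |
    Get-SPEnterpriseSearchRankingModel |
    Select-Object Name, ID, IsDefault
```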

To create a custom rank model, first identify the Managed Properties, by PID. The name is part of the XML, but it is the PID that drives the ranking. Here’s how to get all the Managed Properties and their PIDs:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchMetadataManagedProperty

Now we create a new ranking model called MyRank2. Note I want the LastModifiedTime property to be highly relevant:

Get-SPEnterpriseSearchServiceApplication | New-SPEnterpriseSearchRankingModel –rankingmodelxml "<?xml version='1.0'?><rankingModel name='MyRank2' id='8447b4bc-3582-45c5-9cb8-ba2a319d850e' description='CustomJoelRank2' xmlns='http://schemas.microsoft.com/office/2009/rankingModel'>
<queryDependentFeatures>
<queryDependentFeature name='Body' pid='1' weight='0.00125145559138435' lengthNormalization='0.0474870346616999'/>
<queryDependentFeature name='LastModifiedTime' pid='4' weight='3.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Title' pid='2' weight='1.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Author' pid='3' weight='0.410225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='DisplayName' pid='56' weight='0.570071355441683' lengthNormalization='0.552529462971364'/>
<queryDependentFeature name='ExtractedTitle' pid='302' weight='1.67377875011698' lengthNormalization='0.600572652201123'/>
<queryDependentFeature name='SocialTag' pid='264' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
<queryDependentFeature name='QLogClickedText' pid='100' weight='1.87179361911171' lengthNormalization='3.31081658691434'/>
<queryDependentFeature name='AnchorText' pid='10' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
</queryDependentFeatures>
<queryIndependentFeatures>
<queryIndependentFeature name='ClickDistance' pid='96' default='5' weight='1.86902034145632'>
<transformInvRational k='0.0900786349287429'/>
</queryIndependentFeature>
<queryIndependentFeature name='URLDepth' pid='303' default='3' weight='1.68597497899313'>
<transformInvRational k='0.0515178916330992'/>
</queryIndependentFeature>
<queryIndependentFeature name='Lastclick' pid='341' default='0' weight='0.219043069749249'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<languageFeature name='Language' pid='5' default='1' weight='-0.56841237556044'/>
</queryIndependentFeatures>
</rankingModel>"

There are two parts to the model: the query-dependent section, which is associated with the actual query and its metadata, and the query-independent part, which ranks based on things like the number of slashes in the URL (URLDepth) and click frequency.

As soon as a model is made the default, you can see the effect of the new ranking model.

Here’s how to change this model. Note I add a new field called MyCompany and boost its relevance:

Get-SPEnterpriseSearchServiceApplication | Get-SPEnterpriseSearchRankingModel 8447b4bc-3582-45c5-9cb8-ba2a319d850e | Set-SPEnterpriseSearchRankingModel –rankingmodelxml "<?xml version='1.0'?><rankingModel name='CustomJoelRank2' id='8447b4bc-3582-45c5-9cb8-ba2a319d850e' description='MyRank2' xmlns='http://schemas.microsoft.com/office/2009/rankingModel'>
<queryDependentFeatures>
<queryDependentFeature name='Body' pid='1' weight='0.00125145559138435' lengthNormalization='0.0474870346616999'/>
<queryDependentFeature name='MyCompany' pid='414' weight='3.610225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='Title' pid='2' weight='0.46602125767061' lengthNormalization='0.549393313908594'/>
<queryDependentFeature name='Author' pid='3' weight='0.410225403867996' lengthNormalization='1.0563226501349'/>
<queryDependentFeature name='DisplayName' pid='56' weight='0.570071355441683' lengthNormalization='0.552529462971364'/>
<queryDependentFeature name='ExtractedTitle' pid='302' weight='1.67377875011698' lengthNormalization='0.600572652201123'/>
<queryDependentFeature name='SocialTag' pid='264' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
<queryDependentFeature name='QLogClickedText' pid='100' weight='1.87179361911171' lengthNormalization='3.31081658691434'/>
<queryDependentFeature name='AnchorText' pid='10' weight='0.593169953073459' lengthNormalization='2.28258134389272'/>
</queryDependentFeatures>
<queryIndependentFeatures>
<queryIndependentFeature name='ClickDistance' pid='96' default='5' weight='1.86902034145632'>
<transformInvRational k='0.0900786349287429'/>
</queryIndependentFeature>
<queryIndependentFeature name='URLDepth' pid='303' default='3' weight='1.68597497899313'>
<transformInvRational k='0.0515178916330992'/>
</queryIndependentFeature>
<queryIndependentFeature name='Lastclick' pid='341' default='0' weight='0.219043069749249'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<queryIndependentFeature name='CustomJoelModified' pid='445' default='1' weight='2.56841237556044'>
<transformRational k='5.44735200915216'/>
</queryIndependentFeature>
<languageFeature name='Language' pid='5' default='1' weight='1.5'/>
</queryIndependentFeatures>
</rankingModel>"

I admittedly did not have success ranking by how recently a document was updated, known as “Freshness”. SP2010 has very limited ability to customize ranking, and a simple freshness ranking seems infuriatingly out of reach; the SP2010 transforms are much more obscure and not really usable. However, SP2013 supports freshness explicitly. While the default SharePoint 2013 ranking model doesn’t boost the rank of search results based on their freshness, we can achieve this by adding a tuning of the static rank that combines information from the LastModifiedTime managed property with the DateTimeUtcNow query property, using the freshness transform function. Transform functions are used to customize ranking in SP2013; the freshness transform is the only transform usable for this freshness rank feature, because it converts the age of the item from an internal representation into days. Even before moving to SP2013, we can have an SP2013 farm configured to crawl production SP2010 and return results tuned in this way, and can use the SP2013 search results to serve any client we choose to point to SP2013, including a simple search site in SP2013.

Reporting on all SharePoint Search Scopes

Search scopes are often created to refine the results returned in SharePoint Search. I’ve written this small snippet of PowerShell as an easy way to get a report on all scopes. I decided not to embellish it, and keep it quick and (not too) dirty; here goes:

$a = Get-SPEnterpriseSearchServiceApplication # grabs Content and Query
$scopes = $a | Get-SPEnterpriseSearchQueryScope
foreach ($Scope in $scopes)
{
    Write-Host $Scope.Name
    Write-Host "======================="
    $Scope.Rules  # outputs all the rules
}

FAST SharePoint Property Mapping Report

In Search, it is not an exaggeration to say the property mapping is the heart of the customized search intelligence. Managed properties allow search to be customized to serve an organization’s needs. I thought it would be useful to report on the mapping of managed properties to crawled properties. I use a CSV format for the report, where a ‘|’ is the delimiter and a semicolon separates crawled properties. Using Excel, one can easily convert pipe-delimited text into columns.

The first thing we want to do is get the collection of managed properties using Get-FASTSearchMetadataManagedProperty. Then for each Managed Property, we get the associated crawled properties using the getcrawledPropertyMappings() method. Here’s the full script:

$ReportFileName = "C:\temp\MappingReport.csv"
$sep = '|'
cls
$LineOut = "Name$($sep)Description$($sep)Type$($sep)Mapping"
Add-Content $ReportFileName $LineOut
$mps = Get-FASTSearchMetadataManagedProperty
foreach ($MP in $mps)
{
    $q = $MP.GetCrawledPropertyMappings()
    $CPs = $null
    if ($q.GetType().Name -eq "CrawledPropertyMappingImpl")
    {
        foreach ($cp in $q)
        {
            $CPs = "$($CPs);$($cp.Name)"
        }
        if ($CPs -ne $null)
        {
            $CPs = $CPs.Remove(0,1)  # drop the leading semicolon
        }
    }
    else
    {
        $CPs = $q.GetType().Name
    }
    $LineOut = "$($MP.Name)$($sep)$($MP.Description)$($sep)$($MP.Type)$($sep)$($CPs)"
    Add-Content $ReportFileName $LineOut
}

View all Crawled Properties for a given SharePoint Document

I often need to examine all the properties of a document. This is most useful for researching issues relating to crawled property values.

In this little PowerShell function I grab the SPItem and split its internal XML with line feeds. Here’s the function:

Function Get-CrawledPropertyNames([string]$DocURL) {
    $DocURL = $DocURL.Replace("%20", " ")
    $webfound = $false
    $weburl = $DocURL
    while ($webfound -eq $false) {
        if ($weburl.Contains("/")) {
            # Walk up the URL one segment at a time until we find the SPWeb
            $weburl = $weburl.Substring(0, $weburl.LastIndexOf("/"))
            $web = Get-SPWeb -Identity $weburl -ea 0
            if ($web -ne $null) {
                $webfound = $true
            }
        } else {
            Write-Host -ForegroundColor Red "The Web could not be found"
            return -1
        }
    }
    # Split the item's XML into lines for readability
    $web.GetFile($DocURL).Item.Xml.Replace("' ", "' `n").Replace("`" ", "`" `n")
}

#To use, simply pass the URL of a file within a document library, here’s an example:

#Get-CrawledPropertyNames "http://SharePoint/sites/SPWeb/Library/folder/FileName.DOC"

Customized scheduled SharePoint Search Alerts in HTML

Occasionally I get requests for customized notification about documents within a SharePoint farm. Regular Alerts and SharePoint Search Alerts work great out of the box, but sometimes users want something more such as:
– Complex criteria
– Custom sort sequence
– Custom metadata
– Customized notification frequency
– Custom message text, or subject line
– Refined layout

The solution I’ve used is to script a search query in PowerShell, load the results into a formatted HTML table, and schedule it at the desired frequency. I’ll outline the framework below; it is easily adapted and extended. Note I wrote this for FAST, but using the SharePoint Search Query classes you can achieve similar results in regular SharePoint search.

First, let’s establish some basics about search and notification. For this solution, I only want to return documents that are up to two or three days old, so I grab the date, take two off it, and put it into a format we can use later for the query:

$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD
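As an aside, a .NET custom date format string can build the same value in one call; a minimal equivalent sketch:

```powershell
# Equivalent one-liner: cutoff date formatted as YYYY-MM-DD
$MyDateStr = (Get-Date).AddDays(-2).ToString("yyyy-MM-dd")
```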

Let’s now set up the search query. I chose to use FQL (FAST Query Language), but you can use Keyword Query syntax; note SQL Search is deprecated. I chose 50 results, but you can choose whatever amount you prefer:

$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default

Now let’s make sure the query returns the fields you want. These must be configured as Managed Properties, followed by a Full Crawl:

$vc.SelectProperties.Add("Company Name")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("Company ClaimNumber")
$vc.SelectProperties.Add("Company PolicyNumber")
$vc.SelectProperties.Add("Company Modified")
$vc.SelectProperties.Add("Company EffectiveYear")

Now let’s piece together the XML of the FQL. Note two strings ($q1 and $q2) are used to construct the query and put into $BigQ, with the date we formatted earlier. We’re looking for documents newer than two days ago, where a particular field CompanyClaimDocumentType equals a specific value (“Specific Value”). Then we execute the FQL:

$q1='and(filter(Company modified:range(datetime("'
$q2='"), max, from="GT")), filter(CompanyClaimDocumentType:equals("Specific Value")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()

Now let’s convert the search results into a DataTable, to make it easy to shape into an HTML table for the outbound email alert notification; we’ll define the columns and load the values. One nice touch is to shape the link column so a hyperlink is embedded in the table for easy user access to the documents. I also structure a special link using the DMF:// protocol supported by MacroView DMF:

$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
$row = $table.NewRow()
# Build an embedded hyperlink; URL is the managed property selected above
$row.Link = "<a href='" + $rows[$i].URL + "'>Link</a>";
#$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
$row.Title = $rows[$i].Title;
$row.InsuredName = $rows[$i].CompanyName;
$row.ClaimNumber = $rows[$i].CompanyClaimNumber;
$row.PolicyNumber = $rows[$i].CompanyPolicyNumber;
$row.EffectiveYear = $rows[$i].CompanyEffectiveYear;
$row.FileName = $rows[$i].FileName;
$row.Modified = $rows[$i]."Company Modified".Substring(0,10);
$table.Rows.Add($row)
}
} # end if

Now we want to shape this into a table in the outbound email. To do that we’ll use the handy ConvertTo-Html cmdlet: just pass in the column names you want to appear (above I map more columns than we use below). I shape the borders and colors for a really professional look. However, we have a sticky problem: we don’t want ConvertTo-Html to escape the embedded HTML, such as the brackets around the a href links. To solve that, I pipe the stream into a routine called Convert-HTMLEscape, which I will outline shortly. Note the nice H1 header, and that we nicely handle a singular/plural number of rows in the subject line and top of the email:

$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "<H1>Recent Custom Reports</H1>" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table | ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Property Adjuster Report$($Plural)</H1>" | Convert-HTMLEscape

Here’s the Convert-HTMLEscape function. It’s a little dense, but it works within the pipeline and does what we need: it converts the HTML-escaped equivalents of the common markup characters back to the correct characters, basically undoing the bit of a mess made by ConvertTo-Html:

Function Convert-HTMLEscape {
# convert &lt; and &gt; back to < and >. It is assumed that these will be in pairs.
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $Text) {
if ($item -match "&lt;") {
<#
replace codes with actual symbols. This line is a shortcut to do
multiple replacements with one line of code; it also restores
&quot; to a double quote.
#>
$item.Replace("&lt;","<").Replace("&gt;",">").Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function
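A quick usage sketch showing the intent: feed it a line the way ConvertTo-Html escaped it, and the markup comes back.

```powershell
# Intended result: <a href="http://sharepoint">Link</a>
"&lt;a href=&quot;http://sharepoint&quot;&gt;Link&lt;/a&gt;" | Convert-HTMLEscape
```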

Now the easy part: let’s generate the email. Note the format sets IsBodyHtml to $true, and uses SMTP:

#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = "Daily Report of updated custom Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.Company limited.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)

Now let’s put it all together. Note I always set an $env variable for the environment, to make it easy to test in Dev before deploying to Production.

# SharePoint Search Alerts
Clear-Host
Write-Host "Start LC Alerts" -ForegroundColor darkblue
Add-PsSnapin Microsoft.SharePoint.PowerShell -erroraction silentlycontinue
$ToRecipients = "MyADGroup@MyDomain.com"
$CCRecipients = "joelplaut@MyDomain.com,Bozo@MyDomain.com"
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
$env="Prod"
if ($env -eq "Dev")
{
$webappurl = "http://devFarm/" # Replace with URL of the web application that you wish to query
$filterpath = "http://devFarm/insureds"
}
else
{
$webappurl = "http://SharePoint/" # Replace with URL of the web application that you wish to query
$filterpath = "http://SharePoint/insureds" #path for exercising previews
}
Function Convert-HTMLEscape {
<#
convert &lt; and &gt; back to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $Text) {
if ($item -match "&lt;") {
<#
replace codes with actual symbols. This line is a shortcut to do
multiple replacements with one line of code; it also restores
&quot; to a double quote.
#>
$item.Replace("&lt;","<").Replace("&gt;",">").Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function
$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD
$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default
$vc.SelectProperties.Add("Company Name")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("Company ClaimNumber")
$vc.SelectProperties.Add("Company PolicyNumber")
$vc.SelectProperties.Add("Company Modified")
$vc.SelectProperties.Add("Company EffectiveYear")
#Query / Result
$q1='and(filter(Company modified:range(datetime("'
$q2='"), max, from="GT")), filter(Company claimdocumenttype:equals("Property Adjuster Reports")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()
#$results
$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
$row = $table.NewRow()
# Build an embedded hyperlink; URL is the managed property selected above
$row.Link = "<a href='" + $rows[$i].URL + "'>Link</a>";
#$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
$row.Title = $rows[$i].Title;
$row.InsuredName = $rows[$i].CompanyName;
$row.ClaimNumber = $rows[$i].CompanyClaimNumber;
$row.PolicyNumber = $rows[$i].CompanyPolicyNumber;
$row.EffectiveYear = $rows[$i].CompanyEffectiveYear;
$row.FileName = $rows[$i].FileName;
$row.Modified = $rows[$i]."Company Modified".Substring(0,10);
$table.Rows.Add($row)
}
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
#Filename removed at My's suggestion
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "<H1>Recent Custom Reports</H1>" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table | ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Custom Report$($Plural)</H1>" | Convert-HTMLEscape
#$MyOutput > C:\A.html #debug technique
#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@Company limited.com"
[string] $To = $null; #$Recipients #"joelplaut@Company limited.com"
[string] $Title = "Daily Report of updated Property Adjuster Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.Company limited.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)
} #don't bother, no new hits