Scripting SharePoint logging to ULS and Event Log

It’s easy to dump output to a text file in a script, but for enterprise-class logging, the two standards are the Event Log and ULS (Unified Logging System). Let’s start with ULS.

Below, in PowerShell, I grab a reference to the SPDiagnosticsService, define an SPDiagnosticsCategory, and then call the WriteTrace() method:

$diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$diacategory = new-object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory",[Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
$diagSrc.WriteTrace(98765, $diacategory, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, "Write your log description here" )
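If you log to ULS from several scripts, the same calls can be wrapped in a small helper, analogous to the Write-MyLog function shown later for the Event Log. This is just a sketch; the category name and event ID are placeholders to replace with your own:

function Write-MyULSLog([string] $Message)
{
    # Get the local diagnostics service and define an ad-hoc category (placeholder names/IDs)
    $diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
    $category = New-Object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory", [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
    $diagSrc.WriteTrace(98765, $category, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, $Message)
}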

ULS is a good, standard, centralized way to go, but let’s move on to writing to the Event Log, which is especially useful since we can first create a custom application log:

New-EventLog -LogName MyCustomScripts -Source scripts

The first challenge is that if the log already exists, New-EventLog throws an error, even if you wrap it in a try/catch. The trick is to leverage the Get-EventLog cmdlet.

First, to see what logs already exist, list them:

Get-EventLog -list
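If the log already exists, you can skip New-EventLog entirely. A minimal sketch, assuming the log and source names used above:

if (-not (Get-EventLog -List | Where-Object { $_.Log -eq "MyCustomScripts" }))
{
    New-EventLog -LogName MyCustomScripts -Source scripts
}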

You now have your very own event log, and can write to it with your own event IDs, messages, and severity levels. Here are two working examples:

Write-EventLog -LogName MyCustomScripts -Source Scripts -Message "trying 4142 it works ... COOL!" -EventId 4142 -EntryType information
Write-EventLog -LogName MyCustomScripts  -Source Scripts -Message "trying 4942 as an error" -EventId 4942 -EntryType error

Now let’s simplify for re-use and consistency. Let’s declare some basics at the top of all scripts:

$Eventing = $true;  #determine if any events are written to the event log
$LogName = "JoelScripts"
$SourceName = "Scripts"
$ScriptID = 3; # unique number per script

Here’s a one-line function to make life simpler for our scripts:

function Write-MyLog([int] $EventID, [string] $Description, [system.Diagnostics.EventLogEntryType] $Severity)
{
    if ($Eventing)
    {
        Write-EventLog -LogName $LogName -Source $SourceName -Message $Description -EventId $EventID -EntryType $Severity
    }
}

Now let’s add a line at the start and end of the scripts to trigger an information event on what’s running. Note that references to $MyInvocation contain information about the currently running script:

Write-MyLog -Description "Start of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information
Write-MyLog -Description "End of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information

Lastly, here’s a sample message logged as a warning, and another logged as an error:

Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Warning
Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Error

A nice way to write events is to use the “Source” to map back to the script name or some other useful value for filtering. However, sources need to be pre-defined. Here’s how to define a source:

New-EventLog -LogName $LogName -Source "X"

The challenge is when to create this source. I find it’s best to declare the source only if it does not exist:

try
{
    # $Source holds the source name we want to register in the log
    $Sources = Get-EventLog -LogName $LogName | Select-Object Source -Unique
    $found = $false;
    foreach ($OneSource in $Sources)
    {
        if ($OneSource.Source -eq $Source)
        {
            $found = $true;
        }
    }
}
catch
{
    Write-Host "cannot find logfile, so we are in deep trouble"
}
if (!$found)
{
    New-EventLog -LogName $LogName -Source $Source
    Write-Host "Created new Source $($Source) in log name $($LogName)"
}
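As an alternative (not part of the original script), the .NET Framework can do the existence check directly; [System.Diagnostics.EventLog]::SourceExists() and ::Exists() are standard methods, though note SourceExists searches every log (including Security) when the source is not found, which may require elevated rights:

# Sketch: register the source only if .NET can't find it anywhere
if (-not [System.Diagnostics.EventLog]::SourceExists($Source))
{
    New-EventLog -LogName $LogName -Source $Source
    Write-Host "Created new Source $($Source) in log name $($LogName)"
}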

Generating automatic emails with embedded reports and link to uploaded CSV

There’s often a need to automate the generation of reports on SharePoint. I typically need to generate such a report based on:
1. Content using search: acquired via a Search Query
2. Content via manual crawl: walking through libraries, even sites, site collections and farms
3. Based on scripted actions taken, such as moving or copying documents on an automated basis.

Other examples I’ve done include reports on checked-out documents, custom-delivered to the people who have had documents checked out for 30+ days.

There are a few parts to the script below:
1. Acquire the dataset for the report. In this example I walk through all SPItems in a given list.
2. Define how the email gets sent, and to whom, leveraging HTML email
3. Generate the HTML for this report
4. Create a temporary CSV of this report, and upload it to a SharePoint library
5. Send the email, ensuring it contains the summary report and a link to the uploaded CSV

To store the dataset, I use the system.Data.DataTable object, and build from there. It’s easy to copy a DataTable, or drop columns after the HTML is constructed.

I use the ConvertTo-Html cmdlet to convert the DataTable, explicitly referencing the fields and the order I’d like in the HTML table. However, I have to fix up the escaped markup within it. For that I pipe it through a function called Convert-HTMLEscape.

I prefer to tune up the DataTable and drop any unnecessary fields, as Export-Csv does not allow a field list to be specified. I also specify -NoTypeInformation so the first row contains the actual field names, allowing the file to open correctly in Excel.
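For example, a CSV-only copy can be prepared like this before exporting (a sketch; the column name and path are illustrative):

$csvTable = $table.Copy()                  # keep the original table intact for the HTML pass
$csvTable.Columns.Remove("Link")           # drop any columns that don't belong in the CSV
$csvTable.Rows | Export-Csv -Path "C:\Temp\Report.csv" -NoTypeInformation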

Note how the URLs are prettied up on the fly, so they read nicely in the HTML table.

Claims Authentication uses a different account format, which I clean up in a separate function, Strip-ClaimsHeader.

Whether a file is checked out, the type of checkout, and who checked it out are some optional attributes I capture in the DataTable. This table is easily extended to support whichever metadata is desired.

Here’s the script:

if(!(Get-PSSnapin Microsoft.SharePoint.PowerShell -ea 0)) 
{ 
Write-Progress -Activity "Loading Modules" -Status "Loading Microsoft.SharePoint.PowerShell" 
Add-PSSnapin Microsoft.SharePoint.PowerShell 
} 
$ToRecipients = "joelplaut@MyDomain.com"
$CCRecipients = $ToRecipients; #set to any audience you want, semicolon separated
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
function Save-File([string] $Dir, [string] $PartialName, [system.Data.DataTable] $MyTable)
{
    # Writes the file to a temporary disk-based location. I add a timestamp to the name to ensure uniqueness. $PartialName is a text description used to start the filename.
    # Export-Csv does all the heavy lifting. Note the attribute "-NoTypeInformation" is needed so there isn't a type header on top of the field headers.
    $timestamp = Get-Date
    $timestampstr = $timestamp.Month.ToString("00") + "_" + $timestamp.Day.ToString() + "_" + $timestamp.Year.ToString("0000") + " " + $timestamp.Hour.ToString("00") + "_" + $timestamp.Minute.ToString("00") + "_" + $timestamp.Second.ToString("00")
    $FileName = $Dir + '\' + $PartialName + '_' + $timestampstr + '.CSV';
    $MyTable.Rows | Export-Csv -Path $FileName -NoTypeInformation
    return $FileName;
}
#writes the file to the Reporting Library within SharePoint (the filename already includes a timestamp to ensure uniqueness)
function Upload-File([string] $FilePath)
{
    # $RepList is assumed to already hold a reference to the SharePoint reporting library (SPList)
    $Files = $RepList.RootFolder.Files
    $DocLibName = $RepList.Title; #this is not necessarily accurate
    #Get the file name from the path
    $FileName = $FilePath.Substring($FilePath.LastIndexOf("\")+1)
    #Delete the file from the library if it already exists. This should never occur.
    try
    {
        if($Files.Item($DocLibName + "/" + $FileName))
        {
            $Files.Delete($DocLibName + "/" + $FileName)
        }
    } catch {}
    #Get the file
    $File = Get-ChildItem $FilePath
    #Add the file to the collection
    $x = $Files.Add($DocLibName + "/" + $FileName, $File.OpenRead(), $false)
    $ReportFileName = $RepList.ParentWeb.Url + '/' + $DocLibName + '/' + $FileName;
    return $ReportFileName;
}
# If we are running in a Claims Authentication environment, we can strip off the claims tags
Function Strip-ClaimsHeader ($s)
{
if ($s.IndexOf('#') -gt 0)  #handle stripping claims tags off name
{
return $s.Substring($s.IndexOf("#")+1)
}
else
{
return $s
}
}
#used for HTML processing and fixup; this is a bit dense, but it supports the PowerShell pipeline, and restores HTML tags
Function Convert-HTMLEscape {
<#
convert &lt; and &gt; to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
    [Parameter(Position=0,ValueFromPipeline=$True)]
    [string[]]$Text
)
Process {
    foreach ($item in $Text) {
        if ($item -match "&lt;") {
            <#
            Replace codes with actual symbols. This line is a shortcut to do two replacements
            with one line of code. The code in the first set of parentheses replaces "&lt;" with "<". This
            normally gets written to the pipeline. Wrapping it in parentheses tells PowerShell to treat it
            as an object so I can then call the Replace() method again and add the ">".
            #>
            (($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
        }
        else {
            #otherwise just write the line to the pipeline
            $item
        }
    }
} #close process
} #close function
$env = "Prod"  # I like setting flags for changing environment settings
if ($env -eq "Prod")
{
$RefListLocation = "http ://sharepoint/sites/mySiteCollection/Subsite"
$TempLoc = "C:TempLocation"  # disk location to temporararily store the file
}
$table = New-Object system.Data.DataTable "JoelFancyReport"  #this is an internal name, that is not very important to set
#let's define a field (column) for every possible field; even those we don't yet use in a given report
$col1 = New-Object system.Data.DataColumn Link,([string])
$col2 = New-Object system.Data.DataColumn BusinessProcess,([string])
$col3 = New-Object system.Data.DataColumn DocType,([string])
$col4 = New-Object system.Data.DataColumn Modified,([string])
$col5 = New-Object system.Data.DataColumn Title,([string])
$col6 = New-Object system.Data.DataColumn FileName,([string])
$col7 = New-Object system.Data.DataColumn Editor,([string])
$col8 = New-Object system.Data.DataColumn CheckedOut,([string])
$col9 = New-Object system.Data.DataColumn CheckedOutBy,([string])
$col10= New-Object system.Data.DataColumn FileFolder,([string])
$col11= New-Object system.Data.DataColumn FileLink,([string])
$col12= New-Object system.Data.DataColumn ReportAction,([string])
# Just add the columns to the table
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
$table.columns.add($col10)
$table.columns.add($col11)
$table.columns.add($col12)
#we can create multiple tables with the same layout easily. 
# Copy is shown here for reference, and is not used in this simple example
$TableCopy = $table.Copy() 
#loop through whatever SPItems need to be looped
# here is where the basic logic gets put, before generating the actual report
# this is just a placeholder loop of items in a list, but this could be anything, including SPQuery results
$JPWeb = Get-SPWeb $RefListLocation
$JPList = $JPWeb.Lists["MyList"]
$JPItems = $JPList.Items;
$JPItemCount = $JPItems.Count;
for ($k=0; $k -lt $JPItemCount; $k++)
{
    $SourceItem = $JPItems[$k];
    $ReportAction = "";  # reset the action text for each item
    if ($SourceItem.File.CheckOutStatus -ne "None")
    {
        $ReportAction += ", CheckedOut "+$SourceItem.File.CheckOutStatus+" to:"+$SourceItem.File.CheckedOutBy.DisplayName;
        $CheckedOut = "Yes"
        $CheckedOutBy = $SourceItem.File.CheckedOutBy.DisplayName;
    }
    else
    {
        $CheckedOut = "No"
        $CheckedOutBy = $null;
    }
    #let's create a new row
    $row = $Table.NewRow()
    $FileLink = $($JPWeb.Url)+'/'+$($SourceItem.Url);
    $row.FileLink = $FileLink;
    $row.Link = "<a href="+'"'+ $($FileLink) +'">'+ " Link " + "</a>";
    $FolderURL = $FileLink.Substring(0,$FileLink.LastIndexOf('/'));
    $row.FileFolder = $FolderURL;
    $row.Title = $SourceItem.Title;
    $row.FileName = $SourceItem.Name;
    $row.Editor = Strip-ClaimsHeader $SourceItem["Editor"];
    $row.Modified = $SourceItem["Modified"];
    $row.BusinessProcess = $SourceItem["BusinessProcess"] #sample field; add as many as you'd like
    $row.DocType = $SourceItem["DocType"]
    $row.CheckedOut = $CheckedOut;
    $row.CheckedOutBy = $CheckedOutBy;
    $row.ReportAction = $ReportAction;
    $Table.Rows.Add($row)
}
$RunStatus = "All the processing completed $(get-date)"
$Header = "Weekly report for my friends"
$emailTitle = "Weekly Report email $($RunStatus)"
# this is a nice table style, collapsed, colored header.  	
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
# Let's create the HTML table, specifying only the columns from the table that we'd like to see. Any other columns won't appear in this report.
# By piping it to the Convert-HTMLEscape function, it fixes up the HTML    
$MyOutput = $table| ConvertTo-Html Link, BusinessProcess, DocType, Editor, Modified, FileName, ReportAction -head $a -body "<H1>$($table.Rows.Count) $($Header)</H1>"  | Convert-HTMLEscape
# if we need to reuse table, just make a copy of it first.  
# While ConvertTo-Html offered the luxury of column selection, Export-Csv converts every darn column, so let's remove the extra ones now.
$table.Columns.Remove("Link") #how to remove a Column. links don't look too pretty within a CSV, so for an example I remove it here.
$f2 = Save-File $TempLoc $FileDescription $Table  #Saves the temp file to disk, driven from $Table; $FileDescription is assumed to hold a short report name
$Report2URL = upload-file $f2
$ReportSummary = "Rolled Report Available at: "+"<a href="+'"' +$Report2URL +'">'+ $FileDescription+" Custom Report " + "</a>"+"<p></p>";
#debug technique; it's easy to route the HTML to a file for examination, see the commented out line below:
#$MyOutput > C:\A.html
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = $emailTitle;
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.MyDomain.com"
$SmtpClient.host = $SmtpServer
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)

Customized scheduled SharePoint Search Alerts in HTML

Occasionally I get requests for customized notifications about documents within a SharePoint farm. Regular Alerts and SharePoint Search Alerts work great out of the box, but sometimes users want something more, such as:
– Complex criteria
– Custom sort sequence
– Custom metadata
– Customized notification frequency
– Custom message text, or subject line
– Refined layout

The solution I’ve used is to script a search query in PowerShell, load the results into a formatted HTML table, and schedule it at the desired frequency. I’ll outline a framework below that is easily adapted and extended. Note I wrote this for FAST, but using the SharePoint Search Query classes you can achieve similar results with regular SharePoint search.

First, let’s establish some basics about search and notification. For this solution, I only want to return documents that are up to two or three days old, so I grab the date, take two days off it, and put it into a format we can use later in the query:

$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD
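The same string can also be produced in one line with .NET date formatting, equivalent to the three lines above:

$MyDateStr = (Get-Date).AddDays(-2).ToString("yyyy-MM-dd")  # formatted YYYY-MM-DD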

Let’s now set up the search query. I chose to use FQL (FAST Query Language), but you can use Keyword Search. Note SQL Search is deprecated. I chose 50 results, but you can choose whatever amount you prefer:

$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default

Now let’s make sure the query returns the fields you want. These must be configured as Managed Properties, followed by a Full Crawl.

$vc.SelectProperties.Add("Company Name")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("Company ClaimNumber")
$vc.SelectProperties.Add("Company PolicyNumber")
$vc.SelectProperties.Add("Company Modified")
$vc.SelectProperties.Add("Company EffectiveYear")

Now let’s piece together the FQL query string. Note two strings ($q1 and $q2) are used to construct the query, which is put into $BigQ along with the date we formatted earlier. We’re looking for documents newer than two days ago, where a particular field, CompanyClaimDocumentType, equals a specific value (“Specific Value”). Then we execute the FQL:

$q1='and(filter(Company modified:range(datetime("'
$q2='"), max, from="GT")), filter(CompanyClaimDocumentType:equals("Specific Value")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()

Now let’s convert the search results into a DataTable to make them easy to shape into an HTML table for the outbound email alert notification. We’ll define the columns and load the values. One nice touch is to shape the Link column so a hyperlink is embedded in the table for easy user access to the documents. I also structure a special link using the DMF:// protocol supported by MacroView DMF:

$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
    $row = $table.NewRow()
    $row.Link = "<a href="+'"'+ $($rows[$i].Url) +'">'+ " Link " + "</a>";
    #$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
    $row.Title = $rows[$i].Title;
    $row.InsuredName = $rows[$i].CompanyName;
    $row.ClaimNumber = $rows[$i].CompanyClaimNumber;
    $row.PolicyNumber = $rows[$i].CompanyPolicyNumber;
    $row.EffectiveYear = $rows[$i].CompanyEffectiveYear;
    $row.FileName = $rows[$i].FileName;
    $row.Modified = $rows[$i]."Company Modified".Substring(0,10);
    $table.Rows.Add($row)
}

Now we want to shape this into an outbound table in the email. To do that we’ll use the handy ConvertTo-Html cmdlet. Just pass in the column names you want to appear; above I map more than we use below. I shape the borders and colors for a clean, professional look. However, we have a sticky problem: we don’t want it to mess with the embedded HTML, such as the brackets around the a href for the links. To solve that, I pipe the stream into a routine called Convert-HTMLEscape that I will outline shortly. Note the nice H1 header, and how we handle singular/plural row counts in the subject line and at the top of the email:

$a = ""
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + ""
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "&lt;H1&gt;Recent Custom Reports&lt;/H1&gt;" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table| ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Property Adjuster Report$($Plural)</H1>"  | Convert-HTMLEscape

Here’s the Convert-HTMLEscape function. It’s a little dense, but it works within the pipeline and does what we need, converting the escaped equivalents of the two common HTML characters back to the actual characters, basically undoing the bit of a mess made by ConvertTo-Html:

Function Convert-HTMLEscape {
# convert &lt; and &gt; to < and >. It is assumed that these will be in pairs
[cmdletbinding()]
Param (
    [Parameter(Position=0,ValueFromPipeline=$True)]
    [string[]]$Text
)
Process {
    foreach ($item in $Text) {
        if ($item -match "&lt;") {
            <#
            Replace codes with actual symbols. This line is a shortcut to do two replacements with one line of code.
            #>
            (($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
        }
        else {
            #otherwise just write the line to the pipeline
            $item
        }
    }
} #close process
} #close function

Now the easy part: let’s generate the email. Note the message sets IsBodyHtml to $true and is sent via SMTP:

#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = "Daily Report of updated custom Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.Company limited.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)

Now let’s put it all together. Note I always set an $env for the environment, to make it easy to test in Dev, before deploying in Production.

# SharePoint Search Alerts
Clear-Host
Write-Host "Start LC Alerts" -ForegroundColor darkblue
Add-PsSnapin Microsoft.SharePoint.PowerShell -erroraction silentlycontinue
$ToRecipients = "MyADGroup@MyDomain.com"
$CCRecipients = "joelplaut@MyDomain.com,Bozo@MyDomain.com"
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
$env="Prod"
if ($env -eq "Dev")
{
$webappurl = "http ://devFarm/" # Replace with URL of web application that you wish to warm up
$filterpath = "http ://devFarm/insureds"
}
else
{
$webappurl = "http ://SharePoint/" # Replace with URL of web application that you wish to warm up
$filterpath = "http ://SharePoint/insureds" #path for exercising previews
}
Function Convert-HTMLEscape {
<#
convert &lt; and &gt; to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
    [Parameter(Position=0,ValueFromPipeline=$True)]
    [string[]]$Text
)
Process {
    foreach ($item in $Text) {
        if ($item -match "&lt;") {
            <#
            replace codes with actual symbols
            This line is a shortcut to do two replacements
            with one line of code.
            #>
            (($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
        }
        else {
            #otherwise just write the line to the pipeline
            $item
        }
    }
} #close process
} #close function
$Mydate=get-date
$MyDate = $Mydate.AddDays(-2)
$MyDateStr = $Mydate.Year.ToString("0000") + "-" + $Mydate.Month.ToString("00")  + "-" + $Mydate.day.ToString("00")  #formatted YYYY-MM-DD
$site = New-Object Microsoft.SharePoint.SPSite $webappurl
$vc =New-Object Microsoft.Office.Server.Search.Query.KeywordQuery $site
$vc.ResultsProvider = [Microsoft.Office.Server.Search.Query.SearchProvider]::FASTSearch
$vc.ResultTypes = [Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults
#In my case I enabled the FQL syntax and set some other parameters:
$vc.EnableFQL = $true # enable FQL
$vc.RowLimit = 50 # sets the limit of results
$vc.StartRow = 0 # 0 is the default
$vc.SelectProperties.Add("Company Name")
$vc.SelectProperties.Add("URL")
$vc.SelectProperties.Add("Title")
$vc.SelectProperties.Add("Filename")
$vc.SelectProperties.Add("Company ClaimNumber")
$vc.SelectProperties.Add("Company PolicyNumber")
$vc.SelectProperties.Add("Company Modified")
$vc.SelectProperties.Add("Company EffectiveYear")
#Query / Result
$q1='and(filter(Company modified:range(datetime("'
$q2='"), max, from="GT")), filter(Company claimdocumenttype:equals("Property Adjuster Reports")))'
$BigQ=$q1+$MyDateStr+$q2
$vc.QueryText = $BigQ
$results = $vc.Execute()
#$results
$resultsTable = $results.Item([Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults)
$resultsDataTable = $resultsTable.Table
$rows = $resultsDataTable.Rows
$table = New-Object system.Data.DataTable "SearchReport"
$col1 = New-Object system.Data.DataColumn Title,([string])
$col2 = New-Object system.Data.DataColumn InsuredName,([string])
$col3 = New-Object system.Data.DataColumn ClaimNumber,([string])
$col4 = New-Object system.Data.DataColumn Link,([string])
$col5 = New-Object system.Data.DataColumn PolicyNumber,([string])
$col6 = New-Object system.Data.DataColumn Modified,([string])
$col7 = New-Object system.Data.DataColumn EffectiveYear,([string])
$col8 = New-Object system.Data.DataColumn FileName,([string])
$col9 = New-Object system.Data.DataColumn DMF,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
if ($rows.count -gt 0)
{
for ($i=0; $i -lt $rows.Count; $i++)
{
    $row = $table.NewRow()
    $row.Link = "<a href="+'"'+ $($rows[$i].Url) +'">'+ " Link " + "</a>";
    #$row.DMF = ($row.Link.Replace("http://","DMF://")).replace(" File Link", "SP Explorer")  Took out, doesn't appear to work quite right
    $row.Title = $rows[$i].Title;
    $row.InsuredName = $rows[$i].CompanyName;
    $row.ClaimNumber = $rows[$i].CompanyClaimNumber;
    $row.PolicyNumber = $rows[$i].CompanyPolicyNumber;
    $row.EffectiveYear = $rows[$i].CompanyEffectiveYear;
    $row.FileName = $rows[$i].FileName;
    $row.Modified = $rows[$i]."Company Modified".Substring(0,10);
    $table.Rows.Add($row)
}
$a = ""
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + ""
#Filename removed at My's suggestion
#$MyOutput = $rows | ConvertTo-Html Title, Author, URL, Link -body "&lt;H1&gt;Recent Custom Reports&lt;/H1&gt;" -PostContent "Goodbye and thanks for all the fish"
if ($rows.count -eq 1)
{
$Plural=$null;
}
else
{
$Plural="s";
}
$MyOutput = $table| ConvertTo-Html Title, Link, InsuredName, ClaimNumber, PolicyNumber, EffectiveYear, Modified -head $a -body "<H1>$($rows.count) Recent Custom Report$($Plural)</H1>"  | Convert-HTMLEscape
#$MyOutput &amp;gt; C:A.html #debug technique
#email setup, can move to top
#param( 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = "Daily Report of updated Property Adjuster Reports"
#[string] $Body = "body"
#)
$Body = $rows | ConvertTo-Html
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.Company limited.com"
$SmtpClient.host = $SmtpServer
#    $SmtpClient.Send($From,$To,$Title,$Body)
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.set_IsBodyHtml($true)
$smtpclient.Send($mailmessage)
} #don't bother, no new hits