Generating automatic emails with embedded reports and link to uploaded CSV


There’s often a need to automate the generation of reports on SharePoint. I typically need to generate such a report based on:
1. Content via search: acquired via a Search Query
2. Content via manual crawl: walking through libraries, sites, site collections, and farms
3. Content derived from scripted actions, such as moving or copying documents on an automated basis

Other examples I’ve built include reports on checked-out documents, custom-delivered to the people who have had documents checked out for 30+ days.

There are a few parts to the script below:
1. Acquire the dataset for the report. In this example I walk through all SPItems in a given list.
2. Define how the email gets sent, and to whom, leveraging HTML email
3. Generate the HTML for this report
4. Create a temporary CSV of this report, and upload it to a SharePoint library
5. Send the email, ensuring it contains the summary report and a link to the uploaded CSV

To store the dataset, I use the System.Data.DataTable object and build from there. It’s easy to copy a DataTable, or drop columns after the HTML is constructed.
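A minimal sketch of that pattern (the table and column names here are just illustrative):

```powershell
# Build a small DataTable, copy it, then drop a column from the original.
$dt = New-Object System.Data.DataTable "Demo"
[void]$dt.Columns.Add("FileName", [string])
[void]$dt.Columns.Add("Link", [string])
$row = $dt.NewRow()
$row.FileName = "Report.docx"
$row.Link = "http://example/Report.docx"
$dt.Rows.Add($row)

$copy = $dt.Copy()          # full copy: structure plus data
$dt.Columns.Remove("Link")  # drop a column once it's no longer needed
```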

I use the ConvertTo-Html cmdlet to convert the DataTable, explicitly referencing the fields and the order I’d like in the HTML table. However, I have to fix up the escaped markup within it. For that I pipeline it through a function called Convert-HTMLEscape.
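To see why the fix-up is needed: ConvertTo-Html encodes any markup stored in the data, so a pre-built anchor tag arrives as escaped text rather than a live link. A quick illustration (the unescaping line mirrors what Convert-HTMLEscape in the script does):

```powershell
# A cell value that is itself HTML gets entity-encoded by ConvertTo-Html
$row = [pscustomobject]@{ Link = '<a href="http://example"> Link </a>' }
$html = ($row | ConvertTo-Html -Fragment) -join "`n"
# $html now contains "&lt;a href=..." instead of a working anchor tag
$fixed = $html.Replace('&lt;', '<').Replace('&gt;', '>').Replace('&quot;', '"')
```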

I prefer to tune up the DataTable and drop any unnecessary fields, as ConvertTo-CSV does not allow fields to be specified. I also specify -NoTypeInformation so the first row is the actual field names, allowing it to open correctly in Excel.
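For reference, here is the difference -NoTypeInformation makes (the object shown is illustrative):

```powershell
$item = [pscustomobject]@{ FileName = 'Report.docx'; Modified = '01/01/2024' }
# Without -NoTypeInformation, Windows PowerShell prepends a '#TYPE ...' line,
# which confuses Excel; with it, the first row is the actual field names.
$csv = $item | ConvertTo-Csv -NoTypeInformation
# $csv[0] is the header row, $csv[1] the data row
```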

Note how the URLs are prettied up on the fly, so they read nicely in the HTML table.

Claims Authentication has a different account format, which I clean up in a separate Function Strip-ClaimsHeader.
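The function (shown in full in the script below) simply keeps everything after the first '#'. The sample values here are illustrative; the exact account format varies by farm and field type:

```powershell
function Strip-ClaimsHeader ($s)
{
    if ($s.IndexOf('#') -gt 0) { return $s.Substring($s.IndexOf('#') + 1) }
    return $s
}
Strip-ClaimsHeader '42;#CONTOSO\jdoe'  # returns CONTOSO\jdoe
Strip-ClaimsHeader 'CONTOSO\jdoe'      # no '#', returned unchanged
```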

Whether a file is checked out, the type of checkout, and who checked it out are some optional attributes I capture in the DataTable. This table is easily extended to support whichever metadata is desired.

Here’s the script:

if(!(Get-PSSnapin Microsoft.SharePoint.PowerShell -ea 0)) 
{ 
Write-Progress -Activity "Loading Modules" -Status "Loading Microsoft.SharePoint.PowerShell" 
Add-PSSnapin Microsoft.SharePoint.PowerShell 
} 
$ToRecipients = "joelplaut@MyDomain.com"
$CCRecipients = $ToRecipients; #set to any audience you want, comma separated
$ToRecipientsArray = $ToRecipients.Split(",");
$CCRecipientsArray = $CCRecipients.Split(",");
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
function Save-File([string] $Dir, [string] $PartialName, [system.Data.DataTable] $MyTable)
{
#writes the file to a temporary disk-based location.  I add a timestamp to the name to ensure uniqueness.  $PartialName is a text description used to start the filename
# Export-Csv does all the heavy lifting.  Note the "-NoTypeInformation" switch is needed so a type header doesn't appear above the field headers
$timestamp = Get-Date
$timestampstr = $timestamp.Month.ToString("00") + "_" + $timestamp.Day.ToString("00") + "_" + $timestamp.Year.ToString("0000") + " " + $timestamp.Hour.ToString("00") + "_" + $timestamp.Minute.ToString("00") + "_" + $timestamp.Second.ToString("00") 
$FileName = $Dir + '\' + $PartialName + '_' + $timestampstr + '.CSV';
$MyTable.Rows | Export-Csv -Path $FileName -NoTypeInformation
return $FileName;
}
#writes the file to the Reporting Library within SharePoint; the timestamped name ensures uniqueness
function Upload-File([string] $FilePath)
{
$Files = $RepList.rootfolder.Files
$DocLibName = $RepList.Title; #this is not necessarily accurate 
#Get File Name from Path
$FileName = $FilePath.Substring($FilePath.LastIndexOf('\') + 1)
#delete the file from the library if it already exists.  This should never occur
try
{
if($Files.Item($DocLibName +"/" + $FileName))
{
$Files.delete($DocLibName +"/" + $FileName)
}
} catch {}
#Get the File
$File= Get-ChildItem $FilePath
#Add File to the collection
$x=$Files.Add($DocLibName +"/" + $FileName,$File.OpenRead(),$false)
$ReportFilename = $RepList.ParentWeb.Url + '/'+ $DocLibName + '/' + $Filename;
return $reportFileName;
}
# If we are running in a Claims Authentication environment, we can strip off the claims tags
Function Strip-ClaimsHeader ($s)
{
if ($s.IndexOf('#') -gt 0)  #handle stripping claims tags off name
{
return $s.Substring($s.IndexOf("#")+1)
}
else
{
return $s
}
}
#used for HTML processing and fixup; this is a bit dense, but it supports the PowerShell pipeline, and restores XML tags
Function Convert-HTMLEscape {
<#
convert "&lt;" and "&gt;" to < and >
It is assumed that these will be in pairs
#>
[cmdletbinding()]
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
Process {
foreach ($item in $text) {
if ($item -match "&lt;") {
<#
replace escape codes with the actual symbols. This is a shortcut that chains the
Replace() calls in one line of code: each Replace() returns the revised string, and
wrapping it in parentheses tells PowerShell to treat it as an object so the next
Replace() can be called on the result.
#>
(($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
}
else {
#otherwise just write the line to the pipeline
$item
}
}
} #close process
} #close function
$env = "Prod"  # I like setting flags for changing environment settings
if ($env -eq "Prod")
{
$RefListLocation = "http://sharepoint/sites/mySiteCollection/Subsite"
$TempLoc = "C:\TempLocation"  # disk location to temporarily store the file
}
$table = New-Object System.Data.DataTable "JoelFancyReport"  #this is an internal name that is not very important to set
#let's define a field (column) for every possible field; even those we don't yet use in a given report
$col1 = New-Object system.Data.DataColumn Link,([string])
$col2 = New-Object system.Data.DataColumn BusinessProcess,([string])
$col3 = New-Object system.Data.DataColumn DocType,([string])
$col4 = New-Object system.Data.DataColumn Modified,([string])
$col5 = New-Object system.Data.DataColumn Title,([string])
$col6 = New-Object system.Data.DataColumn FileName,([string])
$col7 = New-Object system.Data.DataColumn Editor,([string])
$col8 = New-Object system.Data.DataColumn CheckedOut,([string])
$col9 = New-Object system.Data.DataColumn CheckedOutBy,([string])
$col10= New-Object system.Data.DataColumn FileFolder,([string])
$col11= New-Object system.Data.DataColumn FileLink,([string])
$col12= New-Object system.Data.DataColumn ReportAction,([string])
# Just add the columns to the table
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
$table.columns.add($col7)
$table.columns.add($col8)
$table.columns.add($col9)
$table.columns.add($col10)
$table.columns.add($col11)
$table.columns.add($col12)
#we can create multiple tables with the same layout easily. 
# Copy is shown here for reference, and is not used in this simple example
$TableCopy = $table.Copy() 
#loop through whatever SPItems need to be looped
# here is where the basic logic gets put, before generating the actual report
# this is just a placeholder loop of items in a list, but this could be anything, including SPQuery results
$JPWeb = Get-SPWeb $RefListLocation
$JPList = $JPWeb.Lists["MyList"]
$RepList = $JPWeb.Lists["Reports"]  #reporting library used by Upload-File; substitute your own library name
$FileDescription = "WeeklyReport"   #text description used to build the CSV filename
$JPItems = $JPList.Items;
$JPItemCount = $JPItems.Count;
for ($k=0; $k -lt $JPItemCount; $k++)
{
$SourceItem = $JPItems[$k];
$ReportAction = "";  #reset per item
if ($SourceItem.File.CheckOutStatus -ne "None")
{
$ReportAction += ", CheckedOut "+$SourceItem.File.CheckOutStatus+" to:"+$SourceItem.File.CheckedOutBy.displayname;
$CheckedOut = "Yes"
$CheckedOutBy = $SourceItem.File.CheckedOutBy.displayname;
}
else
{
$CheckedOut = "No"
$CheckedOutBy = $null;
}
#let's create a new row	
$row = $Table.NewRow()
$FileLink = $JPWeb.Url + '/' + $SourceItem.Url;
$row.FileLink = $FileLink;
$row.Link = "<a href="+'"'+ $($FileLink) +'">'+ " Link " + "</a>";
$FolderURL = $FileLink.substring(0,$FileLink.LastIndexOf('/'));
$row.FileFolder = $FolderURL;
$row.Title = $SourceItem.Title;
$row.FileName = $SourceItem.Name;
$row.Editor = Strip-ClaimsHeader $SourceItem["Editor"];
$row.Modified = $SourceItem["Modified"];
$row.BusinessProcess = $SourceItem["BusinessProcess"] #sample field; add as many as you'd like
$row.DocType = $SourceItem["DocType"]
$row.ReportAction = $ReportAction;
$row.CheckedOut = $CheckedOut;
$row.CheckedOutBy = $CheckedOutBy;
$Table.Rows.Add($row)
}
$RunStatus = "All the processing completed $(get-date)"
$Header = "Weekly report for my friends"
$emailTitle = "Weekly Report email $($RunStatus)"
# this is a nice table style, collapsed, colored header.  	
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + "</style>"
# Let's create the HTML table, specifying only the columns from the table that we'd like to see.  Any other columns won't appear in this report 
# By piping it to the Convert-HTMLEscape function, it fixes up the HTML    
$MyOutput = $table | ConvertTo-Html Link, BusinessProcess, DocType, Editor, Modified, FileName, ReportAction -head $a -body "<H1>$($table.Rows.Count) $($Header)</H1>" | Convert-HTMLEscape
# if we need to reuse table, just make a copy of it first.  
# While ConvertTo-Html offered the luxury of column selection, ConvertTo-Csv converts every darn column, so let's remove the extra ones now.
$table.Columns.Remove("Link") #how to remove a Column. links don't look too pretty within a CSV, so for an example I remove it here.
$f2 = save-file $TempLoc ($FileDescription) $Table  #Saves the temp file to disk, driven out of the $Table
$Report2URL = upload-file $f2
$ReportSummary = "Rolled Report Available at: "+"<a href="+'"' +$Report2URL +'">'+ $FileDescription+" Custom Report " + "</a>"+"<p></p>";
#debug technique; it's easy to route the HTML to a file for examination, see the commented out line below:
#$MyOutput > C:\A.html 
[string] $From = "SharePointSupport@MyDomain.com"
[string] $To = $null; #$Recipients #"joelplaut@MyDomain.com"
[string] $Title = $emailTitle;
$SmtpClient = New-Object System.Net.Mail.SmtpClient
$SmtpServer = "mail.MyDomain.com"
$SmtpClient.host = $SmtpServer
$MailMessage = New-Object system.net.mail.mailmessage
$mailmessage.from = $From;
foreach ($recip in $ToRecipientsArray)
{
$mailmessage.To.add($recip)
}
foreach ($recip in $CCRecipientsArray)
{
$mailmessage.CC.add($recip)
}
$mailmessage.Subject = $Title
$mailmessage.Body = $myoutput #"Body"
$MailMessage.IsBodyHtml = $true
$smtpclient.Send($mailmessage)