Custom development solutions

Restore SharePoint document timestamp and author from feedfile

During maintenance, or when checking in a document on a user's behalf, an administrator often "stomps" on the document's timestamp and last-edited-by value. In a perfect world we take the time to restore the authorship and timestamp. Here's a script that reads a CSV of the URL, timestamp and user for any number of documents to correct. It will also try to remove the previous, incorrect version, where possible.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$actionlist = Import-Csv "C:\scripts\NameDateTag.csv"
for ($Ai = 0; $Ai -lt $actionlist.Count; $Ai++)
{
$ActionRow = $actionlist[$Ai]
$docurl = $ActionRow.DocURL
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$list = $item.ParentList
[System.DateTime] $dat = Get-Date $ActionRow.Timestamp
$usr = $web.EnsureUser($ActionRow.Editor)
$item["Modified"] = $dat
$item["Editor"] = $usr
$item.Update()
# Try to drop the version created by the incorrect edit
try { $item.Versions[1].Delete() }
catch { Write-Host -ForegroundColor Red "Error (1) could not delete old version of $($item['Name'])" }
$web.Dispose()
$site.Dispose()
}
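For reference, the feed file read by the script is a plain CSV whose column headers match the properties used above (DocURL, Timestamp, Editor). A minimal illustrative example; the URL and account are placeholders:

```csv
DocURL,Timestamp,Editor
http://sharepoint/sites/a/Shared Documents/report.docx,2014-03-01 09:15,DOMAIN\jsmith
```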

Gradual Site Collection Deletion

I had a mission to achieve overnight: move 28 site collections into a new managed path, and rename the 28 associated content databases. Pretty straightforward to do, and to script in advance to run in stages:

  1. Backup site collections
  2. Delete site collections
  3. Dismount content databases
  4. Rename content databases and shuffle around underlying storage including FILESTREAM RBS location
  5. Create the new managed path; reset IIS
  6. Remount 28 content DBs
  7. Delete the old managed path
  8. Restore the 28 site collections (this is where tinkling glass is heard, followed by painful silence)

After enough Jolt cola to get that turtle to beat the hare fair and square, I got to the bottom of the problem. First, the problem: the site collections could not be restored into their original databases, as each site collection purportedly already existed there, even though it had been deleted.

By midnight I gave up, and did the 28 Restore-SPSite into any random content databases, tossing organization and structure to the winds (temporarily), knowing I’ve got gads of available storage, knowing once I got to the bottom of the issue, a simple set of move-spsite commands would set things right.  No, I don’t like randomness, but I also like a few winks of sleep and happy users…

Now, the cause. Since SharePoint 2010 SP1 (and continuing in SharePoint 2013), SharePoint has had the ability to recover deleted site collections (not through the UI, only through PowerShell). I used the -GradualDelete option, thinking I would be nice and gentle with a production farm. Here's a sample of my delete commands, where I also disable prompting:

Remove-SPSite "http://sharepoint/div/clm/int/A/" -Confirm:$false -GradualDelete

Here’s the kicker. After the delete, the site collection is indeed still there. It actually sticks around for the Recycle Bin retention period (default 30 days). There’s one good way to see: let’s dive into the forbidden content database and have a peek:

SELECT [DeletionTime]
,[Id]
,[SiteId]
,[InDeletion]
,[Restorable]
FROM [Content_mydivision_a].[dbo].[SiteDeletion]
where (Restorable=1)

Restorable=1 indicates this site collection could be restored.

The solution? Well, it’s not the Recycle Bin job; that has no effect on this. There is a Gradual Delete job at the web application level, but that won’t help us either, at least not just yet. First you have to use the Remove-SPDeletedSite cmdlet to remove each site permanently. Here’s the syntax:

Remove-SPDeletedSite -Identity f5f7639d-536f-4f76-8f94-57834d177a99 -Confirm:$false

Ah, you don’t know your Site Collection GUIDs by heart? Well, me neither; I prefer a more useful allocation of brain cells, so here’s the command that will give you the Site Collection GUIDs that have been (partially) deleted:

Get-SPDeletedSite -WebApplication "http://sharepoint/"

So, you’ve got your partially deleted GUIDs, you diligently did a Remove-SPDeletedSite for each, but Restore-SPSite still will not work. Now’s the time to run the handy-dandy Gradual Delete timer job for your web application, in Central Admin under Monitoring. The first thing you might notice is that the job takes a while to run. That’s good: it’s doing something for you, triggering the content database stored procedure proc_DeleteSiteCoreAsync, which deletes in batches.
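If you prefer to stay in PowerShell rather than click through Central Admin, the same timer job can be started from the console. This is a sketch; the internal job name ("job-site-deletion") is my assumption, so verify it with Get-SPTimerJob on your farm:

```powershell
# Find and start the Gradual Site Delete job for the web application.
# The job name "job-site-deletion" is an assumption; confirm with Get-SPTimerJob.
$wa = Get-SPWebApplication "http://sharepoint/"
$job = Get-SPTimerJob -WebApplication $wa | Where-Object { $_.Name -eq "job-site-deletion" }
$job | Start-SPTimerJob
```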

Here’s how to wipe out all these mildly annoying site collections from your recycle bin for a web application:

get-spdeletedsite -webapplication "http://my-srv-sp10/" | Remove-SPDeletedSite

At this point your Restore-SPSite will work to your target content database, and if you played SharePoint Roulette like me and restored to a random location, a Move-SPSite will make fast work of putting things where they should be.
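That final shuffle back into the intended databases is a one-liner per site collection; the database name below is a placeholder for your real target:

```powershell
# Move a site collection restored to a random database into its proper home.
# Both databases must be attached to the same SQL Server instance.
Move-SPSite "http://sharepoint/div/clm/int/A/" -DestinationDatabase "Content_mydivision_a" -Confirm:$false
```

An IISRESET is recommended after moving site collections so the front ends pick up the new database mapping.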

More information on the Gradual Deletion timer job can be found in this TechNet article by Bill Baer.

SharePoint Group Management

Managing SharePoint Groups in PowerShell

SharePoint Groups are a great mechanism for managing user permissions; however, each group exists within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer to use a CSV-fed approach to manage groups and users. I create a CSV with the name of each group and its users, which I list in pipe-separated format (commas are already in use as the CSV delimiter). To read in a CSV, use:

Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
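The mapping file itself is one row per SharePoint group. The column names below (SharePointGroup, ADGroup) are the ones the full script expects, with multiple users pipe-separated; the values are illustrative:

```csv
SharePointGroup,ADGroup
Finance Members,DOMAIN\fin-users|DOMAIN\fin-mgrs
HR Members,DOMAIN\hr-users
```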

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
$GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a null-op, throwing no errors.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the whole site collection. This wipes the user out of every group in the site collection, not just one group, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.Update()
}
catch
{
Write-Host "Failed to remove $($MyUser) from all users in $($Site.Url)"
}

Here’s the full script, with flags to select the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$ADMap= Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups from a pipe-separated list
$GrantGroupsRead = $true; # grants Read at the top rootweb level
$RemoveMembers = $false; # optionally removes a pipe-separated list of users from the associated group
$WipeMembers = $false;	# wipes the groups clean
$WipeUsersOutOfSite = $false;  # the nuclear option. Useful to eliminate AD groups used directly as groups
#we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
$MyMap[$ADMap[$i].SharePointGroup] = $ADMap[$i].ADGroup;
}
# Script changes the letter heading for each site collection
$envrun="Dev"			# selects environment to run in
if ($envrun -eq "Dev")
{
$siteUrl = "http://DevServer/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://sharepoint/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
Write-Host "script starting" 
$myheader = "STARTING: $(get-date)"
foreach ($letter in $LoopStringArr)
{
$SiteName=$siteurl+$letter
$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;
for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
{
$SPGroupName = $ADMap[$ADi].SharePointGroup;
if ($AddGroups)
{
if (!$Groups[$SPGroupName]) #no exist, so create
{
try
{
$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group members")
}
catch
{
Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
}
}
else
{
Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
}
} #endif Add Groups
if ($GrantGroupsRead)
{
$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
} #if the group exists in the first place
} #ActionFlagTrue
if ($AddMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToAdd = $ADMap[$ADi].ADGroup;
if ($usersToAdd.length -gt 0) #if no users to add, skip
{
$usersToAddArr = $usersToAdd.split("|")
foreach ($userName in $usersToAddArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot add user $($userName) to $($GroupToAddTo)"
}
}
} #users to add
} #if the group exists in the first place
} #ActionFlagTrue
if ($RemoveMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToRemove = $ADMap[$ADi].ADGroup;
if ($usersToRemove.length -gt 0) #if no users to remove, skip
{
$usersToRemoveArr = $usersToRemove.split("|")
foreach ($userName in $usersToRemoveArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
}
}
} #users to remove
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeMembers)  #Nukes all users in the group
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
foreach ($user in @($GroupToAddTo.Users))  #iterate a copy, since we modify the collection
{
try
{
$GroupToAddTo.RemoveUser($user)
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot remove user $($user) from $($GroupToAddTo)"
}
}
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeUsersOutOfSite)  #Nukes all users in the group
{
$usersToNuke = $ADMap[$ADi].ADGroup;
if ($usersToNuke.length -gt 0) #if no users to add, skip
{
$usersToNukeArr = $usersToNuke.split("|")
foreach ($MyUser in $usersToNukeArr)
{
try
{
try
{
$user1 = $RootWeb.EnsureUser($MyUser)
}
catch
{
Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
}
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.update()
}
catch
{
Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
}
}
catch
{
Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
}
} #if user is not null
} #foreach user to nuke
} #ActionFlagTrue
}
$rootWeb.dispose()
$site.dispose()
} #foreach site

Incrementing repeating table field values

How to increment a field for new rows added into a repeating table in InfoPath

It is really helpful to users to incrementally auto-populate newly added rows. In InfoPath, XPath is used to specify calculated values. To count the number of rows that exist, we can use the following expression:

count(../preceding-sibling::*)

This returns the 0 based count of existing rows. First row would be zero. To get a one-based count, simply use:

count(../preceding-sibling::*)+1

What if you want to have letters that increase? One way is to use the XPath translate() function, such as:

translate(count(../preceding-sibling::*), "012345", "ABCDEF")
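Two caveats with translate(): as written it only maps the digits 0–5, and multi-digit counts are translated digit by digit, so row 11 (count 10) yields "BA" rather than "K". For a true A–Z sequence of up to 26 rows, an alternative is to index into an alphabet string with substring():

```
substring("ABCDEFGHIJKLMNOPQRSTUVWXYZ", count(../preceding-sibling::*) + 1, 1)
```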

Uploading attachments to tasks in SharePoint

Uploading attachments to tasks in SharePoint programmatically

Uploading attachments to tasks in SharePoint programmatically can be done if the documents are stored in a structured way. The script described below assumes the documents are stored in folders that are named using the ID of the task in SharePoint.

For how to download attachments from a task list, please see: Downloading attachments from Task list

First, we get all files and folders, then we use the folder name to do a CAML Query lookup to get the Task by ID, then we binary upload the attachments.

$Web = Get-SPWeb "http://yourserver/yourweb"  # SPWeb site location as URL
$fileDirectory = "D:\PROD";    # location holding the attachments
$spList = $web.lists["Tasks"]  #replace with the name of your task list
foreach($folder in Get-ChildItem $fileDirectory)
{
$folderID = $folder.Name
$spQuery = New-Object Microsoft.SharePoint.SPQuery;
$camlQuery = "<Where><Eq><FieldRef Name='TaskID'/><Value Type='Number'>$folderID</Value></Eq></Where>"
$spQuery.Query = $camlQuery
$processListItems = $spList.GetItems($spQuery)                                                                                                                             
$item = $processListItems[0];
$folderPath = $fileDirectory+"\"+$folder
foreach($file in Get-ChildItem $folderPath )
{
#$fileStream = ([System.IO.FileInfo] (Get-Item $File.FullName)).OpenRead()
$bytes = [System.IO.File]::ReadAllBytes($File.FullName)
$item.Attachments.Add([System.IO.Path]::GetFileName($File.FullName), $bytes)
$item.Update()
write-host "File uploaded: $($File.FullName) -> task $($item.ID)"
}
}
$web.Dispose()

How to download all attachments for all tasks in a list

Downloading all attachments for a SharePoint task list
Tasks can have attachments, in fact they can have multiple attachments. However these are stored in an “AttachmentCollection”. We can iterate through all items in the task list to download all attachments. What we do is create a folder for each of the items, and name the folder by the ID of the task.

$webUrl = "http://yourserver/yourweb"  # this is the URL of the SPWeb
$library = "Compliance Tasks"  # this is the SPList display name
$tempLocation = "D:\PROD"      # Local Folder to dump files
$s = new-object Microsoft.SharePoint.SPSite($webUrl)   
$w = $s.OpenWeb()        
$l = $w.Lists[$library]   
foreach ($listItem in $l.Items)
{
Write-Host "    Content: " $listItem.ID
$destinationfolder = $tempLocation + "\" + $listItem.ID         
if($listItem.Attachments.Count -gt 0)
{
if (!(Test-Path -path $destinationfolder))       
{           
$dest = New-Item $destinationfolder -type directory         
}
foreach ($attachment in $listItem.Attachments)   
{       
$file = $w.GetFile($listItem.Attachments.UrlPrefix + $attachment)       
$bytes = $file.OpenBinary()               
$path = $destinationfolder + "\" + $attachment
Write "Saving $path"
$fs = new-object System.IO.FileStream($path, "OpenOrCreate")
$fs.Write($bytes, 0 , $bytes.Length)   
$fs.Close()   
}
}
}
$w.Dispose()
$s.Dispose()

A folder for each task was created to allow for multiple attachments. The ID was applied to each folder, to allow a subsequent script to traverse and upload the attachments by ID, or for any linkage preservation.

For how to upload attachments from a task list, please see: Uploading attachments to tasks

Solution for SharePoint incompatibility with Chrome

Solution for SharePoint incompatibility with Chrome

There have been a number of issues relating to how SharePoint behaves within Chrome. Most relate to legacy applications, including Silverlight and ActiveX controls.
One significant issue is that Chrome won’t open InfoPath Filler forms in InfoPath Filler, and instead tries to open them in the browser-based form.

A solution to these can be found in this 3rd party Chrome plug-in: IE Tab

It does have a per-user cost, but it enables IE to be activated from within Chrome. One simply specifies the URL string patterns that should be opened in IE.

Setting a Site Collection to not be read-only

How to set a site collection as not read-only

Is your site collection read-only?

It is critical to be able to set a site collection to not be read-only. This situation can occur if a site backup is interrupted, as an SPSite is temporarily made read-only during backup.

$site=Get-SPSite "http://sharepoint/managedpath/sitename"
$site.set_ReadOnly($false)

To turn it back to read-only:

$site=Get-SPSite "http://sharepoint/managedpath/sitename"
$site.set_ReadOnly($true)
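To check the current state before or after toggling, the same SPSite object exposes the flag directly:

```powershell
$site = Get-SPSite "http://sharepoint/managedpath/sitename"
$site.ReadOnly    # returns True or False
$site.Dispose()
```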

SSRS Deployment Automation

SQL Server reporting comes to life when SSRS (SQL Server Reporting Services) runs in SharePoint integrated mode

Report deployment is a manual process from SQL Server Data Tools within Visual Studio.  If a report has to go to multiple locations, it can be a hassle to track all the locations and ensure proper deployment and updates.

Companies often have a wide range of SSDT (SQL Server Data Tools) solutions that contain reports. Today, these reports are deployed to SSRS running in Native Mode. We have developed a deployment solution that enables a single-stage deployment of reports to SharePoint, to the correct and secure location per report. This tool facilitates report deployments to multiple locations across multiple Visual Studio solutions. It uses a single database table that contains the full mapping of Visual Studio reports to all the SharePoint locations to which each report may be published.

The report deployment tool is able to run from any location that can connect to both the destination SharePoint and the SQL Server housing the mapping table. The tool requires credentials to be able to publish to SharePoint.

The report deployment tool can be used to deploy and manage report publishing from multiple Visual Studio solutions to multiple SharePoint site collections:
1. Automate SharePoint Location Inventory
2. Automated inventory of all SSRS Visual Studio reports across solutions
3. Each SharePoint location can have a description associated, as can each report for publishing
4. Automatically create libraries and create the views, metadata and add the content types for as many libraries as needed
5. Each report can be published to more than one location. Reports can be mapped in bulk
6. Update all reports in SharePoint
7. Reports in SharePoint need to be rebound to data sources when published. The rebinding can be automatically done for you
8. Define the sequence that reports appear in SharePoint
9. Category and Title fields are published automatically to SharePoint
10. Creation of libraries and folders fully configured for reports can be done visually within the tool
11. Retract or delete a report from all associated locations
The tool defaults to not overwriting data sources. This is to avoid overwriting credentials entered for use by the data source. A checkbox is available to force the overwriting of data sources.
