Cultural challenges in the Hybrid sourcing model

Overcoming cultural challenges in the Hybrid sourcing model

The focus of this article is how team members within offshore organizations can be more effective working with the USA within a hybrid model. Without active efforts to address the cultural differences, the result is eroded USA client satisfaction and general frustration.

While the cultural differences in a fully offshore model are commonly understood, the differences can be more stark, and even destructive, in the hybrid model, where staff are present both onshore and offshore.

It is understood that avoiding cultural miscues at times requires going against instinct and doing what feels unnatural. However, success depends on closing the cultural gap, so it is a goal worthy of the effort.

In this article I try to offer specific guidance for the Indian audience that can be put to immediate use to improve effectiveness when working in the USA.

Punctuality

In the USA, being late for a meeting can be fatal to a relationship. People are expected to arrive not just on time, but preferably slightly early: partly to accommodate unanticipated delays, and partly to get through security and be fresh and ready for the meeting. In the USA, meetings can run over, but they generally start on time.

Language

While Indian written and spoken English is generally excellent, Americans often have a hard time parsing English spoken by Indians. This seems to be a greater problem over the phone. Note that the human ear adjusts to accents, so over time this issue diminishes. For example, many American companies use Philippine-based offshore call centers, as that accent is considered easier for Americans to understand.
Lastly, some expressions are unknown in the West, cause puzzlement, and should be avoided in conversation. These include:
  • “Kindly Revert”
    Try instead “Please get back to me”. In general, outside of mathematics and programming, “revert” is rarely used in spoken American English.
  • “Do the needful”
    Try instead “Please do ‘x’”
  • “Discuss about”
    Try instead “Can we discuss it?”
  • “Veg”
    This abbreviation is unknown in the West. Instead say “Vegetarian” or “Non-Vegetarian”
  • “Holiday”
    Try instead “Vacation”. In American English, “holiday” refers to a public holiday such as Thanksgiving, not to personal time off.
  • “Rest is fine”
    Instead, use the phrase “All the rest is fine”

Suggestions for success:

  • Visit clients in person when possible
  • Use Skype over telephone, as that connection is often clearer
  • Avoid the Indian-English phrases listed above
  • Add “Please” and “Thank you” to verbal and written correspondence

Being direct

The nature of the client/service-provider relationship puts the USA client in the driver’s seat. The perception in India is that the client is generally right, and that it would be rude to question the client. In the USA, however, the expectation is that the service provider will challenge guidance and offer insights, alternatives, possible improvements, and even critique of the guidance. The problem becomes more subtle and insidious when delivering bad news. There are two general approaches to managing the delivery of bad news in the USA:

  1. Try to deliver all the bad news at once
    Better to declare a delivery date slippage of a full week, than to slip a day each day of the week.
  2. Deliver the news as early as possible
    Using the scheduling delay issue above, a top British architect once told me “If a project is declared late on the delivery date, I fire the staffer for one of two reasons: Either he is incompetent and did not know it was late until the last day, or he hid the information from me, which is dishonest”
    In short, it’s almost always better to be up-front. Blunt advice is considered especially refreshing in the USA, and is a key factor in promotions, even when the communication skips reporting layers.

Facing things head-on

Similar to being direct: in the USA, avoiding something uncomfortable is frowned upon, and even downright confusing to Americans. If you cannot do what is asked, it is expected that you will push back directly and say you can’t. The best advice is to offer alternatives rather than a blanket refusal.
How to succeed: If you can’t do something, offer guidance on who can, or a date when you can, or an alternative approach that you can deliver on.

Initiative

In the USA, guidance is often general, indirect, and not specific. In contrast, the expectation of Indian staff is to be given precise directives. USA staff are used to giving vague, summary guidance and only general goals.

This cultural gap leads to stress for Indian staff and dissatisfaction on the part of USA clients.

This cultural gap feeds the perception in the USA that Indian staff do not show initiative. This perception is grounded in experience, because USA organizations expect staff to apply common sense, promote ideas and alternatives, and make unrequested improvements.

In the USA there is an expression, “The Customer is King”. So if the client asks for something small (and doable), the common expected response is “Yes”, or a qualified yes with a promise to get back to them. It is less acceptable to say one needs to climb the management chain for approval; staff are assumed to take responsibility. Certainly this has to be balanced against managing the scope and costs caused by such commitments.

Email confirmation

Americans expect confirmation when they send an email. This may sound simple and even petty, but it has been an issue, especially exacerbated by the time zone differences. Emails are discussions in America, and Americans expect confirmation, or a reply with questions, ideas, or suggestions.

I’ve seen this simple expectation result in significant consternation when an American manager emails guidance, and doesn’t get questions or acknowledgement back in response.

Americans are generally happy to coordinate and navigate even complex business issues in writing.

Suggestions for success:

  • Acknowledge receipt of guidance along with stating that it was understood
  • Ask any questions or clarifications in response
  • Feel free to suggest improvements or enhancements or even critique

Summary

The cultural differences encountered in business can be a cause for delight, or a source of frustration and even failed business relationships. Closing the cultural gap starts with an understanding of the differences, and an effort to change behavior to reduce their impact.

Restore SharePoint document timestamp and author from feedfile

Often an administrator, during maintenance or while checking in a document for a user, “stomps” on the timestamp and the record of who edited the document. In a perfect world we take the time to restore the authorship and timestamp. Here’s a script that reads in a CSV of the URL, timestamp, and user for any number of documents to correct. It will also try to remove the previous, incorrect version, if possible.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$actionlist = Import-Csv "C:\scripts\NameDateTag.csv"   # columns: DocURL, Timestamp, Editor
for ($Ai=0; $Ai -lt $actionlist.Count; $Ai++)
{
$ActionRow = $actionlist[$Ai]
$docurl = $ActionRow.DocURL;
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$list = $item.ParentList
# restore the original timestamp and editor from the feedfile
[System.DateTime] $dat = Get-Date $ActionRow.Timestamp
$usr = $web.EnsureUser($ActionRow.Editor)
$item["Modified"] = $dat;
$item["Editor"] = $usr;
$item.Update()
# best effort: remove the version created by the administrator's edit
try { $item.Versions[1].Delete() } catch {Write-Host -ForegroundColor Red "Error (1) could not delete old version of $($item['Name'])"}
$web.Dispose()
$site.Dispose()
}

Gradual Site Collection Deletion

I had a mission to achieve overnight: move 28 site collections into a new managed path, and rename the 28 associated content databases. Pretty straightforward to do, and to script in advance to run in stages (a sketch of the key cmdlets follows the list):

  1. Backup site collections
  2. Delete site collections
  3. Dismount content databases
  4. Rename content databases and shuffle around underlying storage including FILESTREAM RBS location
  5. Create the new managed path; reset of IIS
  6. Remount 28 content DBs
  7. Delete the old managed path
  8. Restore the 28 site collections (this is where tinkling glass is heard, followed by painful silence)
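
For reference, here is a sketch of the key cmdlets behind these stages. The site URL and database name follow the examples later in this post; the backup path is a placeholder, and I am assuming a wildcard managed path, so adjust to your own structure:

# 1-2: back up, then (gradually) delete each site collection
Backup-SPSite "http://sharepoint/div/clm/int/A" -Path "E:\Backups\A.bak"
Remove-SPSite "http://sharepoint/div/clm/int/A" -Confirm:$false -GradualDelete
# 3-4: dismount the content database, then rename it and shuffle storage in SQL Server
Dismount-SPContentDatabase "Content_mydivision_a"
# 5: create the new managed path, then reset IIS
New-SPManagedPath "div/clm/int" -WebApplication "http://sharepoint/"
iisreset
# 6: remount the (renamed) content database
Mount-SPContentDatabase "Content_mydivision_a" -WebApplication "http://sharepoint/"
# 8: restore each site collection into its intended database
Restore-SPSite "http://sharepoint/div/clm/int/A" -Path "E:\Backups\A.bak" -ContentDatabase "Content_mydivision_a"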

After enough Jolt Cola to get that turtle to beat the hare fair and square, I got to the bottom of the problem. First, the problem: the site collections could not be restored into their original database, as the site collection purportedly already existed there, even though it had been deleted.

By midnight I gave up and did the 28 Restore-SPSite operations into random content databases, tossing organization and structure to the winds (temporarily), knowing I had gads of available storage and that once I got to the bottom of the issue, a simple set of Move-SPSite commands would set things right. No, I don’t like randomness, but I also like a few winks of sleep and happy users…

Now the cause.  Since SharePoint 2010 SP1 (and continuing in SP2013), SharePoint has had the ability to recover deleted site collections (not through the UI, but only through PowerShell).  I used the -GradualDelete option, thinking I would be nice and gentle with a production farm.  Here’s a sample of my delete commands, where I also disable prompting:

Remove-SPSite "http://sharepoint/div/clm/int/A/" -Confirm:$false -GradualDelete

Here’s the kicker.  After the delete, the site collection is indeed still there.  It actually sticks around for the duration of the Recycle Bin retention period (default 30 days).  There’s one good way to see: let’s dive into the forbidden content database and have a peek:

SELECT [DeletionTime]
,[Id]
,[SiteId]
,[InDeletion]
,[Restorable]
FROM [Content_mydivision_a].[dbo].[SiteDeletion]
where (Restorable=1)

Restorable=1 indicates this site collection could be restored.

The solution?  Well, it’s not the Recycle Bin job; that has no effect on this.  There is a Gradual Delete job at the web application level, but that won’t help us either, at least not just yet.  First you have to use the Remove-SPDeletedSite cmdlet to remove each deleted site permanently.  Here’s the syntax:

remove-spdeletedsite f5f7639d-536f-4f76-8f94-57834d177a99 -confirm:$false

Ah, you don’t know your site collection GUIDs by heart? Well, me neither; I prefer a more useful allocation of brain cells, so here’s the command that will give you the site collection GUIDs that have been (partially) deleted:

get-spdeletedsite -webapplication "http://sharepoint/"

So, you’ve got your partially deleted GUIDs, and you diligently did a Remove-SPDeletedSite for each, but the Restore-SPSite still will not work.  Now’s the time to run the handy-dandy Gradual Delete timer job for your web application, in Central Admin, Monitoring.  The first thing you might notice is that the job takes a bit of time to run.  That’s good: it’s doing something for you, and actually triggering the content DB stored procedure called proc_DeleteSiteCoreAsync, which deletes in batches.
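
If you would rather kick that timer job off from PowerShell than from Central Admin, a sketch along these lines should work; I am assuming the job name contains "site-deletion", so verify with Get-SPTimerJob first:

$wa = Get-SPWebApplication "http://sharepoint/"
# find the Gradual Site Delete job for this web application and run it now
$job = Get-SPTimerJob -WebApplication $wa | Where-Object { $_.Name -like "*site-deletion*" }
$job | Start-SPTimerJob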

Here’s how to wipe out all these mildly annoying site collections from your recycle bin for a web application:

get-spdeletedsite -webapplication "http://my-srv-sp10/" | Remove-SPDeletedSite

At this point your Restore-SPSite will work against your target content database, and if you played SharePoint roulette like me and restored to random locations, a Move-SPSite will make fast work of putting things where they should be.
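
For completeness, the cleanup move looks something like this, reusing the URL and database name from the examples above (an IISRESET is generally recommended afterwards so the change is picked up):

Move-SPSite "http://sharepoint/div/clm/int/A" -DestinationDatabase "Content_mydivision_a"
iisreset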

More information on the Gradual Deletion timer job can be found in this TechNet article by Bill Baer.

SharePoint Group Management

Managing SharePoint Groups in PowerShell

SharePoint Groups are a great mechanism for managing user permissions; however, they exist within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer to use a CSV-fed approach to manage groups and users. I create a CSV with the name of the group and the users, which I list in pipe-separated format (commas are already being used for the CSV). To read in the CSV use:

Import-Csv "L:PowerShellAD and SP group mapping.csv"

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
$GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a no-op, throwing no errors.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the site collection entirely (the nuclear option). This wipes the user from the whole site collection, not just one group, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.Update()
}
catch
{
Write-Host -ForegroundColor Red "Could not remove $($MyUser) from $($RootWeb.Url)"
}

Here’s the full script, with flags to setting the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:PowerShellongoinglogfile.txt"
$ADMap= Import-Csv "L:PowerShellAD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups from a pipe-separated user list
$GrantGroupsRead = $true; #grants read at top rootweb level
$RemoveMembers = $false; # optionally removes the pipe-separated list of users from the associated group
$WipeMembers = $false;	# wipes the groups clean		
$WipeUsersOutOfSite = $false;  #The Nuclear option. Useful to eliminate AD groups used directly as groups
#we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
$MyMap[$ADMap[$i].SharePointGroup] = $ADMap[$i].ADGroup;
}
# Script changes the letter heading for each site collection
$envrun="Dev"			# selects environment to run in
if ($envrun -eq "Dev")
{
$siteUrl = "h ttp://DevServer/sites/"
$mylogfile="L:PowerShellongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(“,”)
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "ht tp://sharepoint/sites/"
$mylogfile="L:PowerShellongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(“,”)
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
Write-Host "script starting" 
$myheader = "STARTING: $(get-date)"
foreach ($letter in $LoopStringArr)
{
$SiteName=$siteurl+$letter
$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;
for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
{
$SPGroupName = $ADMap[$ADi].SharePointGroup;
if ($AddGroups)
{
if (!$Groups[$SPGroupName]) #no exist, so create
{
try
{
$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group members")
}
catch
{
Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
}
}
else
{
Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
}
} #endif Add Groups
if ($GrantGroupsRead)
{
$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
} #if the group exists in the first place
} #ActionFlagTrue
if ($AddMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToAdd = $ADMap[$ADi].ADGroup;
if ($usersToAdd.length -gt 0) #if no users to add, skip
{
$usersToAddArr = $usersToAdd.split("|")
foreach ($userName in $usersToAddArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot add user ($($userName) to $($GroupToAddTo)"
}
}
} #users to add
} #if the group exists in the first place
} #ActionFlagTrue
if ($RemoveMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToAdd = $ADMap[$ADi].ADGroup;   # same pipe-separated user list as for AddMembers
if ($usersToAdd.length -gt 0) #if no users to remove, skip
{
$usersToAddArr = $usersToAdd.split("|")
foreach ($userName in $usersToAddArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)  #removes the user from this group
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot add user ($($userName) to $($GroupToAddTo)"
}
}
} #users to add
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeMembers)  #Nukes all users in the group
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
foreach ($userName in @($GroupToAddTo.Users))   # snapshot the collection, since we remove members while iterating
{
try
{
$UserObj = $rootWeb.EnsureUser($userName.LoginName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)  #removes the user from this group
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot remove user ($($userName) to $($GroupToAddTo)"
}
}
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeUsersOutOfSite)  #Nukes the listed users from the entire site collection
{
$usersToNuke = $ADMap[$ADi].ADGroup;
if ($usersToNuke.length -gt 0) #if no users to add, skip
{
$usersToNukeArr = $usersToNuke.split("|")
foreach ($MyUser in $usersToNukeArr)
{
try
{
try
{
$user1 = $RootWeb.EnsureUser($MyUser)
}
catch
{
Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
}
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.update()
}
catch
{
Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
}
}
catch
{
Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
}
} #foreach user to nuke
} #if users to nuke
} #ActionFlagTrue
}
$rootWeb.dispose()
$site.dispose()
} #foreach site

Limiting Search Crawling to a subsite

I had an interesting challenge.  I was asked to limit search crawling to a single subsite.  The underlying issue was that a great deal of the security in this farm was implemented via audiences, which is not a secure method of locking down content: audiences merely target documents and items to particular users, but don’t prevent other users from actually accessing those documents or items.  Search content sources expect nice and simple web application URLs to crawl.  So how best to restrict crawling to a subsite?

The simple answer is to set up the content source to crawl the whole web application, but add crawl rules to exclude everything else.  Only two rules are needed:

  1. Include: List the site to include, such as “http://sharepoint/sites/site1/site2/*.*”
    Note the * at the end to ensure all sub-content is crawled.  Being the first crawl rule, this takes precedence over the next. Don’t forget the *.*
    It seems that testing a crawl rule with just a * will appear to capture all content, but at crawl time only a *.* will capture content with a file extension.
  2. Exclude: List everything else: http://*.*
    This will exclude anything not captured in the first rule.
  3. If you have a content source that includes people (sps3://sharepoint) be sure to use a wildcard on the protocol as well.

Voila!
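
If you prefer to script the two rules rather than click through Central Admin, a sketch along these lines should work; the search service application name and the URLs are placeholders:

$ssa = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
# Rule 1: include the subsite and everything beneath it
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "http://sharepoint/sites/site1/site2/*.*" -Type InclusionRule
# Rule 2: exclude everything else (check in Central Admin that this rule sits below the include rule)
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "http://*.*" -Type ExclusionRule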

Incrementing repeating table field values

How to increment a field for new rows added into a repeating table in InfoPath

It is really helpful to users to auto-populate an incrementing value in newly added rows. In InfoPath, XPath is used to specify calculated values. To count the number of rows that already exist, we can use the following expression:

count(../preceding-sibling::*)

This returns the zero-based count of existing rows, so the first row would be zero. To get a one-based count, simply use:

count(../preceding-sibling::*)+1

What if you want letters that increase instead? One way is to use the XPath translate() function, such as:

translate(count(../preceding-sibling::*), "012345", "ABCDEF")
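
For example, when the fourth row is added, count(../preceding-sibling::*) returns 3, and translate(3, "012345", "ABCDEF") yields "D". Note that translate() maps character by character, so this particular map only covers single-digit counts; extend both strings if you expect more rows.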

Uploading attachments to tasks in SharePoint

Uploading attachments to tasks in SharePoint programmatically

Uploading attachments to tasks in SharePoint programmatically can be done if the documents are stored in a structured way. The script described below assumes the documents are stored in folders that are named using the ID of the task in SharePoint.
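
For illustration, the expected folder layout looks something like this (the task IDs and file names are hypothetical):

D:\PROD\12\Contract.pdf
D:\PROD\12\FloorPlan.vsd
D:\PROD\13\Approval.docx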

For how to download attachments from a task list, please see: Downloading attachments from Task list

First, we get all files and folders, then we use the folder name to do a CAML Query lookup to get the Task by ID, then we binary upload the attachments.

$Web = Get-SPWeb "http: location" #SPWeb site location as URL
$fileDirectory = "D:\PROD";    # location holding the attachments
$spList = $web.lists["Tasks"]  #replace with the name of your task list
foreach($folder in Get-ChildItem $fileDirectory)
{
$folderID = $folder.Name
$spQuery = New-Object Microsoft.SharePoint.SPQuery;
$camlQuery = "<Where><Eq><FieldRef Name='TaskID'/><Value Type='Number'>$folderID</Value></Eq></Where>"
$spQuery.Query = $camlQuery
$processListItems = $spList.GetItems($spQuery)                                                                                                                             
$item = $processListItems[0];
$folderPath = $fileDirectory+"\"+$folder
foreach($file in Get-ChildItem $folderPath )
{
#$fileStream = ([System.IO.FileInfo] (Get-Item $File.FullName)).OpenRead()
$bytes = [System.IO.File]::ReadAllBytes($File.FullName)
$item.Attachments.Add([System.IO.Path]::GetFileName($File.FullName), $bytes)
$item.Update()
write-host "File Uploaded: $($File.FullName) -> $($item.ID)"
}
}
$web.Dispose()

How to download all attachments for all tasks in a list

Downloading all attachments for a SharePoint task list
Tasks can have attachments; in fact, they can have multiple attachments. However, these are stored in an “AttachmentCollection”. We can iterate through all items in the task list to download all attachments. What we do is create a folder for each of the items, and name the folder with the ID of the task.

$webUrl = "http:.."            # this is the URL of the SPWeb
$library = "Compliance Tasks"  # this is the SPList display name
$tempLocation = "D:\PROD"      # Local Folder to dump files
$s = new-object Microsoft.SharePoint.SPSite($webUrl)   
$w = $s.OpenWeb()        
$l = $w.Lists[$library]   
foreach ($listItem in $l.Items)
{
Write-Host "    Content: " $listItem.ID
$destinationfolder = $tempLocation + "\" + $listItem.ID         
if($listItem.Attachments.Count -gt 0)
{
if (!(Test-Path -path $destinationfolder))       
{           
$dest = New-Item $destinationfolder -type directory         
}
foreach ($attachment in $listItem.Attachments)   
{       
$file = $w.GetFile($listItem.Attachments.UrlPrefix + $attachment)       
$bytes = $file.OpenBinary()               
$path = $destinationfolder + "\" + $attachment
Write "Saving $path"
$fs = new-object System.IO.FileStream($path, "OpenOrCreate")
$fs.Write($bytes, 0 , $bytes.Length)   
$fs.Close()   
}
}
}
$w.Dispose()
$s.Dispose()

A folder for each task was created to allow for multiple attachments. The ID was applied to each folder, to allow a subsequent script to traverse and upload the attachments by ID, or for any linkage preservation.

For how to upload attachments from a task list, please see: Uploading attachments to tasks

Workflow error in SP2013 related to App Step

I had a situation where my workflows went to “Suspended” after a few minutes. I noticed this occurred when I used the “App Step”. Looking at the “Suspended” state, there was an “i” icon which, when clicked, showed this error:
Details: An unhandled exception occurred during the execution of the workflow instance. Exception details: System.ApplicationException: HTTP 401 {“error_description”:”The server was unable to process the request due to an internal error

Here is how the error actually consistently appears:
[Screenshot: the workflow shown in Suspended status]

It turns out the App Step will not work without specific configuration to grant the App Step rights.

So of course some permission issue was most probable, and, as I mentioned yesterday, I was suspicious of the App Step as the proximate cause.

So I configured the permissions as described in the link below:
https://www.dmcinfo.com/latest-thinking/blog/id/8661/create-site-from-template-using-sharepoint-2013-workflow

The description in the above link should make things clear: when an App Step runs, it uses app permissions that need to be granted first.

One more thing: we need to specify the scope URL, as below.

On the subweb, you will find the configuration page for app permissions at: [SPWeb URL]/_layouts/15/appinv.aspx
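
For reference, the grant on appinv.aspx is typically done by looking up the workflow app by name and pasting in a permission request XML along these lines; the web-scoped FullControl request shown here follows the commonly documented pattern for workflow App Steps, so adjust the scope if you need less:

<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="FullControl" />
</AppPermissionRequests>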

You can see it set up from here:
[site URL]/_layouts/15/appprincipals.aspx?Scope=Web

Here is the feature to enable:
[Screenshot: the site feature to activate so that workflows can use app permissions]

For App Step permissions, it is the third link under Permissions in Site Settings.

Here is how the specific App Step permission appears when viewed from Site Settings, App Step Permissions:
[Screenshot: the specific App Step permission definition]

With the permissions granted, the workflows should then work on retry. You can retry from the workflow status page shown at the top of this post.