SharePoint focused PowerShell automation

Restore SharePoint document timestamp and author from feedfile

Often an administrator, during maintenance or when checking in a document for a user, “stomps” on a document's timestamp and who last edited it. In a perfect world we take the time to restore the authorship and timestamp. Here’s a script that reads in a CSV (the feedfile) of the URL, timestamp, and user for any number of documents to correct. It will also try to remove the previous, incorrect version, if possible.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$ActionList = Import-Csv "C:\scripts\NameDateTag.csv"
for ($Ai=0; $Ai -lt $ActionList.Count; $Ai++)
{
$ActionRow = $ActionList[$Ai]
$docurl = $ActionRow.DocURL;
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$list = $item.ParentList
[System.DateTime] $dat = Get-Date $ActionRow.Timestamp
$usr = $web.EnsureUser($ActionRow.Editor)
$item["Modified"] = $dat;
$item["Editor"] = $usr;
$item.Update()
try { $item.Versions[1].Delete() } catch {Write-Host -ForegroundColor Red "Error (1) could not delete old version of $($item['Name'])"}
$web.Dispose()
$site.Dispose()
}
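For reference, a feedfile for this script might look like the following. The column names (DocURL, Timestamp, Editor) are the properties the script reads; the rows are purely illustrative:

```csv
DocURL,Timestamp,Editor
http://sharepoint/sites/A/Docs/Budget.xlsx,2014-03-12 09:15,DOMAIN\jsmith
http://sharepoint/sites/B/Docs/Plan.docx,2014-04-02 16:40,DOMAIN\mjones
```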

Gradual Site Collection Deletion


I had a mission to achieve overnight; move 28 site collections into a new managed path, as well as rename the 28 associated content databases.   Pretty straightforward to do, and to script in advance to run in stages:

  1. Backup site collections
  2. Delete site collections
  3. Dismount content databases
  4. Rename content databases and shuffle around underlying storage including FILESTREAM RBS location
  5. Create the new managed path; reset IIS
  6. Remount 28 content DBs
  7. Delete the old managed path
  8. Restore the 28 site collections (this is where tinkling glass is heard, followed by painful silence)

After enough Jolt cola to get that turtle to beat the hare fair and square, I got to the bottom of it. First, the problem: the site collections could not be restored into their original database, because each site collection purportedly already existed there. Even though it had been deleted.

By midnight I gave up, and did the 28 Restore-SPSite into any random content databases, tossing organization and structure to the winds (temporarily), knowing I’ve got gads of available storage, knowing once I got to the bottom of the issue, a simple set of move-spsite commands would set things right.  No, I don’t like randomness, but I also like a few winks of sleep and happy users…

Now the cause. Since SharePoint 2010 SP1 (and continuing in SharePoint 2013), SharePoint has the ability to recover deleted site collections (not through the UI, but only through PowerShell). I used the -GradualDelete option, thinking I would be nice and gentle with a production farm. Here’s a sample of my delete commands, where I also disable prompting:

Remove-SPSite "http://sharepoint/div/clm/int/A/" -Confirm:$false -GradualDelete

Here’s the kicker. After the delete, the site collection is indeed still there. It sticks around for the duration of the Recycle Bin retention period (default 30 days). There’s one good way to see: let’s dive into the forbidden content database and have a peek:

SELECT [DeletionTime]
,[Id]
,[SiteId]
,[InDeletion]
,[Restorable]
FROM [Content_mydivision_a].[dbo].[SiteDeletion]
where (Restorable=1)

Restorable=1 indicates this site collection could be restored.

The solution? Well, it’s not the Recycle Bin job; that has no effect on this. There is a Gradual Delete job at the web application level, but that won’t help us either, at least not just yet. First you have to use the Remove-SPDeletedSite cmdlet to remove each site permanently. Here’s the syntax:

remove-spdeletedsite f5f7639d-536f-4f76-8f94-57834d177a99 -confirm:$false

Ah, you don’t know your Site Collection GUIDs by heart? Well, me neither; I prefer a more useful allocation of brain cells. So here’s the command that will give you the GUIDs of the site collections that have been (partially) deleted:

get-spdeletedsite -webapplication "http://sharepoint/"

So, you’ve got your partially deleted GUIDs, you diligently did a Remove-SPDeletedSite for each, but Restore-SPSite still will not work. Now’s the time to run the handy-dandy Gradual Delete timer job for your web application, in Central Admin under Monitoring. The first thing you might notice is that the job takes a bit of time to run. That’s good: it’s doing something for you, actually triggering the content database stored procedure called proc_DeleteSiteCoreAsync, which deletes in batches.
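The same timer job can also be kicked off from PowerShell rather than Central Admin. A minimal sketch, assuming the job's display name contains "Gradual" (verify the name in your farm before relying on the filter):

```powershell
# Find and start the Gradual Delete timer job for one web application
$wa = Get-SPWebApplication "http://sharepoint/"
$job = $wa.JobDefinitions | Where-Object { $_.DisplayName -like "*Gradual*" }
if ($job) { $job | Start-SPTimerJob }
```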

Here’s how to wipe out all these mildly annoying site collections from your recycle bin for a web application:

get-spdeletedsite -webapplication "http://my-srv-sp10/" | Remove-SPDeletedSite

At this point your Restore-SPSite will work to your target content database, and if you played SharePoint Roulette like me and restored to a random location, a move-SPSite will make fast work of putting things where they should be.
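The Move-SPSite cleanup mentioned above is one command per misplaced site collection; the URL and database name below are placeholders, not values from my farm:

```powershell
# Move a misplaced site collection into its intended content database
Move-SPSite "http://sharepoint/sites/A" -DestinationDatabase "Content_mydivision_a" -Confirm:$false
# an IISRESET on the web servers is typically needed afterward for the move to be fully visible
```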

More information on the Gradual Deletion Timer Job can be found in this Technet Article by Bill Baer

SharePoint Group Management

Managing SharePoint Groups in PowerShell

SharePoint Groups are a great mechanism for managing user permissions; however, they exist within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer to use a CSV-fed approach to manage groups and users. I create a CSV with the name of the group and the users, which I list in pipe-separated format (commas are already used as the CSV delimiter). To read in the CSV:

Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
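The mapping file might look like this. The column names (SharePointGroup, ADGroup) match what the full script at the end reads; the rows are illustrative, with multiple users pipe-separated:

```csv
SharePointGroup,ADGroup
Finance Members,DOMAIN\finance-users|DOMAIN\jsmith
HR Members,DOMAIN\hr-users
```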

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
$GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a null-op, throwing no errors.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the site collection entirely. This wipes the user from the whole site collection, not just one group, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.Update()
}
catch
{
Write-Host -ForegroundColor Red "Could not remove $($MyUser) from $($RootWeb.Url)"
}

Here’s the full script, with flags to setting the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$ADMap= Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups, pipe-separated list
$GrantGroupsRead = $true; #grants read at top rootweb level
$RemoveMembers = $false; # optionally removes a pipe-separated list of users from the associated group
$WipeMembers = $false;	# wipes the groups clean		
$WipeUsersOutOfSite = $false;  #The Nuclear option. Useful to eliminate AD groups used directly as groups
#we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
$MyMap[$ADMap[$i].SharePointGroup] = $ADMap[$i].ADGroup;
}
# Script changes the letter heading for each site collection
$envrun="Dev"			# selects environment to run in
if ($envrun -eq "Dev")
{
$siteUrl = "http://DevServer/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://sharepoint/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
Write-Host "script starting" 
$myheader = "STARTING: $(get-date)"
foreach ($letter in $LoopStringArr)
{
$SiteName=$siteurl+$letter
$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;
for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
{
$SPGroupName = $ADMap[$ADi].SharePointGroup;
if ($AddGroups)
{
if (!$Groups[$SPGroupName]) #no exist, so create
{
try
{
$Groups.Add($SPGroupName, $Owner, $Site.Owner, "SharePoint Group to hold AD group members")
}
catch
{
Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
}
}
else
{
Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
}
} #endif Add Groups
if ($GrantGroupsRead)
{
$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
$MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
$MyAcctrole = $RootWeb.RoleDefinitions["Read"]
$MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
$RootWeb.RoleAssignments.Add($MyAcctassignment)
} #if the group exists in the first place
} #ActionFlagTrue
if ($AddMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToAdd = $ADMap[$ADi].ADGroup;
if ($usersToAdd.length -gt 0) #if no users to add, skip
{
$usersToAddArr = $usersToAdd.split("|")
foreach ($userName in $usersToAddArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot add user $($userName) to $($GroupToAddTo)"
}
}
} #users to add
} #if the group exists in the first place
} #ActionFlagTrue
if ($RemoveMembers)
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
$usersToAdd = $ADMap[$ADi].ADGroup;
if ($usersToAdd.length -gt 0) #if no users to add, skip
{
$usersToAddArr = $usersToAdd.split("|")
foreach ($userName in $usersToAddArr)
{
try
{
$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
$GroupToAddTo.RemoveUser($UserObj)
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
}
}
} #users to add
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeMembers)  #Nukes all users in the group
{
$GroupToAddTo = $Groups[$SPGroupName]
if ($GroupToAddTo) #if group exists
{
foreach ($userName in @($GroupToAddTo.Users))  #snapshot the collection, since we remove while iterating
{
try
{
if ($userName)
{
$GroupToAddTo.RemoveUser($userName)
}
}
catch
{
Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
}
}
} #if the group exists in the first place
} #ActionFlagTrue
if ($WipeUsersOutOfSite)  #Nukes the listed users from the entire site collection
{
$usersToNuke = $ADMap[$ADi].ADGroup;
if ($usersToNuke.length -gt 0) #if no users to add, skip
{
$usersToNukeArr = $usersToNuke.split("|")
foreach ($MyUser in $usersToNukeArr)
{
try
{
try
{
$user1 = $RootWeb.EnsureUser($MyUser)
}
catch
{
Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
}
try
{
$RootWeb.SiteUsers.Remove($MyUser)
$RootWeb.update()
}
catch
{
Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
}
}
catch
{
Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
}
} #foreach user to nuke
} #if users to nuke
} #ActionFlagTrue
}
$rootWeb.dispose()
$site.dispose()
} #foreach site

Uploading attachments to tasks in SharePoint

Uploading attachments to tasks in SharePoint programmatically

Uploading attachments to tasks in SharePoint programmatically can be done if the documents are stored in a structured way. The script described below assumes the documents are stored in folders that are named using the ID of the task in SharePoint.

For how to download attachments from a task list, please see: Downloading attachments from Task list

First, we get all files and folders, then we use the folder name to do a CAML Query lookup to get the Task by ID, then we binary upload the attachments.

$Web = Get-SPWeb "http: location" #SPWeb site location as URL
$fileDirectory = "D:\PROD";    # location holding the attachments
$spList = $web.lists["Tasks"]  #replace with the name of your task list
foreach($folder in Get-ChildItem $fileDirectory)
{
$folderID = $folder.Name
$spQuery = New-Object Microsoft.SharePoint.SPQuery;
$camlQuery = "<Where><Eq><FieldRef Name='TaskID'/><Value Type='Number'>$folderID</Value></Eq></Where>"
$spQuery.Query = $camlQuery
$processListItems = $spList.GetItems($spQuery)                                                                                                                             
$item = $processListItems[0];
$folderPath = $fileDirectory+"\"+$folder
foreach($file in Get-ChildItem $folderPath )
{
#$fileStream = ([System.IO.FileInfo] (Get-Item $File.FullName)).OpenRead()
$bytes = [System.IO.File]::ReadAllBytes($File.FullName)
$item.Attachments.Add([System.IO.Path]::GetFileName($File.FullName), $bytes)
$item.Update()
write-host "File Uploaded: $($File.FullName) -> $($item.ID)"
}
}
$web.Dispose()

How to download all attachments for all tasks in a list

Downloading all attachments for a SharePoint task list
Tasks can have attachments, in fact they can have multiple attachments. However these are stored in an “AttachmentCollection”. We can iterate through all items in the task list to download all attachments. What we do is create a folder for each of the items, and name the folder by the ID of the task.

$webUrl = "http:.."            # this is the URL of the SPWeb
$library = "Compliance Tasks"  # this is the SPList display name
$tempLocation = "D:\PROD"      # Local Folder to dump files
$s = new-object Microsoft.SharePoint.SPSite($webUrl)   
$w = $s.OpenWeb()        
$l = $w.Lists[$library]   
foreach ($listItem in $l.Items)
{
Write-Host "    Content: " $listItem.ID
$destinationfolder = $tempLocation + "\" + $listItem.ID         
if($listItem.Attachments.Count -gt 0)
{
if (!(Test-Path -path $destinationfolder))       
{           
$dest = New-Item $destinationfolder -type directory         
}
foreach ($attachment in $listItem.Attachments)   
{       
$file = $w.GetFile($listItem.Attachments.UrlPrefix + $attachment)       
$bytes = $file.OpenBinary()               
$path = $destinationfolder + "\" + $attachment
Write "Saving $path"
$fs = new-object System.IO.FileStream($path, "OpenOrCreate")
$fs.Write($bytes, 0 , $bytes.Length)   
$fs.Close()   
}
}
}
$w.Dispose()
$s.Dispose()

A folder for each task was created to allow for multiple attachments. The ID was applied to each folder, to allow a subsequent script to traverse and upload the attachments by ID, or for any linkage preservation.

For how to upload attachments from a task list, please see: Uploading attachments to tasks

Setting a Site Collection to not be read-only

How to set a site collection as not read-only

Is your site collection read-only?

It is critical to be able to clear a site collection’s read-only status. A site collection can get stuck read-only if a backup is interrupted, since an SPSite is made read-only temporarily during backups.

$site=Get-SPSite "http://sharepoint/managedpath/sitename"
$site.set_ReadOnly($false)

To turn it back to read-only:

$site=Get-SPSite "http://sharepoint/managedpath/sitename"
$site.set_ReadOnly($true)

Working with Web Parts in PowerShell

Working with web parts programmatically across pages is awkward but possible in PowerShell. Let’s start by generating a report of the web parts. This iterates through non-admin web apps, site collections, and pages:

$oContentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService;
[Microsoft.SharePoint.Administration.SPWebApplicationCollection]$waCollection = $oContentService.WebApplications;
$log = ".\results.txt"    # output file name and path
$pagepath = "/default.aspx"    # you can change page name or page path  
"Site URL; WebPart Title ; Webpart ID" | out-file $log
$waCollection1 = $waCollection | where-object {$_.IsAdministrationWebApplication -eq $FALSE}
foreach ($wa in $waCollection1)
{
foreach ($obj in $wa.Sites) 
{
$siteURL = $obj.URL
write-host "Processing site: " , $siteURL
$site=new-object Microsoft.SharePoint.SPSite($siteURL)
$pageURL = $siteURL + $pagepath  
$web=$site.Openweb()   
$webpartmanager=$web.GetLimitedWebPartManager($pageURL,  [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)   
foreach ($webpart in $webpartmanager.WebParts)
{   
$siteURL + "; " + $webpart.Title + " ;  " + $webpart.ID | out-file $log -append   
}                           
$web.Dispose()
$site.Dispose()
}
}

As an example, we can remove web parts programmatically, by specifying the site collection and Web Part GUID:

$siteURL = "http://sharePoint/sites/specialsite";  # first constant: site URL
$webpartId = "<guid of your choice for webpart>";   # second argument: webpart GUID
$pagepath =  "/default.aspx"        # change page name or page path here
$pageURL = $siteURL + $pagepath
write-host "Processing site: ", $siteURL
Write-host "Processing page: " , $pageURL
write-host "Processing webpart ID: " , $webpartID
$site=new-object Microsoft.SharePoint.SPSite($siteURL)
$web=$site.Openweb()
$webpartmanager=$web.GetLimitedWebPartManager($pageURL, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
$webpartmanager.DeleteWebPart($webpartmanager.Webparts[$webpartId])
$web.Update()
$web.Dispose()
write-host "Finished."

Changing the page for a SharePoint Library View

Views are easily created by end users, and can be created through automation. I recently had to change the underlying aspx page for an existing view. First I tried using the SPView method SetViewXml(), however that does not work, as many other programmers have discovered to their chagrin.

The approach that works is to clone the view, then rename the title. In this case, I am trying to create AllItems.aspx, so I clone the view to that name then rename it:

$NewView = $SourceView.clone("AllItems",100,$true,$true)
$NewView.title = "All Documents";
$NewView.update();
$Views.delete($SourceView.id);

The script below goes one better. It finds the extra “All Documents” views, and deletes all the ones that are duplicate, preserving the original (latest) View. It iterates through the collection downward deleting as it goes.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$mylogfile="ongoinglogfile.txt"
$envGUID = $(Get-SPFarm).id
if		($envGUID -eq "c25ca1f1-6d8f-4c59-86e3-d64821a3bcbb" )	{$env = "Dev"}
elseif	($envGUID -eq "8e51b3e2-1ed7-4549-ad51-c5c43b065d2c" )	{$env = "Prod"}
else	{$env = "Unknown"}
if ($env -eq "Unknown")
{
Write-Host -ForegroundColor darkred "UNKNOWN FARM in $($MyInvocation.MyCommand.Definition)"
Add-Content $mylogfile "UNKNOWN FARM in $($MyInvocation.MyCommand.Definition)"
exit
}
elseif ($env -eq "Dev")
{
#$MatchStr="http://sharepointdev/Sites/*"  
$MatchStr="http://sharepointdev/Sites/2015"  
$sitesArrNames = "http://sharepointdev/Sites/2015"
$sitesArr = $sitesArrNames.split(",");
}
elseif ($env -eq "Prod")
{
$MatchStr="http://sharepoint/Sites/2015"  
$sitesArrNames = "http://sharepoint/Sites/2015"
$sitesArr = $sitesArrNames.split(",");
}
$libsArrStr="Lib1,lib2,Documents,Shared Documents"
$LibsArr=$libsArrStr.split(",")
write-host  "STARTING: $(get-date) Script: $($MyInvocation.MyCommand.Definition)"
Add-Content $mylogfile "STARTING: $(get-date) Script: $($MyInvocation.MyCommand.Definition)"
foreach ($SiteName in $sitesArr)
{
$site = get-spsite $SiteName
if ($site.url -like $MatchStr)
{
$webs=$Site.AllWebs
$webcount = $Site.AllWebs.Count
for ($i=0; $i -lt $webcount; $i++)
{
$TargetWeb=$web=$webs[$i]  #$TargetWeb is used by my standard routines
Write-Host "==>working in $($web.url)"
Add-Content $mylogfile "==>working in $($web.url)"
$lists=$web.lists;
for ($k=0; $k -lt $LibsArr.count; $k++)
{
$libstr = $LibsArr[$k];
$JPLib = $web.Lists.TryGetList($libStr)  #Extra JPFolder is for the override on routines with lib
if($JPLib -ne $null)  #optional, can filter to ensure ($JPLib.ContentTypesEnabled)
{
write-host "Analyzing $($JPLib.title) in $($web.url)"
Add-Content $mylogfile "Analyzing $($JPLib.title) in $($web.url)"
$Views = $JPLib.views;
$foundOne=$false;
for ($vi=$views.count-1; $vi -ge 0; $vi--)
{
$View = $views[$vi];
if ($View.title -eq "All Documents")
{
if ($foundOne -eq $true)  #there was one found earlier
{
$views.Delete($view.id);
write-host "Deleted a view in $($JPLib.title) in $($web.url)"
Add-Content $mylogfile "Deleted a view in $($JPLib.title) in $($web.url)"
}
else
{
$foundOne=$true;
$SourceView = $view;
}
}
}
if ($FoundOne)
{
if ($SourceView.schemaxml.contains("Documents.aspx"))
{
$NewView = $SourceView.clone("AllItems",100,$true,$true)
$NewView.title = "All Documents";
$NewView.update();
$Views.delete($SourceView.id);
write-host "Cloned and deleted a view in $($JPLib.title) in $($web.url)"
Add-Content $mylogfile "Cloned and deleted a view in $($JPLib.title) in $($web.url)"
}
}
}
}
$web.Dispose()
} #SPWeb processing
} #if $true/Siteurl is not null, if environment setup is valid
$site.Dispose()
} #foreach site

Migrating documents via SFTP

A previous post covered how to programmatically download documents via FTP: How to download from FTP programmatically
If FTP over SSL is needed, that’s just a property to enable SSL:

$request = [Net.WebRequest]::Create($url)
$request.EnableSsl = $true  #enable SSL

Sometimes there is a need to access and download documents via SFTP. That’s a completely different beast. To do that, I utilize the open source WinSCP. Both the .exe and DLL are needed, and can be co-located with the script.  Read more about WinSCP.

In PowerShell, just load the type library:

Add-Type -Path "WinSCPnet.dll"

Then set up the session. Below I chose to use the actual SSH Host Key Fingerprint:

$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
$sessionOptions.HostName = "sftp.myDomain.com"
$sessionOptions.UserName = "username"
$sessionOptions.Password = 'password'
$sessionOptions.SshHostKeyFingerprint = "ssh-rsa 1024 96:9b:ed:1f:66:8b:13:64:c3:ed:11:e0:27:68:62:67"

If you don’t want to bother confirming the crypto key, just set this property instead:

$sessionOptions.GiveUpSecurityAndAcceptAnySshHostKey = $true

Then create a new session and open it:

$session = New-Object WinSCP.Session
$session.Open($sessionOptions)

Note $session.output contains all the useful FTP transactions, which you can log.
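For example, to append those transactions to a logfile (the path here is illustrative, not part of the original script):

```powershell
# $session.Output holds the session's protocol exchanges as strings
$mylogfile = "L:\PowerShell\ongoinglogfile.txt"
$session.Output | Add-Content $mylogfile
```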

You also have the option to capture debugging information and set the debugging level:

$session.DebugLogPath = "D:\plautj\mypath"
$session.SessionLogPath = "D:\plautj\mypath2"
$session.DebugLevel = 1

Once the connection is established, use the session to:
1. Capture the directory listing: $directory = $session.ListDirectory($FTPDir)
2. Download files: $session.GetFiles($remotePath, $localPath).Check()
3. Delete files: $session.RemoveFiles($remotePath).Check()

Below is the full script to connect to SFTP, download all files, and delete them from the FTP Server:

$DeleteSource = $true;
$DestLocation = "\\DestinationServer\Location\";
$FTPDir = "/USERS/MyDir"
#todo
$ok = $true;
try
{
$ok = test-path $destLocation;
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to reach destination location $($destLocation)"
}
if ($ok)
{
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"  # requires script co-located DLL
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to acquire types from the WinSCPnet.dll"
}
}
if ($ok)
{
try
{
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
$sessionOptions.HostName = "sftp.SomeDomain.com"
$sessionOptions.UserName = "userID"
$sessionOptions.Password = 'Password'
$sessionOptions.SshHostKeyFingerprint = "ssh-rsa 1024 96:9b:ed:1f:66:8b:13:64:c3:ed:11:e0:27:68:62:67"
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to open SFTP connection"
}
}
if ($ok)
{
try #to get the directory listing
{
$directory = $session.ListDirectory($FTPDir)
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to get FTP Directory $($FTPDir)"
}
}
if ($ok)
{
try # to download each file that is not itself a directory
{
foreach ($f in $Directory.Files)
{
if (!$f.IsDirectory)
{
try
{
$RemotePath = "$($FTPDir)/$($f.name)"
$LocalPath  =  "$($DestLocation)$($f.name)"
$LocalPath  = $LocalPath.trim()
$session.GetFiles($remotePath, $localPath).Check()
write-host -ForegroundColor darkgreen "Downloaded file from $($RemotePath) to $($LocalPath)"
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to download file from $($RemotePath) to $($LocalPath)"
}
}
}
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Generic failure Failed to download file from FTP Directory $($FTPDir)"
}
}
if ($ok)
{
if ($DeleteSource)
{
foreach ($f in $Directory.Files)
{
if (!$f.IsDirectory)
{
try # try to delete each FTP file that is not a directory
{
$RemotePath = "$($FTPDir)/$($f.name)"
$LocalPath  =  "$($DestLocation)$($f.name)"
$LocalPath  = $LocalPath.trim()
$session.RemoveFiles($remotePath).Check()
write-host -ForegroundColor darkgreen "Deleted file $($RemotePath) from the FTP server"
}
catch
{
$ok=$false;
write-host -ForegroundColor darkred "Failed to delete file $($RemotePath) from the FTP server"
}
}
}
}
}
if ($session) { $session.Dispose() }  # always release the WinSCP session when done

Crisply report on script duration

While scripts can get written in a jiffy, it’s best to make them usable and functional as a foundation for operability.

Some scripts can take time to execute.  As a habit, I tend to build into my scripts a crisp and clear report on script duration; both to console and to a logfile.

Let’s declare a logfile, and output the start of the script; let’s capture the start time of the script run, and output it to console and logfile at the start of the script run:

$mylogfile = "L:\PowerShell\ongoinglogfile.txt"
$startTime = Get-Date
write-host  "STARTING: $($startTime) Script: $($MyInvocation.MyCommand.Definition)"
Add-Content $mylogfile "STARTING: $($startTime) Script: $($MyInvocation.MyCommand.Definition)"

At the end, let’s do the same; output the duration of the script run to console and logfile:

$endtime = Get-Date
$elapsed = $endtime.Subtract($starttime)
Write-Host "Started script at $($starttime), ended script at $($endtime), duration of $($elapsed.Hours) hours, $($elapsed.Minutes) minutes and $($elapsed.Seconds) seconds"
Add-Content $mylogfile "Started Script: $($MyInvocation.MyCommand.Definition) at $($starttime), ended script at $($endtime), duration of $($elapsed.Hours) hours, $($elapsed.Minutes) minutes and $($elapsed.Seconds) seconds"

Simple as that. Reuse wherever needed.