Thursday, September 27, 2012

Export to Excel and Email SharePoint Lists in a Timer Job


Problem Summary: SharePoint lists need to be exported as Excel worksheets and emailed to users on a scheduled basis so that they can be accessed from any device at any time.

The goal was to reach team members who do not have direct access to the list through a browser because of the nature of their jobs, and also because people generally find it much simpler to have information pushed to their inboxes than to log on to a portal and click through several pages to get to it.

For example, a busy Sales Executive who travels a lot and typically carries only a mobile device.

Even though the OOTB Export to Excel is a great feature, it cannot be called over an API.  

Solution:

We had to write a custom timer job that would do the following:

- Run on a schedule and find all the lists that need to be emailed as Excel worksheets (the list names are read from a Config list). These lists may be spread across several Site Collections, so the timer job reads each list name from the Config list.

- Find the Views that need to be exported for each such list (the view name needs to be read from the Config list). 

- Export each list to a separate Excel worksheet, using the view to determine which columns are included.

- Either save the exported Excel Worksheet to a document library, or send the worksheet as an email, with a custom Email Message (the Email To, Subject and Body could be read from the Config list). 

- We also added a feature where the timer job dynamically adds columns that don't exist in the view, such as an Item URL, in case the user wants to open the item directly by clicking the link.

Code for the timer job can be accessed here. The code deploys the timer job and feature to all Site Collections; all you need to do is activate the feature on the Site Collections you want the timer job to run on. Activating the feature provisions the Config list, in which you can specify all the fields mentioned above. The deployment also copies a template Excel file to a folder under the Layouts folder, which is required for the Export to Excel utility to work.
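
In case the download link above is not handy, here is a minimal sketch of what the timer job's Execute method looks like. The config list name, column names and the two helper methods are assumptions for illustration, not the actual deployed code:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class ExportListsToExcelJob : SPJobDefinition
{
    public ExportListsToExcelJob() : base() { }

    public ExportListsToExcelJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        // Walk every site collection in the web application and look for the Config list
        // provisioned by the feature (list and column names below are assumed).
        foreach (SPSite site in this.WebApplication.Sites)
        {
            using (site)   // each SPSite returned by the enumerator must be disposed
            {
                SPWeb web = site.RootWeb;   // owned by the SPSite, so no separate dispose needed

                SPList configList = web.Lists.TryGetList("ExportToExcelConfig");
                if (configList == null) continue;   // feature not activated on this site collection

                foreach (SPListItem config in configList.Items)
                {
                    SPList list = web.Lists.TryGetList(config["ListName"] as string);
                    if (list == null) continue;

                    SPView view = list.Views[config["ViewName"] as string];

                    // Export the view's columns (plus any extra columns such as Item URL)
                    // to a worksheet, then mail it using the To/Subject/Body from the config item.
                    string filePath = ExportViewToExcel(list, view);
                    SendWorkbook(config["EmailTo"] as string,
                                 config["EmailSubject"] as string,
                                 config["EmailBody"] as string,
                                 filePath);
                }
            }
        }
    }

    private string ExportViewToExcel(SPList list, SPView view)
    {
        // Placeholder: write the view's rows into the template .xlsx and return the file path.
        throw new NotImplementedException();
    }

    private void SendWorkbook(string to, string subject, string body, string filePath)
    {
        // Placeholder: attach the workbook and send it via System.Net.Mail using the outbound SMTP server.
        throw new NotImplementedException();
    }
}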

Thursday, August 2, 2012

Powershell Script to download attachments on a SharePoint List into a file share

Problem Description: Download all attachments on a SharePoint list to a file share. Each list item's set of attachments needs to go into its own folder, with the item ID as the folder name.

Solution: The PowerShell script shown below was used. It took about 5 minutes to run on a list with 3,000 items, each containing between 1 and 5 attachments, with average attachment sizes between 1 and 5 MB.



Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webUrl = "Url"                  # Url of the web containing the list
$library = "ListName"            # Name of the list whose attachments will be downloaded
$tempLocation = "FolderPath"     # Local folder to dump the files into

$s = New-Object Microsoft.SharePoint.SPSite($webUrl)
$w = $s.OpenWeb()
$l = $w.Lists[$library]

foreach ($listItem in $l.Items)
{
    Write-Host "    Item ID: " $listItem.ID
    $destinationfolder = $tempLocation + "\" + $listItem.ID
    if (!(Test-Path -Path $destinationfolder))
    {
        $dest = New-Item $destinationfolder -Type directory
    }
    foreach ($attachment in $listItem.Attachments)
    {
        $file = $w.GetFile($listItem.Attachments.UrlPrefix + $attachment)
        $bytes = $file.OpenBinary()
        $path = $destinationfolder + "\" + $attachment
        Write-Host "Saving $path"
        $fs = New-Object System.IO.FileStream($path, "OpenOrCreate")
        $fs.Write($bytes, 0, $bytes.Length)
        $fs.Close()
    }
}

$w.Dispose()
$s.Dispose()

Reference: Thanks to this blog post; I tweaked the script it describes for the purpose above.



Wednesday, June 20, 2012

Configurable SharePoint solution for Sending Email Alerts upon Document/List Item Expiration


Problem: We needed to develop a configurable SharePoint solution that would do the following:
- Be deployable to any SharePoint site as an Event Receiver
- Once the Event Receiver (ER) is attached to a List/Document Library, send email alerts a certain number of days before the Document/List Item expiration date that the user enters upon document upload/list item creation

Why Event Receiver? Explained at the end of the post.

Solution: The Event Receiver, fired after a list item has been added, does the following:
1) Opens a configuration list; let's call it the DocumentExpirationConfig list.
2) Finds the name of the list it is currently fired on using properties.ListTitle, and then, using a CAML query, finds the item in DocumentExpirationConfig that corresponds to that list.
3) Once that record has been found, the ER reads the name of the column that holds the Document Expiration Date, subtracts the number of days (also read from the configuration list) before which the alert should be sent, and sets the value of the destination column (let's call it EmailAlertDate) to the resulting date.

For example: Document Expiration Date - 90 days (the number of days before expiration to send the alert) = EmailAlertDate
The Document Expiration Config List is as shown in the screenshot below:


As you can tell, the DocumentExpirationConfig makes the ER configurable and deployable to any list in the site, without requiring any code changes.

After a record has been added, and the ER fired on it, it would look something like the screenshot below:


The code for the ER is shown below; simply call this method from the ItemAdded and ItemUpdated event handlers (a wiring sketch follows the method):
public void UpdateExpirationAlertColumn(SPItemEventProperties properties, bool isadded)
{
    try
    {
        SPList _configlist = properties.Web.Lists.TryGetList("DocumentExpirationConfig");
        string _documentexpirationcolumn = string.Empty;
        string _expirationalertcolumn = string.Empty;
        if (_configlist != null)
        {
            // Find the config item for the list this event fired on.
            // (The CAML markup was stripped by the blog engine; 'Title' is assumed to be
            // the config column that stores the list name.)
            string _camlQuery = "<Where><Eq><FieldRef Name='Title' /><Value Type='Text'>" +
                                properties.ListTitle + "</Value></Eq></Where>";
            SPQuery _query = new SPQuery();
            _query.Query = _camlQuery;
            SPListItemCollection _listitems = _configlist.GetItems(_query);

            if (_listitems.Count > 0)
            {
                _documentexpirationcolumn = _listitems[0]["DocumentExpirationColumn"].ToString();
                _expirationalertcolumn = _listitems[0]["ExpirationAlertColumn"].ToString();

                if (properties.ListItem[_documentexpirationcolumn] != null)
                {
                    DateTime _documentexpiratedate = DateTime.Parse(properties.ListItem[_documentexpirationcolumn].ToString());
                    // 60-day offset shown here; the number of days can also be read from the config list.
                    properties.ListItem[_expirationalertcolumn] = _documentexpiratedate.AddDays(-60).ToShortDateString();
                    properties.ListItem.SystemUpdate();
                }
            }
        }
    }
    catch (Exception ex)
    {
        EventLog m_EventLog = new EventLog("Application");
        m_EventLog.Source = "Error updating Document Expiration Date";
        m_EventLog.WriteEntry("Error updating Document Expiration Date" + ex.Message + ex.StackTrace,
            EventLogEntryType.FailureAudit);
    }
}
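
And a minimal sketch of the wiring, assuming a standard SPItemEventReceiver class (the class name here is illustrative):

using Microsoft.SharePoint;

public class DocumentExpirationEventReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        base.ItemAdded(properties);
        UpdateExpirationAlertColumn(properties, true);
    }

    public override void ItemUpdated(SPItemEventProperties properties)
    {
        base.ItemUpdated(properties);

        // SystemUpdate() inside the method fires ItemUpdated again, so suspend
        // event firing while the alert column is being set.
        this.EventFiringEnabled = false;
        try
        {
            UpdateExpirationAlertColumn(properties, false);
        }
        finally
        {
            this.EventFiringEnabled = true;
        }
    }

    // The UpdateExpirationAlertColumn method shown above lives in this class as well.
}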

- Once the ER has finished updating the EmailAlertDate, the next step is to find a way to send an alert about the expiration of the document/list item to the user who created it. This could be done in one of two ways:

Approach #1) Create a timer job that runs each night, figures out all the documents/list items expiring that day, and sends an alert to users.
Approach #2) Create a retention policy that kicks off a workflow, which checks whether today's date equals the EmailAlertDate and, if so, sends the alert to users informing them about the contract expiration.

I chose the second approach, as shown in the screenshot below:


Why an Event Receiver?: In case you are wondering why I chose to add an Event Receiver, here are the reasons:
First of all, SharePoint does not allow you to kick off a retention policy a certain number of days BEFORE a datetime value (read from a column such as the Document Expiration Date column described above). As you can see in the screenshot above, there is a '+' sign but no '-' sign, so we cannot, for example, kick off the policy 30 days before the Document Expiration Date (I wish we could :-)).
You might ask, why not create a calculated column with the same value set by the Event Receiver? The answer is: yes, you could very well set the value of a calculated column without any need for an ER, but SharePoint will not let you kick off a retention policy on it; it does not show up in the drop-down list of columns on the Retention Stage Properties page shown in the screenshot above.
Thus, one way around both of these problems is to use the Event Receiver as described above.

Sunday, June 3, 2012

Powershell Script for Item Level permissions

Requirement: We have a large list (tens of thousands of items). Each list item has a unique set of permissions, assigned dynamically through an SPD workflow activity when the item is added and the workflow kicked off. A new user arrives in the middle of the workflow approval process (maybe due to a reorganization or a supervisor change), and that user needs to be assigned to all items sharing a certain common property (anywhere from one item to potentially thousands of list items).


Solution: We need two PowerShell scripts: one that takes an ItemId (if only one item needs to be reassigned) and reassigns permissions, and a second that takes a view name/filter expressed as a CAML query (if more than one item with a common property needs to be reassigned) and reassigns permissions.


Script #1) PowerShell script for reassigning permissions on a large number of list items that share a common property which can be used to create a view or apply a filter:


You can take the CAML query behind the view (or build your own filter) and pass it in as the Caml parameter.


The script takes in the following parameters:


- Url: Url of the Web that you want to run the script on.
- ListName: ListName of the List you want to run the script on
- Caml: Query to be passed into the script to apply the filter
- UserName: UserName of the user whose permissions need to be assigned to the List Item, for example: "domain\kevin"
- PermissionLevel: The permission level that needs to be assigned to the UserName on the List Item, for example: "Contribute"


#Load SharePoint Snap-In
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

function Add-SPPermissionToListItemUserConditional
{
    param ($Url, $ListName, $UserName, $Caml, $PermissionLevel)
    $web = Get-SPWeb -Identity $Url
    $list = $web.Lists.TryGetList($ListName)
    if ($list -ne $null)
    {
        $spQuery = New-Object Microsoft.SharePoint.SPQuery
        $spQuery.Query = $Caml
        $spQuery.RowLimit = 10000
        $listItems = $list.GetItems($spQuery)
        $listItems.Count
        foreach ($item in $listItems)
        {
            # Break inheritance (copying the existing assignments) if the item still inherits permissions
            if ($item.HasUniqueRoleAssignments -eq $False)
            {
                $item.BreakRoleInheritance($True)
            }
            $user = $web.AllUsers[$UserName]
            $roleDefinition = $web.RoleDefinitions[$PermissionLevel]
            $roleAssignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
            $roleAssignment.RoleDefinitionBindings.Add($roleDefinition)
            $item.RoleAssignments.Add($roleAssignment)
            $item.Update()
            Write-Host "Successfully added $PermissionLevel permission to $UserName in $ListName list." -ForegroundColor Green
        }
    }
    $web.Dispose()
}

# Usage: Add-SPPermissionToListItemUserConditional "Url" "ListName" "username" "<Where><Eq><FieldRef Name='Status' /><Value Type='Text'>Yes</Value></Eq></Where>" "PermissionLevel"




Script #2) PowerShell script for reassigning permissions on a single list item:
The script takes in the following parameters:


- Url: Url of the Web that you want to run the script on.
- ListName: ListName of the List you want to run the script on
- ItemId: ID of the Item that needs to have the permissions reassigned upon
- UserName: UserName of the user whose permissions need to be assigned to the List Item, for example: "domain\kevin"
- PermissionLevel: The permission level that needs to be assigned to the UserName on the List Item, for example: "Contribute"




#Load SharePoint Snap-In
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

function Add-SPPermissionToListItemUser
{
    param ($Url, $ListName, $ItemID, $UserName, $PermissionLevel)
    $web = Get-SPWeb -Identity $Url
    $list = $web.Lists.TryGetList($ListName)
    if ($list -ne $null)
    {
        $item = $list.Items.GetItemById($ItemID)
        # Break inheritance (copying the existing assignments) if the item still inherits permissions
        if ($item.HasUniqueRoleAssignments -eq $False)
        {
            $item.BreakRoleInheritance($True)
        }
        $user = $web.AllUsers[$UserName]
        $roleDefinition = $web.RoleDefinitions[$PermissionLevel]
        $roleAssignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
        $roleAssignment.RoleDefinitionBindings.Add($roleDefinition)
        $item.RoleAssignments.Add($roleAssignment)
        $item.Update()
        Write-Host "Successfully added $PermissionLevel permission to $UserName in $ListName list." -ForegroundColor Yellow
    }
    $web.Dispose()
}

# Usage: Add-SPPermissionToListItemUser "Url" "ListName" "ID" "username" "permission level"

Sunday, May 27, 2012

Dundas Chart/Gauge for SharePoint: common scenarios

This post collects a few common problems/scenarios with Dundas Chart/Gauge for SharePoint whose solutions are not easily found. You need to dig around a bit to get these answers, and the Knowledge Base on the Dundas site is not very helpful.
Scenario#1) Hyperlink: How would you make each bar a hyperlink, generated dynamically (as shown in the screenshot below)?


Tooltip and Hyperlink on each Bar



The code snippet below shows how. Simply paste the code in using the Code Editor and replace the column names with the ones from your data source. The hyperlinks will be generated when the chart loads.


foreach (Series series in chartObj.Series)
{
    foreach (DataPoint dp in series.Points)
    {
        if (dp.YValues[0] == 0)
        {
            dp.ShowLabelAsValue = false;
        }

        object o = dp["URL"];   // custom attribute coming from the data source
        if (o != null)
        {
            string value1 = o.ToString();
            dp.Href = value1;
            dp.MapAreaAttributes = "Target=_Blank";
        }
    }
}

Scenario #2) Tooltip: How would you display a tooltip on each bar of your chart series? It is pretty straightforward. Connect your chart to a data source and, as you go through the wizard, make sure you select a column coming from your data source in the Tooltip field, as shown in the screenshots below:



Column Selected from the DataSource on the Tooltip Field

Scenario #3) Read Querystring Value from Url: If you need the Dundas Chart to read a querystring value from the URL, here are the steps:


1) Drop a Query String (URL) Filter Web Part on your page, then complete its configuration as you would for any other web part, as shown in the screenshot below.


Querystring Filter Web Part
Querystring Filter Web Part - Edit Mode

2) Once the configuration is complete, connect it to the Dundas Chart web part that you would like to send the filter values to, just as you would for any other web part that gets its value from a querystring filter web part (sample screenshot below).


Read Value from Querystring Filter Web Part

3) Once the value has been read, you can send it to the data source using the Connect to Data wizard in the Dundas Chart. In the following example, I call a stored procedure on the back-end database, passing Country as a parameter that the Dundas Chart web part reads from the Querystring Filter web part.



Send Querystring Value to stored proc

Scenario #4) Hide and Show Dundas Charts based on Querystring value:
What if you want to hide Dundas charts based on a query string value? For example, you may want to prevent a chart from rendering empty when someone lands on the page without the query string value that is necessary to load the chart's data.

The following code snippet shows how to accomplish this. Make sure it is pasted into the Code Editor's PostInitialize method.

if (chartObj.Page.Request.QueryString["CountryCode"] != null)
{
    string s = chartObj.Page.Request.QueryString["CountryCode"];
    int i = int.Parse(s);
    if (i != 304 && i != 308)
    {
        ChartWebPart wp = (ChartWebPart)chartObj.Parent;
        wp.ChromeState = System.Web.UI.WebControls.WebParts.PartChromeState.Minimized;
        wp.ChromeType = System.Web.UI.WebControls.WebParts.PartChromeType.None;
        chartObj.Visible = false;
    }
    else
    {
        chartObj.Visible = true;
        ChartWebPart wp = (ChartWebPart)chartObj.Parent;
        wp.ChromeState = System.Web.UI.WebControls.WebParts.PartChromeState.Normal;
        wp.ChromeType = System.Web.UI.WebControls.WebParts.PartChromeType.None;
    }
}
else
{
    ChartWebPart wp = (ChartWebPart)chartObj.Parent;
    wp.ChromeState = System.Web.UI.WebControls.WebParts.PartChromeState.Minimized;
    wp.ChromeType = System.Web.UI.WebControls.WebParts.PartChromeType.None;
    chartObj.Visible = false;
}

Sunday, May 13, 2012

Custom SharePoint Application with Mobile, Offline Requirements

Technical Requirements: We needed to design a web app with the following technical requirements:

1) A web app that can be used to perform CRUD operations on data, which could consist of several tables with parent-child relationships plus some independent master tables that drive the data types used in the parent-child tables; alternatively, the same hierarchy could be replicated in SharePoint lists. That was one decision to be made as we thought through the design (Decision #1).


2) The app must support offline editing of data on iPads, because much of the data entry happens in places with no Internet connectivity, and the iPad camera needs to be leveraged to take pictures and upload them as attachments to data records. This required us to decide whether to use a third-party app for offline support or build our own mobile app (Decision #2).


3) Users must be able to generate Reports and Dashboards. We needed to determine the Reporting platform for this product (Decision #3).


4) Users must be able to set Alerts on the data.
5) The solution needs to be delivered quickly (4-8 weeks).

6) The data volume is not expected to be very large (probably between a few thousand and ten thousand records).

Solutions Considered

1) ASP.NET Web App with a Relational Database: One option was to design a relational database and an ASP.NET web app to perform CRUD operations on it. This would have meant reinventing the wheel for requirements such as alerting and reporting, and standard features such as authentication and authorization that any ASP.NET app needs would, of course, still have to be built. Given that this is a SharePoint shop, going this route looked like a lot of custom development and might not have fit the short time we had for this project. Also, had we gone this route, we would still have needed to develop either a native iPad app or a hybrid app for this web app.

 
2) Third-Party Web Parts on SharePoint: We did evaluate third-party web parts such as Quest Web Parts to build this app quickly. With this approach we could still take advantage of all the OOTB features SharePoint has to offer, as well as native iPad apps such as SharePlus, available on the App Store, for offline editing of SharePoint data. It would, however, have required us to ramp up on customizing the Quest web parts, not to mention extra server licensing costs.
Since SharePoint 2010 makes it easy to develop Visual Web Parts, and the users were already familiar with SharePoint, we decided to build our own Visual Web Part for this solution and adopt Solution 3) explained below.

3) SharePoint Visual Web Part and Lists: The Visual Web Part would be the interface, and instead of a relational database we decided to use SharePoint lists as the back end. The lists would have a relational design, with lookup columns for joins.


The SharePoint lists would then give us all the OOTB features SharePoint already provides such as Alerts, Export to Excel, Metadata Search etc.

One other advantage of using SharePoint lists for this project is that we could solve the offline editing problem by using apps already available on the App Store, such as SharePlus, which do a good job of supporting offline editing of Lists. It would have been harder to solve this problem if we had built a custom relational database.

We decided to leverage products including Dundas charts for SharePoint, Excel Services and Performance Point for the Reports and Dashboards.


Disadvantages: This solution does have some disadvantages. First of all, even though SharePoint 2010 does give you some relational-database power, such as cascading/restricted deletes, SPQuery.Joins, or LINQ to SharePoint to perform joins, it still cannot be treated as a full-fledged relational database. We will need to live with the consequences of this decision throughout the life of the product; as we enhance it, set up reporting and so on, we will always face challenges joining and querying different lists (the SPQuery.Joins sketch below gives a flavor of what that looks like).
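
For illustration, this is roughly what a join between a child list and its parent via a lookup column looks like with SPQuery.Joins. The list, column and URL names here are made up for the example, not our actual data model:

using System;
using Microsoft.SharePoint;

class JoinSample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/app"))   // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList orders = web.Lists["Orders"];   // child list with a lookup column named "Customer"
            SPQuery query = new SPQuery();

            // Join the child list to the parent list through the lookup column.
            query.Joins =
                "<Join Type='INNER' ListAlias='Customers'>" +
                "  <Eq>" +
                "    <FieldRef Name='Customer' RefType='ID' />" +
                "    <FieldRef List='Customers' Name='ID' />" +
                "  </Eq>" +
                "</Join>";

            // Parent-list fields must be projected before they can be selected.
            query.ProjectedFields =
                "<Field Name='CustomerCity' Type='Lookup' List='Customers' ShowField='City' />";
            query.ViewFields = "<FieldRef Name='Title' /><FieldRef Name='CustomerCity' />";

            foreach (SPListItem item in orders.GetItems(query))
            {
                // Projected fields come back as lookup values and still need unpacking.
                SPFieldLookupValue city = new SPFieldLookupValue(item["CustomerCity"] as string);
                Console.WriteLine("{0} - {1}", item.Title, city.LookupValue);
            }
        }
    }
}
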
Another problem is that, since we have two interfaces (the Visual Web Part and the native iPad app for offline use), the user experience changes dramatically as a user switches from the desktop browser to the mobile app. This is a disadvantage we will have to live with until we have time to build a better mobile solution or find one off the shelf. Note that we have recently also heard of an HTML5 solution called Mobile Entree, which seems better as a solution, since SharePlus is a native iPad app and we would have to train our users on our SharePoint back-end list data model; with an HTML5 solution we would retain the web app's user experience to some extent. We had not evaluated it at the time of writing this post.

Thursday, April 19, 2012

Yahoo Stock Chart Ajax Web Part

Requirements: We needed a stock web part that refreshes in real time from Yahoo Finance. Given a stock symbol, the web part needs to fulfill the following requirements:


Req. 1) display the stock chart image for 1d, 5d, 3m, 1y, 2y, 5y, as shown in the screenshot below:










Req. 2) display stock quote information such as the following: Trade, Change, Open, PreviousClose, MarketCap etc., as shown in the figure below:











Req. 3) The image in Req. 1) and the data in Req. 2) need to be updated asynchronously while the browser is open.

Solutions: Several third-party web parts were investigated, but none fulfilled all of the above requirements, so we ended up developing a custom solution.


Solution Approach #1: Our first attempt was to build the web part so that it loaded the images for Req. 1) and the data points for Req. 2) directly in real time. For example, we tried using SharePoint Designer to load the data and images with the Data View Web Part and the Content Editor Web Part, an approach described very well in this post. However, we hit HTTP 504 Gateway Timeout errors quite a few times a day, causing issues loading the stock image and quote information. We therefore decided to cache the image locally in a location accessible to SharePoint (we chose the 14 hive) and the stock data points in a local database accessible to the SharePoint page, as explained in Approach #2.

Approach #2: We created a separate Windows service that runs periodically, downloads the chart image to a 14 hive folder and the data to a simple table in a database (a sketch of the download step follows). The web part then loads the image from the 14 hive folder (Req. 1) and the data from the database (Req. 2) in real time. This prevents timeouts on the web part, ensures it reads its data locally and also improves performance.
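
A minimal sketch of what the download step in the Windows service does on each run. The Yahoo chart/quote URLs, the LAYOUTS subfolder and the CSV field codes are assumptions based on the public endpoints available at the time, not the production code:

using System;
using System.IO;
using System.Net;

class StockDownloader
{
    // Assumed historical Yahoo Finance endpoints; adjust to whatever feed you actually use.
    const string ChartUrl = "http://chart.finance.yahoo.com/z?s={0}&t={1}&z=m";
    const string QuoteUrl = "http://download.finance.yahoo.com/d/quotes.csv?s={0}&f=l1c1opj1";

    static void Main()
    {
        string symbol = "MSFT";
        // Assumed cache folder under the 14 hive LAYOUTS directory.
        string hivePath = @"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\StockCharts";
        Directory.CreateDirectory(hivePath);

        using (WebClient client = new WebClient())
        {
            // Req. 1: cache one chart image per time range so the web part can serve them from _layouts.
            foreach (string range in new[] { "1d", "5d", "3m", "1y", "2y", "5y" })
            {
                client.DownloadFile(string.Format(ChartUrl, symbol, range),
                                    Path.Combine(hivePath, symbol + "_" + range + ".png"));
            }

            // Req. 2: pull the quote CSV (last trade, change, open, previous close, market cap)
            // and hand it off to whatever writes the local database table.
            string csv = client.DownloadString(string.Format(QuoteUrl, symbol));
            Console.WriteLine(csv);   // replace with the INSERT/UPDATE into the quotes table
        }
    }
}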


The design diagram for this solution is as follows:




Requirements 1) and 2) were fulfilled with the above design, and to fulfill Requirement 3) we used the ASP.NET AJAX UpdatePanel, which refreshes the web part asynchronously (a rough sketch follows). I will be sharing the web part and Windows service code on this blog soon.
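
A rough sketch of the web part side, assuming the image and quote data are read back from the locations above; the class name, control names and refresh interval are illustrative only:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class StockChartWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
    private Image _chartImage;
    private Label _quoteLabel;

    protected override void CreateChildControls()
    {
        // SharePoint 2010 master pages already include a ScriptManager, so only the panel is needed.
        UpdatePanel panel = new UpdatePanel { ID = "stockPanel", UpdateMode = UpdatePanelUpdateMode.Conditional };

        _chartImage = new Image { ID = "chartImage", ImageUrl = "/_layouts/StockCharts/MSFT_1d.png" }; // assumed cache path
        _quoteLabel = new Label { ID = "quoteLabel" };

        // A Timer inside the UpdatePanel causes a partial (async) postback of just this panel.
        Timer timer = new Timer { ID = "refreshTimer", Interval = 60000 };   // refresh every minute
        timer.Tick += (s, e) => BindQuote();

        panel.ContentTemplateContainer.Controls.Add(_chartImage);
        panel.ContentTemplateContainer.Controls.Add(_quoteLabel);
        panel.ContentTemplateContainer.Controls.Add(timer);

        Controls.Add(panel);
        BindQuote();
    }

    private void BindQuote()
    {
        // Placeholder: read the latest quote row from the local database and format it for display.
        _quoteLabel.Text = "Last refreshed " + DateTime.Now.ToString("T");
    }
}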

Tuesday, April 3, 2012

SharePoint Farm Backups and Recovery plan

This post describes a recent Backups and Recovery plan I executed. It aims to describe my thought process, decisions and challenges faced while designing this plan. It could also be applied to SharePoint Farms with topologies different from the one mentioned below.

SharePoint Farm Topology: 3 server farm consisting of a web, app and database server.
Restore Requirements:
Our Requirements are described in the table below:

Legend: Yes = Client Requirement, No = Not a Client Requirement

Req. No. | Restore Type    | a) In Place | b) Out of Place | c) Workflow History
1        | Item            | Yes         | Yes             | No
2        | List            | Yes         | Yes             | No
3        | Site Collection | Yes         | Yes             | Yes
4        | Platform        | Yes         | Yes             | Yes

FYI, In Place means restoring to the same farm and Out of Place means restoring to another farm (for example, in case of a disaster).

Other Requirements:
5) All of the backups listed above needed to be performed by a scheduled job
6) Incremental backups were needed each day during the week and full backups during the weekend
7) The solution needed to back up the database transaction logs daily
8) The solution should be complete and eliminate the need for separate SQL database backups

What is so special about Workflow History?: You might be wondering why I gave so much emphasis to Workflow History in the client requirements. The reason is that workflows are very often implemented for critical business processes, and the most important data in them is who said what, and when, i.e. the audit trail.
Now consider this scenario: you have a custom list with a workflow running on it. For some reason you lose a list item, or even an entire list, and need to restore its data along with its workflow history. Can you restore it individually from a backup? The answer: there is no product I know of that will let you restore a list item or an entire list together with its own workflow history. Why? Because Workflow History is a hidden list whose data references both a task list and the actual SharePoint list on which the workflow runs. So when we designed this backup solution, it was important to know that the only way to restore a SharePoint list with its workflow history is to restore the ENTIRE site collection; we could not go more granular than that.

Solutions Considered:
a) OOTB: Out of the box, SharePoint would only support 3 a), b), c), 4 a) and 4 c).
1 a), b) and c) would not be possible. 2 a) and b) would be possible, but you would have to find the right PowerShell command (or script the export/import object model, as sketched at the end of this post) and write a scheduled job around it, and such a job would be hard to manage on an ever-growing farm with new lists being added daily.

As you can see, SharePoint offers very basic Backup and Restore options OOTB and would not meet the requirements of many Enterprise SharePoint Farms. I decided to evaluate third party products.

b) VM Snapshots: Since the web, app and SQL servers all run virtual in this case, we could take VMware snapshots, but that would only work for 4 a) and for no other requirement. Another downside is that the restoration process is out of the SharePoint administrators' control, and we would always have had to depend on the infrastructure team for restores. If SQL were physical and only web and app were virtual (as in many large SharePoint installations), VM snapshots would still restore the web and app servers, but you would still depend on SQL backups as described in c) below. A combination of VM snapshots and SQL backups could satisfy requirements 3 a) and 3 c) along with 4 a), but we would still be left without requirements 1) and 2).


c) SQL Backups: SQL backups could be used to satisfy requirements 3 a), 3 b), 3 c) and 4 a), but we would still be left without requirements 1) and 2).



d) DocAve Backup and Recovery (AvePoint): After evaluating several third-party products, the one that gave us the most value was DocAve; it seems to be the most mature product in this space. It works for all of the requirements above, except those related to Workflow History. Even though you can restore a site collection containing workflow history for several workflows using PowerShell, DocAve does not support it, so we had to write a separate scheduled job for that purpose, as explained in my previous blog post located here. AvePoint's technical support has been pretty good so far, which is one more reason I would like to stay with DocAve.


Conclusion: Adopting strategy d) was the only way we could implement all of the requirements mentioned above. A combination of strategies a), b) and c) could be used to satisfy a subset of them.
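
For reference, here is a rough sketch of the scripted granular list export mentioned in option a), using the content deployment (export) object model rather than the Export-SPWeb cmdlet; the URLs and file locations are placeholders:

using System;
using Microsoft.SharePoint.Deployment;

class ExportListJob
{
    static void Main()
    {
        SPExportSettings settings = new SPExportSettings
        {
            SiteUrl = "http://server/sites/team",            // placeholder site collection URL
            ExportMethod = SPExportMethodType.ExportAll,
            FileLocation = @"D:\Backups\Lists",              // placeholder export folder
            BaseFileName = "ContractsList.cmp",
            FileCompression = true,
            IncludeSecurity = SPIncludeSecurity.All
        };

        // Export a single list; the matching SPImport call handles an in-place or out-of-place restore.
        settings.ExportObjects.Add(new SPExportObject
        {
            Type = SPDeploymentObjectType.List,
            Url = "http://server/sites/team/Lists/Contracts" // placeholder list URL
        });

        new SPExport(settings).Run();
        Console.WriteLine("List export complete.");
    }
}

Note that, as discussed above, an export like this still will not carry the list's workflow history with it.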






Monday, April 2, 2012

Powershell SiteCollection Backup script,Move files, Delete Files and Send Emails

We needed a PowerShell script for a Site Collection backup job that does the following:

- Back up the Site Collection to a local drive

- Move the locally copied .bak file to a network file share that is backed up to tape

- Delete files copied more than a certain number of days ago, so that only a fixed number of backup files are retained

- Send an email about the success/failure of the job

The following PowerShell script does all of the above:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Start-SPAssignment -Global            # Takes care of disposing SharePoint objects to prevent memory leaks

$srcSite = ""                         # Replace with your site collection URL
$localBkUpLocation = ""               # Replace with your local backup location
$destBkupLocation = ""                # Replace with your destination backup location (network file share)

$today = Get-Date -Format "MM-dd-yyyy HH.mm.ss"   # Format the date to avoid invalid characters such as "/" and ":"
$filename = "" + $today                           # Prefix the file name as needed

Write-Host "Start backing up $srcSite to $localBkUpLocation"
try
{
    # Back up the site collection
    Backup-SPSite -Identity $srcSite -Path "$localBkUpLocation\$filename.bak" -Force -ErrorAction Stop

    # Move all backup files from the local location to the destination file share
    Write-Host "Moving backups from $localBkUpLocation to $destBkupLocation"
    Move-Item "$localBkUpLocation\*.bak" $destBkupLocation

    # Delete backups on the file share older than 10 days
    $Now = Get-Date
    $Days = 10
    $LastWrite = $Now.AddDays(-$Days)
    $Files = Get-ChildItem $destBkupLocation -Include *.bak -Recurse | Where-Object { $_.LastWriteTime -le $LastWrite }

    if (!$Files)
    {
        Write-Host "No backup files older than $Days days were found."
    }
    else
    {
        foreach ($File in $Files)
        {
            Write-Host "Deleting file $File" -ForegroundColor Red
            Remove-Item $File | Out-Null
        }
    }

    Write-Host "Backup succeeded."

    # Notification email - edit the addresses and SMTP server for your environment.
    # Make sure relay is enabled for the SharePoint server on your SMTP server.
    $emailFrom = ""
    $emailTo = ""
    $subject = "The Site Collection backup file was created with name: $filename"
    $body = "The Site Collection backup file was created with name: $filename"
    $smtpServer = ""
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
}
catch         # If the process failed
{
    $ErrorMessage = $_.Exception.Message
    # Configure the parameters below as above.
    $emailFrom = ""
    $emailTo = ""
    $subject = "The Site Collection backup job failed on $today"
    $body = "The Site Collection backup job failed on $today and the reason for failure was: $ErrorMessage"
    $smtpServer = ""
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
}

Stop-SPAssignment -Global

Remove-PSSnapin Microsoft.SharePoint.PowerShell

Write-Host "Finished script."