Sunday, 21 August 2016

Migrate SharePoint 2010 Term Store to SharePoint Online with PowerShell

Last week I worked with a customer on migrating a SharePoint 2010 site to SharePoint Online.
I would describe the site as a knowledge base designed for optimal discoverability of the documents uploaded to it. To achieve good discoverability you need good metadata describing the resources, and in many cases that metadata is managed metadata that has to be migrated or recreated in SharePoint Online.
If you have 10 or 20 terms it is not an issue to recreate them manually, but if you have 400, for example, doing so is not very practical.
There are many PowerShell scripts out there to export/import terms, but their success rate and complexity vary. This is why I would like to share how I did it; it worked out pretty well for me.
For this purpose we are going to use the custom cmdlets provided for free by Gary Lapointe.
For demonstration purposes I will export one term set group with one term set that has a limited number of terms, including some parent/child terms. You can check it out below.

SharePoint 2010 Term Store

In order to export the term set group you will need to deploy the WSP that adds the custom SharePoint Server 2010 commands. Deploying it adds the additional 2010 cmdlets directly to the Microsoft.SharePoint.PowerShell snap-in.
To export a taxonomy object as XML we are going to use Export-SPTerms. You will need to supply a taxonomy object as the input parameter; this will be a taxonomy session if you want to export everything (see the cmdlet help for more examples). You can see how the Legal term set group looks as XML below.
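First, a minimal sketch of the deploy and export steps themselves. The WSP file name, site URL, term store name and the Export-SPTerms parameter names are assumptions, so check the package you downloaded and the cmdlet help for the exact syntax.

# Deploy the WSP that adds the 2010 cmdlets (file name and path are placeholders)
Add-SPSolution -LiteralPath "C:\Install\Lapointe.SharePoint2010.Automation.wsp"
Install-SPSolution -Identity "Lapointe.SharePoint2010.Automation.wsp" -GACDeployment

# Export the Legal term set group to XML (parameter names assumed, see Get-Help Export-SPTerms)
$session = Get-SPTaxonomySession -Site "http://sp2010/sites/kb"
$group = $session.TermStores["Managed Metadata Service"].Groups["Legal"]
Export-SPTerms -InputObject $group -OutputFile "C:\Legal.xml"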

Input XML

As you can see, all the essential information is exported, including some attributes that will be an issue if you are importing the terms into a different environment or into SharePoint Online. An example is Owner, or any other attribute that represents an on-premises identity. The import command will try to set these properties with the same values and it will fail, because the identity as it was exported cannot be found. The workaround is simply to set a different value for Owner that is a valid SharePoint Online identity. It is up to you to decide whether this tradeoff is acceptable and to migrate the objects with a different Owner than the source. Below are two lines (split into three to fit better) that take the content of the exported XML and set a new Owner for each XML node where the Owner attribute is not empty; the same XML object can later be used as input for the import command.

# Load the exported taxonomy XML
[xml]$termXML = Get-Content "C:\Legal.xml"
# Replace the on-premises Owner with a valid SharePoint Online identity on every node that has one
($termXML.SelectNodes("//*")) | Where {$_.Owner -ne $null} | `
ForEach-Object {$_.SetAttribute("Owner", "i:0#.f|membership|admin@MOD******.onmicrosoft.com")}

To import the taxonomy objects into SharePoint Online you will need to download and install the SharePoint Online Custom Cmdlets.
This installs a new module called Lapointe.SharePointOnline.PowerShell.
The command we are going to use for the import is Import-SPOTaxonomy. For the InputFile parameter we are going to use the variable from the lines above, after all identity attributes have been set. If you are importing an object that is not a top-level term store you should specify ParentTermStore (you can get it with Get-SPOTermStore); if it is, you should switch on the "Tenant" parameter instead. Before all that, you should connect to a site in your target tenant using Connect-SPOSite. Below are the lines that import the Legal term set group.

Connect-SPOSite -Url "https://mod******.sharepoint.com"
Import-SPOTaxonomy -InputFile $termXML -ParentTermStore (Get-SPOTermStore)

And that is it. Our Legal term set group is recreated and available across the tenant. One nice thing is that the term GUIDs are preserved as well.

SharePoint Online Term Store

I hope that this was helpful, and big thanks to Gary Lapointe for writing these great tools! The same approach should work for SharePoint 2013, but I have not tested it.

Monday, 1 August 2016

Display related item repeating section in Nintex Workflow task form

Last week I worked with a customer that had repeating sections in a Nintex Forms 2013 item form and a Nintex Workflow 2013 workflow associated with the list. The customer wanted to properly display the repeating section data in the workflow task forms. This requirement does not seem straightforward to accomplish, but in this post I am going to demonstrate that it is actually very simple, and since I have not found this covered in other sources I am sharing my solution and some other useful links.
The issue with the repeating section is that it lives as a "section" only in the form. You can connect the entire repeating section to a field of type "Multiple lines of text" and you will see that the repeating section value is actually saved as XML.

Nintex Repeating Section

The item actually looks like this:

Nintex Repeating Section  Form

The first nice thing, not directly related to the title of this post, is to make the XML data look better in a List View. To accomplish this I am going to use the CSR (Client-Side Rendering) approach demonstrated in the post "Displaying Repeating Section as table in List View - the CSR approach". Adapting and applying the script to my list view gives the result below, which is way better than raw XML.

Repeating Section CSR


The way to make sense of the repeating section in a workflow is by querying the XML stored in the field. I will not go in depth since there are many resources on the subject; one thing that can help you with this task is the article "Nintex Forms/Workflow - Parsing Repeating Section Data" by Vadim Tabakman.
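To get a feel for what that XML looks like and how it can be inspected outside a workflow, here is a small PowerShell sketch. The outer structure (RepeaterData/Items/Item) reflects what Nintex typically saves, but the child element and field names here are illustrative assumptions; check the XML stored in your own list before building XPath queries.

# Illustrative only - the child element names (Expense, Amount) are assumptions
$raw = "<RepeaterData><Items><Item><Expense>Hotel</Expense><Amount>120</Amount></Item><Item><Expense>Taxi</Expense><Amount>35</Amount></Item></Items></RepeaterData>"
[xml]$rep = $raw
$rep.SelectNodes("//Item") | ForEach-Object {
    # Each child element corresponds to a control in the repeating section
    [PSCustomObject]@{ Expense = $_.Expense; Amount = $_.Amount }
}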
Now to the reason for writing this post. If you have tasks in your workflow, it is not unusual for you or your customers/users to want to see the related item properties right in the task form instead of clicking on links. If you leave the form as it is, the best you can get is to view the repeating section as XML. You can edit the task forms with Nintex Forms, and for most of the task templates you will get a good starting point with all item property controls already created. However, check out how these controls look in three common tasks. From left to right: Flexi Task, Request review and Request data.

Nintex Task Forms

As you can see, the data from the repeating section is displayed as XML, even in the "Request data" template where I have used a "List item" control to display the related item.
This was also the case with the customer; they had many "Request data" tasks and all of them were using a "List item" control to display the related item.
The solution is very simple: create a new repeating section in the task form and recreate all child controls by replicating the data type and the Name of the original controls. Then connect the repeating section control to the related item field that contains the XML. Check out how a Flexi task looks if you recreate the repeating section as described.

Flexi Task Form


The data from the repeating section in the related item is now represented as a repeating section in the task form as well. Just make sure that the names of the controls are the same as in the original item, make the repeating section read-only in the task form, and you will be completely fine.
I tested the same approach in SharePoint 2016 and Office 365. However, something interesting is happening with the XML, as you can see in the screenshot below (the field is called Rep).

Task Form Office 365

The important thing is that the repeating section is visualized as expected. If I find out what is happening with the XML I might blog about it.
I hope that this was helpful!

Monday, 27 June 2016

Useful file handling commands in the SharePoint PnP PowerShell module

Last week I got a request from one of our customers to help them move some of the files from one SharePoint library to another in a different web. It sounds like an easy and quick task, but the catch was that the library had ~ 12000 documents and 1500 folders, and the customer also wanted to keep the Created, Created By, Modified and Modified By column values. The number of items that had to be moved was ~ 3500. Needless to say, with such numbers the Explorer view does not work, the new OneDrive client does not support sync from SharePoint libraries yet, and I still had to figure out how to effectively copy the metadata. The way to accomplish this is with PowerShell or with a 3rd party migration tool. Since the customer had only these requirements, and not, for example, migration of the version history, my weapon of choice was PowerShell.
In this post I want to share a couple of SharePoint PnP PowerShell cmdlets that greatly helped me in writing my migration script. Bluesource is also a contributor to the PnP project, thanks to my colleague Pieter Veenstra.
The first thing I want to share is a new feature that came with the June 2016 release: the option to map a SharePoint site as a PSDrive. This is done at the beginning, using Connect-SPOnline with the CreateDrive parameter. You will then get a PSDrive called SPO and a PSProvider also called SPO. See how it looks below.
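Something along these lines (the tenant URL is a placeholder; add -Credentials or another authentication option as your tenant requires):

# Map the site as a PSDrive (cmdlet names are from the 2016-era PnP module)
Connect-SPOnline -Url "https://yourtenant.sharepoint.com/sites/teamsite" -CreateDrive
# Browse the site like a file system - sub webs show up as folders
cd SPO:
Get-ChildItem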

As mentioned above, when you connect you will be looking at the root web, and the sub webs will be shown as folders. You can do many standard things in this PSDrive, like listing items, copying, moving and more. One thing I was unable to do is list the items in lists with 5000+ items; it seems that the list view threshold limitation kicks in. Another limitation is copying items from the SPO drive to the local file system, because you cannot copy items from one drive to another if the PSProviders are different. It would also have been nice if the SPO drive were persistent and accessible in Windows Explorer; this is not available and I am not sure if it is possible.
I did not use this in my script, but I think that it is nice to have, and you can learn more about this and other improvements in the June 2016 Community Call.

Get-SPOFile - This is one very useful command that will help you download a file by supplying the server relative URL to it. This was an easy task after I retrieved all items and their FileLeafRef and FileDirRef fields using the technique from my previous post.

Ensure-SPOFolder - With this command you can get a folder by giving a web relative URL to the folder. If the folder does not exist it will be created, and if it is nested in other non-existing folders, they will be created as well.

Add-SPOFile - With this command you can upload a file by supplying the web relative URL of the folder; the file will be uploaded with the same name. If the folder does not exist this command will create it before uploading the file. Really nice command!
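A rough sketch of how the three commands could fit together in a copy scenario. The parameter names are assumptions based on the PnP module of that period, and the URLs/paths are placeholders, so check Get-Help for each cmdlet before relying on them.

# Connect to the source and destination sites with Connect-SPOnline as appropriate
# Download a file by its server relative URL to a local folder
Get-SPOFile -ServerRelativeUrl "/sites/source/Shared Documents/Folder1/Report.docx" -Path "C:\Temp" -AsFile
# Make sure the destination folder exists (nested folders are created as needed)
Ensure-SPOFolder -SiteRelativePath "Shared Documents/Folder1"
# Upload the local copy to the destination folder, keeping the same file name
Add-SPOFile -Path "C:\Temp\Report.docx" -Folder "Shared Documents/Folder1"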

I hope that this was helpful!

Monday, 20 June 2016

Get All Items in 5000+ large list with CSOM in PowerShell

Last week I had to write a script that needed to get all items in a large SharePoint Online list.
By large I mean above 5000 items. This means that the list is above the list view threshold in SharePoint Online, which is 5000, and we cannot change that. The way to get all items is to use a CAML query. However, if it is just an empty query without any filtering it will fail; if you use an unindexed column for filtering or ordering, the query will fail; and if you filter/order by an indexed column but the query returns more than 5000 items, it will fail again. The error in these scenarios is similar to the one below.

Exception calling "ExecuteQuery" with "0" argument(s): "The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

The way to work around this is pagination of the view. This means we set a row limit for the result that the query can return, which should be less than or equal to 5000. Once we get the first 5000 items we can issue another query for the next 5000, starting from the position where the first result (page) ends. This is the same as what happens in the UI when scrolling forward through a list view. Below is an example PowerShell snippet that will get all items from a list, using a 5000-item page size and ordering the items by ID.
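The snippet assumes a CSOM ClientContext already exists in $ctx and the library name is in $DocLibName. A minimal setup could look like the following; the assembly paths, URL and library name are placeholders that will vary per environment.

# Load the CSOM assemblies (paths/versions depend on the installed client SDK)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$siteUrl    = "https://yourtenant.sharepoint.com/sites/teamsite"   # placeholder
$DocLibName = "Documents"                                          # placeholder
$cred = Get-Credential
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($cred.UserName, $cred.Password)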

$list = $ctx.Web.Lists.GetByTitle($DocLibName)
$ctx.Load($list)
$ctx.ExecuteQuery()
## View XML
$qCommand = @"
<View Scope="RecursiveAll">
    <Query>
        <OrderBy><FieldRef Name='ID' Ascending='TRUE'/></OrderBy>
    </Query>
    <RowLimit Paged="TRUE">5000</RowLimit>
</View>
"@
## Page Position
$position = $null
 
## All Items
$allItems = @()
Do{
    $camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
    $camlQuery.ListItemCollectionPosition = $position
    $camlQuery.ViewXml = $qCommand
 ## Executing the query
    $currentCollection = $list.GetItems($camlQuery)
    $ctx.Load($currentCollection)
    $ctx.ExecuteQuery()
 
 ## Getting the position where the current page ended (start of the next page)
    $position = $currentCollection.ListItemCollectionPosition
 
 # Adding current collection to the allItems collection
    $allItems += $currentCollection
}
# the position of the last page will be Null
Until($position -eq $null) 

A few words about the query: I am using RecursiveAll because I ran it against a library and wanted to get all items in all folders; the page size is 5000, right at the edge of the threshold; and I am ordering the result by ID because this column is always indexed.
I am using a Do-Until loop to get all pages, setting the position to the position of the last item collection that was retrieved.
This is a really powerful and quick way to work around the annoying 5000 list view threshold. I hope you find it useful!

Sunday, 12 June 2016

Disable SharePoint Event Firing in PowerShell process

Last week I worked with a customer that is using SharePoint 2010 as part of their enterprise DMS solution. Only a small part of the users access the SharePoint sites directly; most access and add documents through a 3rd party Outlook integration product and in-house legacy LOB system integrations with SharePoint. If you have dealt with such DMS solutions (which is not uncommon in some industries) you will know that over time you might end up with complex folder structures created based on the document metadata. You might also end up with a large number of empty folders.
The empty folders were an issue for my customer, and they reached out to me with a request to write a PowerShell cleanup script that will run on a schedule and delete the empty folders.
The catch is that there is an ItemDeleting event receiver that prevents the deletion of any item, including folders. You can see a similar event receiver in action on my dev machine; it also stops the operation when I call Recycle() on an item in PowerShell.


If you are a developer, you most probably know how to disable event firing inside an event receiver in order to prevent other event receivers from firing. This is done by setting the SPItemEventReceiver.EventFiringEnabled property. We can do the same thing in PowerShell with the code below and prevent any events from being fired.

$assembly = [Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint");
$type = $assembly.GetType("Microsoft.SharePoint.SPEventManager");
$prop = $type.GetProperty([string]"EventFiringDisabled",`
[System.Reflection.BindingFlags] `
([System.Reflection.BindingFlags]::NonPublic -bor [System.Reflection.BindingFlags]::Static)); 
#SET EVENT FIRING DISABLED.
$prop.SetValue($null, $true, $null); 
 
<#
 DO WHAT YOU NEED TO DO
#>
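# EXAMPLE ONLY (an assumption, not part of the original script): one possible cleanup
# while events are disabled - recycle folders that contain no files or subfolders.
# The web URL and library name are placeholders; test on a copy first.
$web = Get-SPWeb "http://sharepoint/sites/dms"
$library = $web.Lists["Documents"]
$emptyFolders = @($library.Folders | Where-Object { $_.Folder.Files.Count -eq 0 -and $_.Folder.SubFolders.Count -eq 0 })
foreach ($folder in $emptyFolders) {
    $folder.Recycle() | Out-Null   # Recycle instead of Delete so the folders can be restored
}
$web.Dispose()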
 
#SET EVENT FIRING ENABLED.
$prop.SetValue($null, $false, $null); 

This code disables event firing in the current PowerShell thread, so I am able to delete/recycle any item without the event handler executing. I have tested this with SharePoint 2010 and 2013; I have not tested it on SharePoint 2016, but I assume it will work there too.
This is a very powerful technique, so use it carefully and at your own risk. I hope that this was helpful!

Sunday, 15 May 2016

Pause a Nintex workflow for less than 5 minutes in SharePoint Server and Office 365

Last week I spent almost 2 days fixing a complex Nintex/SharePoint 2013 issue with one of our customers. The customer was not very big in terms of headcount, but they were using Nintex workflow to automate all sorts of processes (one of the biggest deployments I have worked with).
There were Workflow timer jobs getting stuck or failing, workflows exiting with errors for no obvious reason, and more. The issue was resolved with a couple of fixes, and at least the workflows were executed when they should be and ended as they were designed to.
Some of the issues and improvement points I flagged were: a Nintex deployment that was not properly scaled, incorrect service topology, outdated product versions and poor workflow design.
Now, the fourth point (poor workflow design) was partially dictated by the inadequate scale of the deployment. They were using a lot of Pause and Commit pending changes actions. Many of the workflows were designed to have a two minute pause after the first couple of actions.
As you may know, the Pause action pauses the workflow instance for the defined period of time, but the workflow does not resume immediately; it is resumed when the "Workflow" timer job runs. The default schedule of this job is every 5 minutes. This means that you cannot pause a workflow for less than 5 minutes, or pause it for exactly the time you have set. You can change the schedule of the Workflow timer job to work around the first limitation, but this can put additional load on your system.
This is why I demonstrated an alternative to the Pause action that does not pause the workflow instance, but just waits a certain amount of time before continuing the execution. I have not seen this approach in other sources, which is why I decided to share and explain it in this post.
There is another alternative for pausing a workflow for less than 5 minutes; it is described in this article.
As you can see, that alternative requires the "NTX PowerShell Action". This is great, but the action is open source, it is deployed with a farm solution, and although developed and published by a Nintex employee, this add-on is not backed and supported by Nintex. The PowerShell action is fantastic, but in my opinion it is not worth deploying it just to use it as a Pause alternative. Also, you cannot use it in Office 365 (SharePoint Online).
The PowerShell example works by executing PowerShell code that just waits a certain amount of time before continuing the execution. Obviously, to pause a workflow we need to do some sort of waiting. There is no out-of-the-box action that just waits; as we know, the Pause action does not do anything, but actually idles the instance execution at a certain point and waits for the timer job to resume it after the time has elapsed. With the PowerShell example we use the PowerShell (.NET) framework to wait without doing anything for a certain time. The same thing can be achieved by executing a T-SQL statement, and in Nintex Workflow, both on-premises and in Office 365, we have an "Execute SQL" action.
If we want to put a wait in our SQL query for two minutes we can use the code below:

WAITFOR DELAY '00:02'

I am not a SQL guy and was surprised to find out that this statement works outside of the SQL Management tools.
Below is the designer look of my demo workflow in SharePoint 2013. This should also work for SharePoint/Nintex 2010.

WorkflowDesigner

As you can see, it is pretty simple, just a PoC. Below is the configuration of the "Execute SQL" action.
Execute SQL Action Configuration

I am using a connection string with "SQL" as the server; this is an alias to the SQL instance that hosts my SharePoint databases. I am using SSPI security with Windows credentials that are actually my farm account, saved as a global constant.
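For reference, a connection string along those lines could look like this (the server alias and database are environment-specific placeholders):

Data Source=SQL;Initial Catalog=master;Integrated Security=SSPI;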
Below are the details from the execution. You can see that the Execute SQL action took exactly 2 minutes to complete.

Workflow Execution Detail
Unfortunately, you cannot use this approach on-premises for pauses longer than 5 minutes without doing a loop and, inside this loop, executing multiple delays that are each shorter than 5 minutes. If you set a delay longer than 5 minutes the workflow will fail with the error "Error performing database operation. Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.", even if you set the connection timeout in the connection string to be longer than the 5 minute delay. I will do some more tests/research and might report this as a bug.

As described in the MSDN documentation of the WAITFOR statement, it should work against Azure SQL Database, and in Nintex Workflow for Office 365 we also have the Execute SQL action. I tested this and noticed two things: the connection timeout you specify in the connection string will be capped at 365 if the number is bigger than that, and if you set a delay longer than 4 minutes you will get some unexpected HTTP errors during the execution; the workflow manager will do a couple of retries and then the workflow will fail. I think that both are issues with the Workflow Manager in SharePoint Online.
This is not so important, because the Pause action in SharePoint 2013 workflows (Workflow Manager) does not depend on SharePoint timer jobs and you will not get the same pause issues as in the on-premises 2010 framework, but it is still an option. See an example configuration of the action below.

Execute SQL Office 365


My final word is that this can be extremely useful if you need to put short pauses (not more than 5 min.) in your on-premises workflows. I hope you find this useful!

Wednesday, 17 February 2016

Get a quick report of the SharePoint Databases with PowerShell [Tip]

Here comes another useful PowerShell one-liner I often use.
It will give you a quick overview of the SharePoint databases with properties like: Name, Server (alias), TypeName, Web application name, Web application URL, Site collection count and Size.
The size is actually the amount of disk space required for an uncompressed backup. It might look like a script, but it is actually one long and simple one-liner. You can see it below; I have used backtick (`) escape characters to fit it better in the blog, and you can see it on one line here. Instead of piping to Format-Table you can generate a CSV by piping to Export-Csv and later work with it in Excel.

Get-SPDatabase | Select-Object Name,@{Expression={$_.Server};Label="Server"},TypeName,@{Expression=`
{$_.WebApplication.Name};Label="WebApplication"},@{Expression={$_.WebApplication.Url};Label="WebApplicationUrl"}`
,@{Expression={($_.WebApplication.Sites | Measure).Count};Label="SC Count"},@{Label ="Size in MB";`
 Expression ={$_.disksizerequired/1024/1024}} |  Format-Table -AutoSize
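If you prefer the CSV route mentioned above, the same idea with a shortened column set could look like this (the path is a placeholder):

Get-SPDatabase | Select-Object Name,TypeName,@{Label="Size in MB";Expression={$_.DiskSizeRequired/1024/1024}} | Export-Csv -Path "C:\Temp\SPDatabases.csv" -NoTypeInformation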

Get SharePoint database report

Tuesday, 16 February 2016

Get Application Pool Identity credentials[Tip]

In this short tip I am going to post a PowerShell one-liner from my list of extremely useful one-liners. It can get the credentials of the IIS application pool identities.
I use it mainly in two scenarios:
  1. Imagine that you are doing remote work for a customer where you have only temporary access and credentials. Many times the account that is provided is a Farm Admin and has local admin permissions on the SharePoint boxes, but it does not have permission to use PowerShell against SharePoint (no Shell Admin). You can use this short one-liner to get the Farm account (STS is running under it); many times it is left in the local admin group, so you can log in with it and do what you need to do. Not a best practice, but it is a massive time saver.
  2. Imagine that you are working on an issue where you need to restart the User Profile Synchronization Service instance and you need the Farm account password. You can get it with this one-liner.

In order to use it you will need local admin permissions, and the legacy IIS 6 WMI compatibility components must be present (they usually are on SharePoint servers). I can confirm that it is working on IIS 7.5, 8.0 and 8.5.

Get-WmiObject -Namespace "root\MicrosoftIISV2" -Class "IIsApplicationPoolSetting" | Select WAMUserName, WAMUserPass

Get IIS Application pool credentials

Monday, 15 February 2016

Copy List Views in SharePoint and SharePoint Online with PowerShell

For the last couple of weeks I have been working with a customer that mainly uses SharePoint Server as a DMS (Document Management System). I had to move a large number of documents from one library to another due to corruption in some of the files, caused by excessive use of unique permissions (~ 32 000).
In this post I will not talk about why you should limit the usage of unique permissions, especially in big libraries; it is a long story.
As part of the work I had to copy many list views to the new library. I am not a fan of the "Save as Template" approach, so my solution was to use a PowerShell script and copy the views programmatically.
The script did its job and I thought that it would be nice to have something like this for SharePoint Online.
I was unable to find any ready-made script that can do this, so I wrote one.
Both scripts do basically the same thing: get the source and destination webs, get the source and destination lists, get the source and destination view collections, and create new views in the destination using properties from the source.
It was a bit tricky to load what I needed without making the script extremely slow with CSOM, since we cannot use the simple lambda expression syntax in PowerShell. My solution was to use a function written by the SharePoint automation superstar Gary Lapointe. You can check it out in his article for ITUnity, where he explains how to work around the limitation in PowerShell concerning lambda expressions. I highly recommend reading the entire series dedicated to using PowerShell against SharePoint Online.
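The core of the copy step, stripped of the loading logic, looks roughly like the sketch below. It assumes $srcView is a source view whose ViewFields collection is already loaded, and $destList / $destCtx are the destination list and its ClientContext; it is a simplified illustration, not the full script.

$vci = New-Object Microsoft.SharePoint.Client.ViewCreationInformation
$vci.Title      = $srcView.Title
$vci.Query      = $srcView.ViewQuery
$vci.RowLimit   = $srcView.RowLimit
$vci.Paged      = $srcView.Paged
$vci.ViewFields = @($srcView.ViewFields)   # the ViewFields collection must be loaded beforehand
$newView = $destList.Views.Add($vci)
$destCtx.Load($newView)
$destCtx.ExecuteQuery()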
You can download both scripts (on premises and online) below. Please test, rate and use the Q&A section in the Gallery!


 Download On Premises script from: TechNet Gallery

Download SharePoint Online script from: TechNet Gallery

Monday, 16 November 2015

Display and Fill related item fields in workflow task with Nintex for Office 365

A couple of weeks ago I posted an article called "Move away from the Workflow Tasks with Nintex for Office 365". The goal of that post was to show some key techniques you can use if you want to automate a process without using tasks and do all the work in the form, so users can see what they are approving.
However, if you want or need to stick with the tasks, there is a way to display and fill fields from the related item in the task form.
To do this you will need to edit the task template from the workflow designer. This feature was introduced with the July 2015 release of Nintex Forms for Office 365.
To illustrate how this works I am going to use the familiar simple Vacation Request form. See how it looks in the form designer below.

Vacation Request

You can see a field with the label "Manager Comment"; this field is visible only when the form is not in "New Mode" and it should contain information from the manager who approves the request.
In Nintex Workflow for Office 365 there is no Request Data action, but we can include the fields that require information on task completion in the task form.
If you open an Assign a Task (or Start a Task Process) action configuration page you will see "Edit Task Form" in the action ribbon. Clicking it takes you to the familiar Nintex Forms designer interface.


In the picture above I am editing the default Nintex task content type. Once you open the form designer, the initial form includes the task columns and the item columns. Initially all related item columns are disabled, but you can change that for any column you want and make it editable in the task; you can also connect a control from the task form to a related item column. With some rearrangement you can see how the Vacation Request approval task looks below.


There is something you should consider if you have many tasks with modified forms. This is something I hit with a customer and later reproduced; it is not yet confirmed by Nintex.
When you edit a task form, the task will be created with a Nintex content type (ending with a GUID), even if you have specified your own custom content type. This is not an issue in itself, because the Nintex content type will have the same columns as the original. However, you will have to edit all task action forms in the workflow regardless of what content type they are using. If, for example, two tasks have the same initial custom content type, they will be created under the same Nintex content type after you edit both forms.
This is still not a big deal. However, every time you edit a task form two hidden variables are created, and you can easily end up hitting the variable count limitation in the Workflow Manager, which is 50 in SharePoint Online. See below the error you get when you try to publish a workflow that has more than 50 variables.

Error publishing workflow. Workflow XAML failed validation due to the following errors: Activity 'DynamicActivity' has 52 arguments, which exceeds the maximum number of arguments per activity (50). HTTP headers received from the server - ActivityId: 9e1fc3bc-5a7c-4821-9605-d595acea851d. NodeId: . Scope: . Client ActivityId : 29ca419d-d068-2000-213e-aae9dfcc2677. The remote server returned an error: (400) Bad Request.



This is not an issue with Nintex; this is just the way things work in the background.
In my humble opinion Microsoft should rethink this limitation in SharePoint Online!

I hope that this was helpful!

Tuesday, 10 November 2015

Error when accessing the Nintex Live Management page on a fresh Nintex Workflow 2013 installation [Tip]

Today I hit a strange issue with a fresh installation of Nintex Workflow 2013 with Nintex Live.
The Nintex Live solution was deployed by the installer. However, when I tried to access the Nintex Live Management page in Central Administration I received the error below.

The resource object with key 'LiveAdmin_Page_Management_Title' was not found.

If you have done everything correctly so far, the solution is very simple.
Open the SharePoint Management Shell as Administrator and run the command Install-LiveService, test if the page is now available, and perform an IISRESET on your CA server(s).
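In other words, from an elevated SharePoint Management Shell (both commands are the ones named above):

Install-LiveService
iisreset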

Saturday, 24 October 2015

Move away from the Workflow Tasks with Nintex for Office 365

In this post I am going to demonstrate my solution for a requirement that I believe is not an exception.
It is a fact that users do not always like workflow tasks. In the last two weeks I received this requirement twice. The first was for an internal project I am working on in O365 with Nintex Forms and Nintex Workflow, and the second came from a customer with SharePoint Server 2010, Nintex Workflow and InfoPath forms that wanted to get rid of the tasks in their business travel approval process. The managers approving those business travel requests simply did not like receiving a mail with a link to a task where they select Approve/Reject and, if they want to see what they are actually approving, clicking a second link and then going back to the task to complete it. They wanted to do everything in the InfoPath form, where they can see all the numbers and can directly approve or reject. Of course, as the title makes clear, I will focus on the first scenario in O365 with Nintex Workflow and Forms.
Since the scenario is in SharePoint Online things work a bit differently; I will show you some key bits in Nintex for O365 that can help you (I hope) meet the requirement to automate a process without using workflow tasks.
For demo purposes I am going to use a basic leave request form. The user will create a new item and submit it, then the person set as Approver should approve the request just by clicking a button in the form. See how the initial form looks below.


One of the key things is that our approval workflow should know what stage the form is currently in.
The workflow can wait for an item field change; a Status drop-down field is an option, but let's say you have many stages of approval, like Team Lead approval, HR approval and so on. In more complex workflows this will be confusing for the users, and in general it is not a good idea to let the users choose the stage of the process.
This is why we are going to make different buttons that will be visible/active in the different stages for different users, and on click the buttons will set the value of a text Status field.
In our example, if the form is new the Submit button will save the form and set the status to "New"; if the status is New the workflow will send a mail to the approver inviting him to review the request and will also set a status, for example "Awaiting Approval". If the status is "Awaiting Approval" a different set of buttons, Approve/Reject, will be shown, and they will set the status accordingly when clicked.
The first thing to set up are the initial SUBMIT and CANCEL buttons; the Submit button will save the form and change the status to "New". These buttons will be visible only when the form is new and the Status field is empty.
The button controls in Nintex Forms for O365 have a nice feature: you can connect them to a field, so when they are clicked they will send a value to that field. In our case we are going to send the string "New" to the Status field. Below you can see how this button control setting looks; everything is located under Advanced.


The next thing to do for both controls is to set an expression for the setting that defines the visibility of the control: when the expression returns true the control will be visible, when false the control will be hidden. We are doing this for both buttons because if we leave the form with a single Cancel button across all stages, the button will be placed in only one spot, and the other functional Submit buttons will be placed below or above it (I guess this could be fixed with some CSS, but that is not the subject now). The expression is very simple: we use the inline function IsNullOrEmpty on the item property (field) Status. This setting is located under Appearance, Visible; in the drop-down choose Expression. If the field is empty the function returns true and the button is visible.


The story for Approve/Reject is basically the same: they will be Save and Submit button controls connected to the Status field that set the values Approved or Rejected; however, we need them to be visible when the status is "Awaiting Approval".
Also, the Approve/Reject buttons should be visible only to the user that has the authority to approve or reject the leave request. This is not a difficult task to achieve; in our case the person that should approve the request is given in the field Approver, which is of type Person/Group.
Here comes another great feature: we can reference the current user's Display Name, Email and Login ID in the form out of the box, and we can also get these values from a Person field in the current item.
So what we need to do is set our Visible expression to be true when the Status is equal to "Awaiting Approval" and the email (for example) of the current user is equal to the Approver's email.
The entire logic is done by using the "And" inline function, and for the values we use the Equals function for both conditions. This way the Approve/Reject buttons will be available only in the "Awaiting Approval" stage for the appropriate user.


You can see how both forms look in edit mode, side by side, below. On the left is what the Approver will see, with the options to Approve/Reject; on the right is a regular user who cannot even save the form: there is no Submit button in the form and no Save in the ribbon.



Here comes the part where we create the approval workflow for our leave requests.
As already mentioned, the workflow needs to set the Status field to "Awaiting Approval" so we can activate the Approve/Reject buttons; nothing special about that, we do it with the "Set Field in Current Item" action.
After that we should make the workflow wait for the Status field to change. We can use the "Wait for Field Change in Current Item" action. There is one blocking point if we use this action alone: it will wait for the field to change only to a single predefined value, but we have two possible values, Approved or Rejected.
This is why we are going to use the Parallel Block action. This action can execute many action branches simultaneously, so we can have one branch that waits for Rejected and one that waits for Approved. In Office 365 this action also has a feature that makes this logic possible: we can have two conditions for exiting the parallel block.
One of the conditions is always the end of one of the branches, and the second is a variable that should evaluate to true. This is so important because without it the parallel block would wait until all branches are finished, and the branch waiting for the Status to change to Rejected would never end if the item was Approved.
This is why we create a Boolean variable and assign the value "No" right before the Parallel Block (not sure if this is absolutely needed, but just to be on the safe side). Then we set this variable as the "Completion Condition" (in the action settings) for the block, and after every wait-for-field-change action we set the variable to true ("Yes"); this way we exit the block when an individual branch finishes. Before the exit from each branch you can do a State Machine state change or something else; it depends on how you have designed your workflow and how complex the process is. See how this step looks below.



If you are wondering what the third branch is for, it is for a reminder, using a Loop with Condition action. We pause the branch execution for a period of time, let's say 1 day; after this the workflow sends a reminder that there is a pending leave request that requires approval, and this repeats until we exit the condition loop or the parallel block. We need some condition for the loop, which is why we use the same variable that we use for completion of the entire block.

This became one long post. I hope that I have not bored you to death; maybe this topic was more suitable for a video tutorial. I hope that this was helpful, happy Nintexing!

Saturday, 17 October 2015

Capture SQL IO latencies for a period of time - The PowerShell Script

In this post I am going to share a PowerShell script that is not directly related to SharePoint, but can be a powerful tool for troubleshooting SharePoint performance issues.
Earlier this year I published an article called Test and Monitor your SQL storage performance and some good practices for SharePoint, in which I showed some tools I use to troubleshoot SQL Server storage performance. One of the tools I mentioned there was a SQL script written by Paul Randal from SQLskills.com that lets you capture the IO latencies for a period of time.
As with my previous post, I wanted to transform the SQL script into PowerShell so it can be used in more scenarios, get the result directly and present it as objects in PowerShell.
I contacted Paul and he gave me permission to write and publish this script. Note that the original script (and this one too) is copyrighted and SQLskills.com has all rights reserved. You can see the original copyright in Paul's post, in my script and in the TechNet Gallery post. Respect it!

With that said, here are a few words about my PowerShell script. It is really simple: it just issues T-SQL commands against the SQL instance. You do not have to be on the SQL server as long as you have connectivity and appropriate permissions. The script should work against SQL Server 2005 and newer. You can use Windows integrated authentication, where the identity of the account running the script is used to connect to SQL, or you can use SQL authentication.
The output can be in PowerShell as System.Data.DataRow objects or in a CSV file that is displayed in a GridView at the end.
I think that this script is a good example you can use as a reference for transforming some of your own SQL scripts into PowerShell. If you look at the original SQL code you will notice that it is one script, while in my script I have 5 SQL commands. This is because 'GO' is not a T-SQL statement and it will not work directly with the .NET SqlClient, so I execute every batch as a separate command.
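Not the actual script, just a minimal sketch of the underlying pattern: open a SqlConnection (integrated security here), run each batch as its own SqlCommand and load the results into a DataTable. The server name is a placeholder and the query is a trivial example, not Paul's script.

$connString = "Server=SQLSERVER;Database=master;Integrated Security=SSPI"   # placeholder server
$conn = New-Object System.Data.SqlClient.SqlConnection($connString)
$conn.Open()
# Each batch (separated by GO in the original T-SQL) becomes its own command
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT [database_id], [file_id], [io_stall_read_ms] FROM sys.dm_io_virtual_file_stats(NULL, NULL);"
$table = New-Object System.Data.DataTable
$table.Load($cmd.ExecuteReader())
$conn.Close()
$table | Format-Table -AutoSize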
You can see how the result looks, and a download link, below. For more information see the help section in the script, ask a question in the Q&A section in the TechNet Gallery and, if you like it, rate it.

SQL Storage IO latencies result



 Download the script from: TechNet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Monday, 21 September 2015

Get the slowest request in SharePoint - The PowerShell script

In April this year I published a post called "Query Usage and Health Data collection database for slow requests execution". In that post I explained the limitations of the "Slowest Pages" report page in SharePoint and provided a SQL query that overcomes these limitations and gives you valuable information about the requests logged in the Usage and Health database.
After I created that script I used it on multiple occasions and it turned out to be extremely useful! However, there is one big disadvantage (in my opinion): it is a T-SQL script.
This is why I started to think about how to transform it into a PowerShell script, so I would not need to know which database is the logging database and on which server it is, go to SSMS, copy to CSV and so on.
Now, I might be wrong, but there is no SharePoint API that would help me get such a result in PowerShell. I also did some googling on the subject and was unable to find anything remotely close to what I was looking for; maybe there is some chance to get something like this in SharePoint 2010, where we have the Web Analytics Service application.
If I am correct, this could be an idea for improvement in the SharePoint web analytics field, something that I think was announced to be better in SharePoint Server 2016. The Usage and Health database captures a lot of useful information and it is a shame that SharePoint admins cannot take advantage of it without being a SQL master or a BI guru. My solution was simply to embed the SQL query in the script.
In the script I put some sweet stuff like: support for SQL authentication, output in PowerShell as System.Data.DataRow objects or in a CSV file/Grid View, the ability to pass the Start and End times as DateTime objects, and more.
If you run the script from a SharePoint server it will automatically determine the SQL server and the Usage and Health database, connect to the SQL Server by impersonating the identity of the user running the script, and let you filter by web application by passing a Url, GUID or WebApplication object.
Of course, you can successfully run the script when you are not on a SharePoint or SQL server; you will just need to enter more parameters, use a GUID for the web application and have connectivity to the SQL server itself.
I have successfully tested the script with SharePoint 2010/SQL 2012, SharePoint 2013/SQL 2012 and SharePoint 2016 TP/SQL 2014. You can check the output and download the script below; if you have issues read the Help section, ask a question in the Q&A section in the TechNet Gallery and, if you like it, rate it.
Finally, I would like to take a minute for shameless self-promotion and say that last week I passed the 3000 downloads mark in the Gallery and received a Gold medal for my Gallery contributions. Thank you!

SharePoint Slow Request Grid View Output


 Download the script from: TechNet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Thursday, 20 August 2015

Unable to set custom Data Refresh credentials for PowerPivot workbook in SharePoint

Last week one of our customers had an interesting issue: they were not able to set alternative Windows and data source credentials for PowerPivot Scheduled Data Refresh in SharePoint.
The deployment is a typical BI farm based on SharePoint 2013 Enterprise SP1 and SQL Server 2012 Enterprise SP1.
Since I was unable to find information about this, and I also think that the authentication and refresh of external data sources is always a bit tricky in the SharePoint BI world, I decided to share my solution.
In PowerPivot for SharePoint you can configure Scheduled Data Refresh, including the option to set alternative credentials. This empowers BI experts and BI site administrators to set connection credentials without having access to the Secure Store Service (SSS) or even knowing what SSS is. See how this looks below.


PowerPivot Scheduled Refresh Settings
There is a good article that explains how to configure Data Refresh in PowerPivot; you can find it here.
The issue with the customer was that when they tried to set alternative credentials they received a generic SharePoint error page and the changes were not applied.
If you read the article I referenced above, you will see that every set of alternative credentials is actually saved as a target application in SSS.
If you search the ULS logs for the Correlation Id you get, you will find something similar to the Unexpected error below:

System.ServiceModel.FaultException`1[[System.ServiceModel.ExceptionDetail, System.ServiceModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]: Access is denied to the Secure Store Service. Server stack trace: at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood .....
.......


This error message actually indicates that the PowerPivot application pool account does not have permission in SSS to create a target application for the new credentials.
My solution was to add the account as an administrator of SSS with the Create Target Application and Manage Target Application permissions. After doing this it worked like a charm!
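If you prefer to do this from PowerShell instead of Central Administration, a sketch along these lines should achieve the same result; the account name is a placeholder and the rights names must match exactly what the Secure Store administrators page shows.

# Grant the PowerPivot app pool account SSS admin rights to create/manage target applications
$sss = Get-SPServiceApplication | Where-Object { $_.TypeName -like "*Secure Store*" }
$security = Get-SPServiceApplicationSecurity $sss -Admin
$principal = New-SPClaimsPrincipal -Identity "DOMAIN\PowerPivotAppPool" -IdentityType WindowsSamAccountName
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Create Target Application","Manage Target Application"
Set-SPServiceApplicationSecurity $sss -Admin -ObjectSecurity $security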