Monday 16 November 2015

Display and Fill related item fields in a workflow task with Nintex for Office 365

A couple of weeks ago I posted an article called "Move away from the Workflow Tasks with Nintex for Office 365". The goal of that post was to show some key techniques you can use if you want to automate a process without using tasks and do all the work in the form, so users can see what they are approving.
However, if you want or need to stick with the tasks, there is a way to display and fill fields from the related item in the task form.
To do this you will need to edit the task template from the workflow designer. This feature was introduced with the July 2015 release of Nintex Forms for Office 365.
To illustrate how this works I am going to use the familiar simple Vacation Request form. See how it looks in the form designer below.

Vacation Request

You can see a field with the label "Manager Comment". This field is visible only when the form is not in "New Mode" and it should contain some information from the manager who will approve the request.
In Nintex Workflow for Office 365 there is no Request Data action, but we can include the field that requires information on task completion in the task form.
If you open an Assign a Task (or Start a Task Process) action configuration page you will see "Edit Task Form" in the action ribbon. Clicking it takes you to the familiar Nintex Forms Designer interface.


In the picture above I am editing the default Nintex task content type. Once you open the form designer, the initial form includes the task columns and the related item columns. Initially all related item columns are disabled, but you can change that for any column you want and make it editable in the task; you can also connect a control from the task form to a related item column. With some rearrangement you can see how the Vacation Request approval task looks below.


There is something you should consider if you have many tasks with modified forms. This is something I hit with a customer and later reproduced; it is not yet confirmed by Nintex.
When you edit a task form, the task will be created with a Nintex content type (ending with a GUID), even if you have specified your own custom content type. This is not an issue on its own because the Nintex content type will have the same columns as the original. However, you will have to edit all Task action forms in the workflow regardless of which content type they are using. If, for example, two tasks have the same initial custom content type, they will be created under the same Nintex content type after you edit both forms.
This is still not a big deal. However, every time you edit a task form two hidden variables are created, and you can easily end up hitting the variable count limit in Workflow Manager, which is 50 in SharePoint Online. See the error below when you try to publish a workflow that has more than 50 variables.

Error publishing workflow. Workflow XAML failed validation due to the following errors: Activity 'DynamicActivity' has 52 arguments, which exceeds the maximum number of arguments per activity (50). HTTP headers received from the server - ActivityId: 9e1fc3bc-5a7c-4821-9605-d595acea851d. NodeId: . Scope: . Client ActivityId : 29ca419d-d068-2000-213e-aae9dfcc2677. The remote server returned an error: (400) Bad Request.



This is not an issue with Nintex, it is just the way things work in the background.
In my humble opinion Microsoft should rethink this limitation in SharePoint Online!

I hope that this was helpful!

Tuesday 10 November 2015

Error when accessing the Nintex Live Management page on a fresh Nintex Workflow 2013 installation [Tip]

Today I hit a strange issue with a fresh installation of Nintex Workflow 2013 with Nintex Live.
The Nintex Live solution was deployed by the installer. However, when I tried to access the Nintex Live Management page in Central Administration I received the error below.

The resource object with key 'LiveAdmin_Page_Management_Title' was not found.

If you have done everything correctly so far, the solution is very simple.
Open the SharePoint Management Shell as Administrator and run the command Install-LiveService, check whether the page is now available, and if needed perform an IISRESET on your CA server(s).
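In other words, something along these lines (run from an elevated SharePoint Management Shell on the Central Administration server):

# Register the Nintex Live service - this is the cmdlet mentioned above
Install-LiveService

# Check the Nintex Live Management page again; if the resource error is still there,
# recycle IIS on the CA server(s)
iisreset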

Saturday 24 October 2015

Move away from the Workflow Tasks with Nintex for Office 365

In this post I am going to demonstrate my solution for a requirement that I believe is not uncommon.
It is a fact that users do not always like workflow tasks. In the last two weeks I received this requirement twice. The first was around an internal project I am working on in O365 with Nintex Forms and Nintex Workflow, and the second came from a customer with SharePoint Server 2010, Nintex Workflow and InfoPath forms who wanted to get rid of the tasks in their business travel approval process. The managers approving those business travel requests simply did not like receiving a mail with a link to a task where they select Approve/Reject, and if they want to see what they are actually approving they have to click a second link and then go back to the task to complete it. They wanted to do everything in the InfoPath form, where they can see all the numbers and can directly approve or reject. Of course, as the title suggests, I will focus on the first scenario in O365 with Nintex Workflow and Forms.
Since the scenario is in SharePoint Online, things work a bit differently, so I will show you some key bits in Nintex for O365 that can help you (I hope) meet the requirement to automate a process without using workflow tasks.
For demo purposes I am going to use a basic leave request form. The user will create a new item and submit it, then the person set as Approver should approve the request just by clicking a button in the form. See how the initial form looks below.


One of the key things is that our approval workflow should know what stage the form is currently in.
The workflow can wait for an item field change, and a Status drop-down field is an option, but let's say you have many stages of approval, like Team Lead approval, HR approval and so on. In more complex workflows this will be confusing for the users, and in general it is not a good idea to let the users choose the stage of the process.
This is why we are going to make different buttons that are visible/active in the different stages for different users, and on click the buttons will set the value of a text Status field.
In our example, if the form is new the submit button will save the form and set the status to "New". If the status is New, the workflow will send a mail to the approver inviting him to review the request, and it will also set some status, for example Awaiting Approval. If the status is "Awaiting Approval", a different set of buttons, Approve/Reject, will be shown, and they will set the status accordingly when they are clicked.
The first thing to set up will be the initial SUBMIT and CANCEL buttons; the submit button will save the form and change the status to "New". These buttons will be visible only when the form is new and the Status field is empty.
The button controls in Nintex Forms for O365 have a nice feature: you can connect them to a field so that when they are clicked they send some value to that field. In our case we are going to send the string "New" to the Status field. Below you can see how this button control setting looks; everything is located under Advanced.


The next thing to do for both controls is to set an Expression for the setting that defines the visibility of the control: when the expression returns true the control will be visible, when it returns false the control will be hidden. We are doing this for both buttons because if we leave the form with only one Cancel button for all stages, that button will sit in one fixed spot and the other functional Submit buttons will be placed below or above it (I guess this can be fixed with some CSS, but that is not the subject now). The expression is very simple: we are going to use the IsNullOrEmpty inline function on the Item Property (field) Status. This setting is located under Appearance, Visible; in the drop-down choose Expression. If the field is empty the function will return true and the button will be visible.


The story for Approve/Reject is basically the same: they will be Save and Submit button controls connected to the Status field that set the values Approved or Rejected; however, we need them to be visible when the status is "Awaiting Approval".
Also, the Approve/Reject buttons should be visible only to the user that has the authority to approve or reject the leave request. This is not a difficult task to achieve; in our case the person that should approve the request is given in the Approver field, which is of type Person/Group.
Here comes another great feature: we can reference the Current User Display Name, Email and Login ID in the form out of the box, and we can also get these values from a Person field in the current item.
So what we need to do is set our Visible expression to be true when the Status is equal to "Awaiting Approval" and the email (for example) of the current user is equal to the Approver's email.
The entire logic is done by using the "And" inline function, and for its values we use the Equals function for both conditions. This way the Approve/Reject buttons will be available only in the "Awaiting Approval" stage for the appropriate user.


You can see how both forms look in Edit mode side by side below. The left view is what the Approver will see, with the options to Approve/Reject, and on the right is a regular user that cannot even save the form; there is no submit button in the form or Save in the ribbon.



Here comes the part where we need to create the approval workflow for our leave requests.
As already mentioned, the workflow needs to set the Status field to "Awaiting Approval" so we can activate the Approve/Reject buttons. Nothing special about that, we will do it with the "Set Field in Current Item" action.
After we have done this, we should make the workflow wait for the Status field to change. We can use the "Wait for Field Change in Current Item" action. There is one blocking point if we use this action alone: it will wait for the field to change to a single predefined value only, but we have two possible values, Approved or Rejected.
This is why we are going to use the Parallel Block action. This action can execute many action branches simultaneously, so we can have one branch waiting for Rejected and one waiting for Approved. In Office 365 this action also has a feature that makes this logic possible: we can have two conditions for exiting the parallel block.
One of the conditions is always the end of one of the branches, and the second one is a variable that should evaluate to True. This is important because without it the parallel block will wait until all branches are finished, and the branch where we wait for the Status to change to Rejected will never end if the item was Approved.
This is why we create a variable of type Boolean and assign the value "No" right before the Parallel Block (not sure if this is absolutely needed, but just to be on the safe side). Then we set this variable as the "Completion Condition" (in the action settings) for the block, and after every wait-for-field-change action we set this variable to True ("Yes"). This way we exit the block when an individual branch finishes. Before the exit from each branch you can do a State Machine State Change or something else; it depends on how you have designed your workflow and how complex the process is. See how this step looks below.



If you are wondering what the third branch is for, it is for a reminder using a Loop with Condition action. We are going to pause the branch execution for a period of time, let's say 1 day; after this the workflow will send a reminder that there is a pending leave request that requires approval, and this will repeat until we exit the Condition Loop or the Parallel Block. We need some condition for the loop, and this is why we use the variable that serves as the completion condition of the Parallel Block.

This became one long post, and I hope I have not bored you to death; maybe this topic was more suitable for a video tutorial. I hope it was helpful, happy Nintexing!

Saturday 17 October 2015

Capture SQL IO latencies for a period of time - The PowerShell Script

In this post I am going to share a PowerShell script that is not directly related to SharePoint, but can be a powerful tool for troubleshooting SharePoint performance issues.
Earlier this year I published an article called Test and Monitor your SQL storage performance and some good practices for SharePoint, where I showed some tools I use to troubleshoot SQL Server storage performance. One of the tools I mentioned there was a SQL script written by Paul Randal from SQLskills.com that lets you capture the IO latencies for a period of time.
As with my previous post, I wanted to transform the SQL script into PowerShell, so it can be used in more scenarios, get the result directly and present it as objects in PowerShell.
I contacted Paul and he gave me permission to write and publish this script. Note that the original script (and this one too) is copyrighted and SQLskills.com has all rights reserved. You can see the original copyright in Paul's post, in my script and in the TechNet Gallery post. Respect it!

With that said, here are a few words about my PowerShell script. It is really simple, just issuing T-SQL commands against the SQL instance. You do not have to be on the SQL server as long as you have connectivity and appropriate permissions. The script should work against SQL Server 2005 and newer. You can use Windows integrated authentication, where the identity of the account running the script is used to connect to SQL, or you can use SQL authentication.
The output can be in PowerShell as the System.Data.DataRow type or in a CSV file that will be displayed in a GridView at the end.
I think this script is a good example and you can use it as a reference to transform some of your SQL scripts to PowerShell. If you look at the original SQL code you will notice that it is one script, while in my script I have 5 SQL commands. This is because 'GO' is not a T-SQL statement and it will not work directly with the .Net SqlClient; this is why I am executing every batch as a separate command.
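To illustrate the idea, here is a minimal sketch of that pattern (not the actual script); the server name and the sample T-SQL are placeholders:

# 'GO' is a batch separator understood by SSMS/sqlcmd, not by SqlClient,
# so each batch is sent as its own SqlCommand.
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=SQL01;Database=master;Integrated Security=True")
$conn.Open()

# A setup batch that returns no rows
$setup = New-Object System.Data.SqlClient.SqlCommand("SET NOCOUNT ON;", $conn)
[void]$setup.ExecuteNonQuery()

# The final batch returns rows that surface in PowerShell as System.Data.DataRow objects
$query = New-Object System.Data.SqlClient.SqlCommand("SELECT database_id, io_stall_read_ms FROM sys.dm_io_virtual_file_stats(NULL, NULL);", $conn)
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter($query)
$table = New-Object System.Data.DataTable
[void]$adapter.Fill($table)
$conn.Close()

$table.Rows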
You can see how the result looks and a download link below. For more information see the help section in the script, ask a question in the Q&A section in the TechNet Gallery, and if you like it, rate it.

SQL Storage IO latencies result



 Download the script from: TechNet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Monday 21 September 2015

Get the slowest request in SharePoint - The PowerShell script

In April this year I published a post called "Query Usage and Health Data collection database for slow requests execution". In that post I explained the limitations of the "Slowest Pages" report page in SharePoint and provided a SQL query script that can overcome these limitations and give you valuable information about the requests logged in the Usage and Health database.
After I created this script I used it on multiple occasions and it turned out to be extremely useful! However, there is one big disadvantage (in my opinion) about this script: it is a T-SQL script.
This is why I started to think about how I could transform it into a PowerShell script, so I would not need to know which the logging database is, on which server it is, go to SSMS, copy to CSV and so on.
Now I might be wrong, but there is no SharePoint API that will help me get such a result in PowerShell. I also did some googling on the subject and I was unable to find anything remotely close to what I was looking for; maybe there is some chance to get something like this in SharePoint 2010, where we have the Web Analytics Service application.
If I am correct, this can be an idea for improvement in the SharePoint web analytics field, something that I think was announced to be better in SharePoint Server 2016. The Usage and Health database captures a lot of useful information and it is a shame that SharePoint admins cannot take advantage of it without being a SQL master or BI guru. My solution was just to embed the SQL query in the script.
In the script I put some sweet stuff like: support for SQL authentication, output in PowerShell with type System.Data.DataRow or in a CSV file/Grid View, the ability to pass the start and end times as DateTime objects, and many more.
If you run the script from a SharePoint server, it will automatically determine the SQL server and the Usage and Health database, connect to the SQL Server by impersonating the identity of the user running the script, and you can filter by web application by passing a Url, GUID or WebApplication object.
Of course you can successfully run the script when you are not on a SharePoint or SQL server; you will just need to enter more parameters, use a GUID for the web application and have connectivity to the SQL server itself.
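As an illustration, here is roughly how the logging database can be discovered when you are on a SharePoint server; this is a sketch of the idea, and the exact properties you read may vary between versions:

# Discover the Usage and Health (logging) database from the farm itself
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$usageDb = (Get-SPUsageApplication).UsageDatabase
# Database name and the SQL server/instance hosting it
$usageDb | Select-Object Name, Server, NormalizedDataSource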
I have successfully tested the script with SharePoint 2010/SQL 2012, SharePoint 2013/SQL 2012 and SharePoint 2016 TP/SQL 2014. You can check the output and download the script below; if you have issues read the Help section, ask a question in the Q&A section in the TechNet Gallery, and if you like it, rate it.
Finally, I would like to take a minute for shameless self-promotion and say that last week I passed the 3000 downloads mark in the Gallery and received a Gold medal for my Gallery contributions. Thank you!

SharePoint Slow Request Grid View Output


 Download the script from: TechNet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Thursday 20 August 2015

Unable to set custom Data Refresh credentials for PowerPivot workbook in SharePoint

Last week one of our customers had an interesting issue: they were not able to set alternative Windows and Data Source credentials for PowerPivot Scheduled Data Refresh in SharePoint.
The deployment is a typical BI farm based on SharePoint 2013 Enterprise SP1 and SQL Server 2012 Enterprise SP1.
Since I was unable to find information about this, and I also think that the authentication and refresh of external data sources is always a bit tricky in the SharePoint BI world, I decided to share my solution.
In PowerPivot for SharePoint you can configure Scheduled Data Refresh, including the option to set alternative credentials. This empowers BI experts and BI site administrators to set connection credentials without having access to the Secure Store Service (SSS) or even knowing what SSS is. See how this looks below.


PowerPivot Scheduled Refresh Settings
There is a good article that explains how to configure Data Refresh in PowerPivot; you can find it here.
The issue with the customer was that when they tried to set alternative credentials they received a generic SharePoint error page and the changes were not applied.
If you read the article I referred to above, you will see that every set of alternative credentials is actually saved as a target application in SSS.
If you search the ULS logs for the Correlation Id you get, you will find something similar to the Unexpected error below:

System.ServiceModel.FaultException`1[[System.ServiceModel.ExceptionDetail, System.ServiceModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]: Access is denied to the Secure Store Service. Server stack trace: at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood .....
.......


This error message actually indicates that the PowerPivot application pool account does not have permissions in SSS to create a target application for the new credentials.
My solution was to add the account as an Administrator of SSS with Create Target Application and Manage Target Application permissions. After doing this it worked like a charm!
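If you prefer to script the change instead of clicking through Central Administration, a hedged sketch along these lines should do it; the account name is a placeholder and the rights strings should be verified against your Secure Store Service application before relying on them:

# Grant the PowerPivot app pool account admin rights on the Secure Store Service
$sss = Get-SPServiceApplication | Where-Object { $_.TypeName -like "*Secure Store*" }
$security = Get-SPServiceApplicationSecurity $sss -Admin
$account = New-SPClaimsPrincipal -Identity "CONTOSO\PowerPivotAppPool" -IdentityType WindowsSamAccountName
Grant-SPObjectSecurity -Identity $security -Principal $account -Rights "Create Target Application","Manage Target Application"
Set-SPServiceApplicationSecurity $sss -Admin -ObjectSecurity $security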


Tuesday 11 August 2015

Discover what permissions user has in SharePoint Farm with PowerShell

Permissions and access control are an essential part of information management and governance in a successful SharePoint implementation. As with everything in SharePoint, user permissions and access require significant planning in order to prevent future headaches.
However, very often, for one reason or another, there is no real permissions planning, or if there is, it was not implemented correctly, or the business users, site administrators and content authors have not received the proper training required to maintain a well organised permission structure. With the introduction of the "SHARE" button all over SharePoint 2013 (most probably this will be valid for SharePoint 2016 as well), it became even easier for users to break permission inheritance and grant another user or group "Edit" permission to a site, list or item when only Read access was needed.
Time goes by and you onboard a new customer that has no clear concept for permission management in SharePoint. Everything is great, the sun is shining, and then you receive a query from the customer asking you to tell them what permissions a user has in their SharePoint. As mentioned, the customer has no concept for permission management and they have a nice mix of SharePoint groups, AD groups, permission levels, many objects with unique permissions shared with individual users and so on. You try to figure out what permissions the user has, you dig deeper into the content and eventually end up in the situation below.

SharePoint Admin Mind Blown

This is why I did something I had to do a long time ago: I wrote a PowerShell script that can get all permissions granted to a Windows user. This includes Farm level, Web Application User Policies, Site level (Owners/Admins), Web, List, Folder and Item. The script creates a CSV file with all permissions, and at the end it loads the file in a GridView for ad-hoc review. See how it looks below.

Permission Report Gridview

The main goal of the script is to cover as many scenarios as possible. As you can see, the script covers different levels, not only permissions over securable objects. It works and was tested on SharePoint 2010/PowerShell 2.0 and SharePoint 2013/PowerShell 3.0. It works with Windows Classic authentication and Windows Claims. You can select the lowest object level to scan for permissions, from Web Application down to Item. The script shows whether the permissions are granted directly or inherited from a domain group, and if they are inherited it will show you the name of the group.
To achieve this I had to spare a lot of my free time. In the end this became one of the most complex scripts I have ever written and am able to share. I hit a lot of rocks and did a lot of testing until I decided that it is good enough to be shared. There are many similar scripts, but so far I have not found any that covers this many scenarios. This is also a call for everyone that downloads and uses the script to give me feedback if there are any issues in different setups; I highly doubt that the script can break something.
With this post I also want to share some of the interesting PowerShell techniques I have used in the script.
Above I mentioned that the script is not getting permissions from securable objects only. These are the objects that can be secured with permissions and permission levels: SPWeb, SPList and SPItem. Getting the permission information for securable objects is a big part of my script; you can read how to get permission information for a securable object in a post by Gary Lapointe. In Gary's post you can find a script very similar to mine; unfortunately I found it when my script was almost ready.
There is one issue with the permission info for securable objects. In a scenario where the user is a member of an AD group and this group is a member of a SharePoint group, there is no way to find from which AD group the user is inheriting the SharePoint group membership. Also consider a scenario where an AD group is granted permissions in a Web Application with a User Policy: you will need the login/claim of the group to see if there is a policy. This is why my script gets the user's AD group membership; if there is a permission that comes from a SharePoint group, the script will get the members of the SharePoint group, check whether the user or one of his AD groups is a member, and give you the AD group name.
Getting AD user/group information is easy when you are on a Domain Controller, and it is actually not very hard to do it from any domain-joined computer. I have done it by using the type accelerator [adsisearcher]; you can read about it and how it works in this article.
Here is another issue I dealt with: the primary domain group is not listed in the memberOf attribute. Imagine that the primary domain group is not "Domain Users"; you will need to know it in case there are some permissions granted in SharePoint. There is a KB article that has an example of how to do it with a nasty old VB script :). Doing it in PowerShell is a bit easier. Every AD user has an attribute "primaryGroupID" that matches exactly the "primaryGroupToken" attribute of the primary group object; this ID is also the last digits of the group's SID.
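Here is a hedged sketch of that lookup with [adsisearcher]; the account name is a placeholder and the full script handles more edge cases:

# Find the user (sAMAccountName is a placeholder)
$user = ([adsisearcher]"(&(objectCategory=user)(sAMAccountName=jdoe))").FindOne()

# primarygroupid holds the RID of the primary group; it is not part of memberOf
$primaryGroupRid = $user.Properties["primarygroupid"][0]

# Build the group SID from the user's domain SID plus that RID, then bind to it
$userSid  = New-Object System.Security.Principal.SecurityIdentifier($user.Properties["objectsid"][0], 0)
$groupSid = "$($userSid.AccountDomainSid)-$primaryGroupRid"
$primaryGroup = [adsi]"LDAP://<SID=$groupSid>"
$primaryGroup.sAMAccountName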
There are some more techniques and workarounds I have used in the script, but I am risking making this post very lengthy and boring, so download the script to see them.
You can download the script from the link below. Please test, use and rate it in the TechNet Gallery. For examples and prerequisites see the script's Help section!



 Download the script from: TechNet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Friday 31 July 2015

Change the URL of Existing SharePoint List or Library - Video tutorial by Webucator

In January this year I published an article called Change the URL of existing SharePoint List/Library [Tip]. Some time later I was contacted by Webucator; they liked the solution from my article and asked for permission to create a video tutorial based on it. If you don't know them, Webucator provides onsite and online technical and business trainings.
I was notified today that the video is published and it is really well explained; you can check it out below. Webucator provides a wide variety of technical trainings, including SharePoint. If you are looking for SharePoint trainings, be sure to check what SharePoint classes are available at Webucator; you can also find many free tutorials on their site.

Thursday 2 July 2015

Migrate all SharePoint databases to new SQL Server with minimum clicking and a few words about my new job!

Here we are, a new post after a long pause. I will try to change this in the future and blog as much as I can. If you are not visiting my blog for the first time, I guess you have noticed that the branding is slightly different. The change is mainly in the bottom right, where the logo of my current employer is located.
At the moment I am writing this post I am in my fourth week as Senior SharePoint Engineer for bluesource. I think that I made a good decision and this change is a step forward, something that I needed and wanted.
Bluesource is a great company, not very big (at the moment), but currently present on three continents: Europe, Australia and North America. It has a wide portfolio of technologies and services. As a Senior SharePoint Engineer I will be the main person involved in the ongoing SharePoint support of our customers and our internal SharePoint infrastructure.
I am working with many customers and this is a great opportunity to see different solutions, implementations, requirements and requests. I will support not just SharePoint 2013, but also 2010 and Online. A lot of the customers are looking for fast, cheap and supportable solutions; this is why they are heavily using Nintex Workflow, Nintex Forms, InfoPath and more. I am especially excited about Nintex Workflow; it is the perfect tool/platform for developing no-code, robust workflows for SharePoint.
This is why you can expect more diverse posts with interesting and useful (I hope) real-world examples from my day-to-day job.
One of my first tasks was to plan and execute the migration of our public/extranet SharePoint 2010 farm to a new SQL Server instance as part of an internal infrastructure restructuring.
As I mentioned, this farm is hosting our public and extranet sites, so no downtime was allowed. This required the migration method to be database backup/restore instead of detach/attach. There are many resources on how to do this; one of the best overviews of the process can be found in the posts of Thuan Soldier and Todd Klindt.
To be exact, stopping all SharePoint services is mentioned in both posts, which means a complete outage. I did it only with a complete content management change freeze and stopped Timer Services, in order to prevent any timer job execution and any components that can write to the databases or to the local configuration cache. This is not the best way, but my stakeholders and I took the risk to do it that way, and the migration went smoothly, without content outage or issues.
To minimize the risk I had to do the backup and restore as fast as possible. I had to do backup/restore for up to 40 databases with a combined size of 70 GB. Database backup and restore is a click-intensive, repeatable operation, and people tend to make mistakes when a lot of clicking and repeating is involved; computers do repeatable, boring things better.
My solution to automate this process was a great PowerShell script from Chrissy LeMaire (PowerShell MVP). I just cannot express how good that script is! Although I have not tested it in all possible scenarios it can be used in, I think it will work great, just as it worked in my migration.
The script is called Start-SqlServerMigration and you can check it out and download below.
Here are some of the features: it supports detach/attach and backup/restore for database migration, can use Windows and SQL authentication, migrates Windows and SQL logins (with the correct SID), supports old SQL Server versions, migrates jobs and SQL Server objects, sets the original source DB owner, exports/migrates the SQL global configuration and many more. See some screenshots of a backup/restore migration in my dev environment.



A video from Chrissy:



                             Download the script from: TechNet Gallery

Monday 11 May 2015

Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell. The Scripts!

Last year (1st December) I published an article called Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell. In that article I demonstrated the latest capabilities introduced with the latest CSOM SDK. There were new APIs that made it possible to work with user profile properties in SharePoint Online (for now). I published PowerShell snippets for writing user profile properties and retrieving them. At that time I was unable to figure out how to write multi-value properties; the snippets were just for the demo, with a hard coded property, and they were not very reusable.
Since this article became very popular and people were happy that they can do this from PowerShell, I decided to write a generalized script for writing any single or multi value user profile property.
The Name (not the DisplayName) of the property to be edited is provided as a script parameter. As I said, the script is generalized and not tested with all available out of the box properties in SharePoint Online. Some of the properties have specific requirements for the value. For example, the About Me property should be HTML (<br> instead of a new line), the Birthday should be in some parsable datetime format and so on. If the format is not correct, in most cases you will receive some meaningful exception from CSOM.
The script I will provide requires the SharePoint Server 2013 Client Components SDK (x64) or the SharePoint Online Client Components SDK (x64) released September 2014 or newer.
The script is written like a native cmdlet: it accepts pipeline values for the AccountName, you can combine it with a CSV or a simple text file that contains a list of account names, include it in your own scripts and modify it as you find useful. Test the script on an account that is not a C-level manager :-) and if it works use it for bulk property updates for "live" accounts, for example.
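For reference, here is a minimal sketch of the CSOM calls the script is built around (not the full script); the DLL paths, URLs and account values are placeholders, and you need the SDK mentioned above:

# Load the CSOM assemblies from the Client Components SDK (paths are placeholders)
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.UserProfiles.dll"

# Connect to the SharePoint Online admin site
$securePassword = ConvertTo-SecureString "password" -AsPlainText -Force
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://spyankulov-admin.sharepoint.com")
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("admin@spyankulov.onmicrosoft.com", $securePassword)

# PeopleManager exposes the profile property setters introduced with the Sep 2014 SDK
$peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
$account = "i:0#.f|membership|AnneW@spyankulov.onmicrosoft.com"

# Single value property
$peopleManager.SetSingleValueProfileProperty($account, "WorkPhone", "+44 555 555 55")

# Multi value property takes a generic List[string]
$skills = New-Object 'System.Collections.Generic.List[string]'
"SharePoint","PowerShell" | ForEach-Object { $skills.Add($_) }
$peopleManager.SetMultiValuedProfileProperty($account, "SPS-Skills", $skills)

$ctx.ExecuteQuery()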

To demonstrate how the script for writing user profile properties works, I am going to change the profile of Anne Wallace (AnneW@spyankulov.onmicrosoft.com). She is the President of Contoso. I am going to change her Work Phone (single value), her Skills (multi value) and her Birthday (short date, single value). See how her profile page looks before the changes.


Below are the PowerShell lines I used to change these properties; I have broken them across a couple of lines in order to fit them better in the blog. For more examples see the Help section of the scripts.


########################## Change Work Phone ###############
.\Set-SPOUserProfileProperty.ps1 -PropertyName WorkPhone `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "+44 555 555 55" `
-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password *******

########################## Change The Skills ###############

.\Set-SPOUserProfileProperty.ps1 -PropertyName 'SPS-Skills' `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "SharePoint","PowerShell","Tech Support" -SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password ******* -MultiValue
######################### Change The Birthday ###############                                         
.\Set-SPOUserProfileProperty.ps1 -PropertyName 'SPS-Birthday' `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "5/9/1988" `
-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password *****


Let's see how Anne's profile looks below, after the change.



We can see that the properties have changed accordingly. How cool is that :).

In the package you can download below I have also included a script that can retrieve the user profile properties in PowerShell. Please download the script, test it, use it and rate it!
I write a fair amount of Help for my scripts, so please read it.

 Download the script from: TechNet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Wednesday 29 April 2015

Query Usage and Health Data collection database for slow requests execution

Here is another article I am going to dedicate to SharePoint performance monitoring.
For me, one of the most useful out of the box tools to gather information about farm performance is the Usage and Health data collection done by the Usage and Health Data Collection Service Application. This service application can be easily provisioned and configured if you have installed your SharePoint farm with AutoSPInstaller.
I highly recommend planning, provisioning and using this service application in your SharePoint farm!
If you have not done this yet, you can see how to provision the application here. I recommend specifying at least the logging database name in the New-SPUsageApplication command; for further configuration see the following article, where you will find how to set up various properties like the log location, what to monitor and the retention of the logs. Here is an official resource from TechNet.
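If it helps, here is a hedged example of provisioning and tuning it from PowerShell; the names, paths and retention values are placeholders, so adjust them to your farm:

# Create the service application with an explicitly named logging database
New-SPUsageApplication -Name "Usage and Health Data Collection" -DatabaseServer "SQL01" -DatabaseName "SP_UsageAndHealth"

# The proxy often ends up stopped; provision it explicitly
$proxy = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "Usage*" }
$proxy.Provision()

# Log location, maximum log size and per-definition retention
Set-SPUsageService -UsageLogLocation "D:\SPLogs\Usage" -UsageLogMaxSpaceGB 5
Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 14 -Enable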
Above I mentioned the logging database; this is the database used by the Usage and Health Data Collection Service Application to store all the logs. The logs are collected on each SharePoint server and stored in the log location you have set. These logs are then pushed into the logging database by a timer job called "Microsoft SharePoint Foundation Usage Data Import", which by default runs every 5 minutes.
The logging database is the only database in SharePoint where running SQL queries is supported in order to analyze our logs. See the official guidance View data in the logging database in SharePoint 2013.
The easiest way to get some information from the logging database is from the CA interface described here. There you can see some statistics for the slowest page requests and you can filter by Server, Web Application, a maximum range of 100 rows and a predefined timeframe of day, week or month, as you can see in the screenshot below.

View Health Report

We are getting essential information from this page, but in my case I needed to track some occasional slow requests reported by the users. Here are some disadvantages of the result we get from the "View Health reports" page.

1. This is the major one, because it limits the amount of information we can receive. The results are aggregated averages for a certain request. We may have one slow request or we may have many. In either case we only receive Average, Minimum and Maximum values for the Duration, for example; we cannot see when a peak occurred because the requests are grouped.

2. The columns we can see are predefined. This is again because of the request grouping. If you do a simple SELECT TOP 100, for example, from one of the dbo.RequestUsage partitions (this is the table where the requests are logged), you will see that there are quite a lot of columns and some of them can be really useful for our troubleshooting process.

3. The timeframes are predefined and the result is limited to 100 rows.

This is why I created a SQL query that will output all requests for a timeframe configured by you; it can be less than a second or several days. The output will show you all requests in that timeframe, they will not be grouped, you can add additional conditions and columns to be displayed, and most importantly, you can see the time the request was logged. I find this extremely helpful for troubleshooting: you can narrow down the results to a very small timeframe, filter by user login for example, and track the request in the ULS logs by Correlation Id. See how it looks in the screenshot below and compare it with the result from the Slowest Pages report.

[*UPDATE*]: I have published a PowerShell version you can find it and read more HERE

Slow Requests Query

The screenshot above is from an older version; in the current version that you can download below I have included filtering by UserLogin. As I mentioned, you can modify the script to show you whatever columns you want (include them in the SELECT statement and the GROUP BY clause) and include additional filters. I have put example values for some of the variables, many of them can be Null, and there are some useful comments as well.
A common use case will be if you receive a notification that SharePoint was "slow" for a certain user at a certain time. You can use this script and it will show you all requests for this user; this information will help you in your investigation of a potential performance issue. You can then export the result to CSV and play with it in Excel, for example.
Keep in mind that this query can put some load on your SQL server, the timing is in UTC and the duration is in seconds. You should run the script against your Usage and Health (logging) database. The script is tested on SharePoint 2013/SQL 2012.

 Download the script from: Technet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Thursday 9 April 2015

Script to get All Webs in SharePoint Online site collection

Have you ever wanted a SharePoint Online cmdlet to get all webs in a site collection? Something like this can be useful if you want to see all webs and check whether there is something you do not want or need, or if you want to get all webs as their original type and do further operations with them. Well, as you know, there is no such out of the box cmdlet in the SharePoint Online Management Shell.
We can, however, use the DLLs from the SharePoint Server 2013 Client Components SDK and write our own PowerShell script to do that.
There are, however, many scripts from the community that are described as "get all webs in site collection". One of the most popular scripts that pops up if you do some googling is the script by Corey Roth (MVP). The script is nice, but actually getting all webs in a SharePoint site collection with CSOM is a bit tricky, since we do not have (at least I do not know of) a shortcut for getting really all webs, something like SPSite.AllWebs when you run PowerShell on-premises.
What Corey's script does is get the top level web from the client context, then get its sub webs, and then return the Title and the Url of the subwebs, as I have shown in the screenshot below.


As you can see, we are not getting much. The site I am doing this demo on has two subwebs; below them we have some more subwebs, below those more, and we cannot see them.
So I created a script that can output all webs in a SharePoint Online site collection, very similar to what SPSite.AllWebs does on-premises, and the output is Microsoft.SharePoint.Client.Web. See how it looks.

All SharePoint Online Webs

As you can see, the output is different. I have quite a lot of webs in my site; there is even an AppWeb.
See part of my code below.


BEGIN{
    # Recursively return every subweb under the web that is passed in
    function Get-SPOSubWebs{
        Param(
        [Microsoft.SharePoint.Client.ClientContext]$Context,
        [Microsoft.SharePoint.Client.Web]$RootWeb
        )

        $Webs = $RootWeb.Webs
        $Context.Load($Webs)
        $Context.ExecuteQuery()

        ForEach ($sWeb in $Webs)
        {
            Write-Output $sWeb
            Get-SPOSubWebs -RootWeb $sWeb -Context $Context
        }
    }

    # Load the CSOM assemblies from the Client Components SDK
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll" | Out-Null
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll" | Out-Null
}
PROCESS{
    # Build the client context with SharePoint Online credentials
    $securePassword = ConvertTo-SecureString $PassWord -AsPlainText -Force
    $spoCred = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $securePassword)
    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $ctx.Credentials = $spoCred

    # Return the root web first, then walk down the web hierarchy
    $Web = $ctx.Web
    $ctx.Load($Web)
    $ctx.ExecuteQuery()

    Write-Output $Web

    Get-SPOSubWebs -RootWeb $Web -Context $ctx
}



All I have done is get the root web, return it as a result, get the subwebs, then get the subwebs of each subweb that is found (and so on), and return each web as a result. The key job is done by the Get-SPOSubWebs function defined in the BEGIN block (not very pretty, but it works). The function calls itself for each web found, so yes, this can be defined as recursion.
At the end we have all webs in the site collection. In order to run the script you will need the SharePoint Server 2013 Client Components SDK (x64) installed and, optionally, if you want to see the data nicely in the PowerShell console, the loaded type display template (included with the script). For more information see my article Display CSOM objects in PowerShell.
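A usage example, assuming you saved the script as Get-SPOAllWebs.ps1 (the script name here is hypothetical; the parameters are the ones used in the code above):

.\Get-SPOAllWebs.ps1 -SiteUrl 'https://spyankulov.sharepoint.com/sites/Demo' `
    -UserName 'admin@spyankulov.onmicrosoft.com' -PassWord '********'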

 Download the script from: Technet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)

Sunday 5 April 2015

Load Testing SharePoint 2013 with Visual Studio 2013. Gotchas for SharePoint Administrators

Last week I had to do a load test for one of our customers' SharePoint 2013 environments, so I decided to share with you some gotchas that might not be obvious for SharePoint administrators who are new to Visual Studio (VS) and do not have experience with load testing in VS, but want to try or use it.
If you happen to be a full-time developer or maybe QA, please do not laugh out loud and feel free to leave your comments below this post.
This article is not a walkthrough on how to build a load test solution with VS2013. There is a great SPC 2014 session by Todd Klindt and Shane Young; it is very informative and so much fun.



So, in the test I created for my customer I have 3 web tests: two with a read profile, browsing different sites and testing search, and one that has a write profile. The write activity is to create an item in an out of the box Contact list and attach a small file to the item. The first thing that failed was in the "write" web test.

File Upload is Failing - In the new versions of Visual Studio this should not happen; the feature was introduced back in VS2010. However, it happened to me the first time I did this. The entire web test failed because it was unable to find the file that should be attached to the list item. The strange thing is that the next day I recreated the test and everything worked fine. I tested this with Visual Studio 2013 Update 1 and Update 4. The normal behavior is that when you record your web test, Visual Studio is intelligent enough to know what file you are uploading; it will copy that file to the load test project root folder and set the post request of the test to take the file from your project folder. However, this might not happen and your web test will fail, as I have shown in the picture below.

Visual Studio 2013 Failed Web Test Upload

[Solution 1] - This is the quick and dirty solution, and since I am a pro I went right for it :). When we add a new item to a list we have a form that we fill in, and by clicking Save we send a POST request with the parameters of the form; the path of the file that should be uploaded is a parameter as well. The POST request parameters are recorded by Visual Studio and you can edit them. You just have to find the correct request, expand it and look into the "Form Post Parameters", as shown below.

Form Post Parameters

Next you have to locate the "File Upload Parameter" that has some value for FileName, where the value is just the filename of the file you want to upload. It is possible to have empty File Upload Parameters or many different parameters. When you find the correct File Upload Parameter, you just put the literal path of the file as the FileName property, as in the picture below.

Edit FileName Property

Now, this is not very good, because when you move your load test project to a different computer you have to move the file and eventually edit the File Upload Parameter again. For a better solution keep reading/scrolling.

Masking Unwanted/Failed dependent requests - Dependent requests are the requests you make when you hit a page and load additional resources. For example, you test a request to http://sharepoint.contoso.net, but the landing page references multiple files like CSS, JavaScript, pictures etc. The requests to these files are dependent requests and they are taken into account in your web test results, whether they succeed or fail.
This became kind of a big deal for me because I was testing a highly customized site and several dependent requests for JavaScript files (.js) were failing with status 404, and this does not look good in your load test results. After a short googling I found this free eBook. There, on page 189, exactly this issue is discussed. The solution to mask failed dependent requests is to use a web test plugin. Below you can see how a web test result looks with a failed dependent request. I simulated this in my lab by referencing a JavaScript file located in the same site and then deleting it. You can see that the missing file currentU.js is even making one of my primary requests fail, although everything else is fine with it.

Failed Dependent Request

In this book there is source code for a web test plugin that can filter dependent requests starting with a certain string. This would be fine if your dependent requests were from a different domain, like some CDN or caching solution. If I used this plugin as it is, it would filter all dependent requests from my SharePoint site. This is why I modified the code to filter dependent requests that end with a certain string. You can read how to develop and use your own plugin here. See below how it looks.


I have highlighted some of the important elements in the picture (hope you can see it well), but I want to stress two things. After you build your web test plugin project it is just a class library (dll); in order to have your new plugin available in your web tests you have to reference the dll in your load test project. The second thing is the configuration of the ending string used for filtering. This is configured when you add the plugin to the web test: you add the string as the value of the FilterDependentRequestsThatEndsWith property, in my case ".js". If you like my plugin you can download the project (the built dll is in the bin/debug folder) here.

The third gotcha I will share took me 5 minutes to figure out. That is not much, but this issue popped up right after I launched the entire load test for the first time. This was important because I was testing Production (like a boss) and I was going to test the limits of the farm, so the timing was negotiated with the customer as a maintenance window with possible service degradation and/or outage.

Configure Load Test run location - This setting specifies whether you will use Visual Studio Online or the computer you are running the test from. As I wrote above, I have tested this with VS2013 Update 1 and 4, so I will assume that from Visual Studio 2013 Update 1 the default configuration is Visual Studio Online. You can change/check this in the test settings of your solution; it is in the General section, as shown below.

Load Test Local Settings


If you click on the Deployment section you will see that there is an option to add additional files that should be deployed with the solution.

[Solution 2 - File Upload is Failing] - Add the file as a deployment item in the test settings and, if needed, edit the post request form parameter to refer to the file only by filename. This solution is way better.

Finally, this post became a bit too lengthy for a blog post, but I hope it was informative!


Wednesday 18 February 2015

User Profile Incremental Synchronization Timer Job - Access is denied Error

I stumbled upon this issue in one of our customers' SharePoint 2013 environments, and I came across many forum posts and articles but did not find the complete solution, so I decided to share it with you.
The background story is as follows. We have a SharePoint Server 2013 environment hosted on Windows Server 2012 that is using User Profile Synchronization (leveraging FIM). We have one User Profile Service application.
The User Profile Synchronization service is running on the server where the Central Administration is hosted.
So far so good. We also have the User Profile Service running on two servers in order to have redundancy for this service; for short I will call these servers APP01 (CA, Sync Service, User Profile Service) and APP02 (User Profile Service). The User Profile Synchronization service is successfully started and running on APP01.
The issue comes when you want to implement the principle of least privilege. This requires removing the Farm account from the local Administrators group on the server where the User Profile Synchronization service is running (FIM). You need the Farm account to be a local admin in order to successfully start User Profile Synchronization.
You remove the Farm account from the local Administrators group.
Time goes by and you start to receive the error Event ID 6398 on APP02 with the following description:

The Execute method of job definition Microsoft.Office.Server.UserProfiles.UserProfileImportJob (ID 7acb171d-5d7d-4601-ba08-85f24c0969b4) threw an exception.More information is included below.

Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))


This description tells us that the "User Profile Incremental Synchronization Timer Job" is failing on APP02. To be sure, you can use Get-SPTimerJob with the GUID from the error message as the Identity parameter (see the example below). This is the timer job responsible for the User Profile Synchronization. You check the timer job history for this job and it is running on APP02 and failing.
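For example, something like this from the SharePoint Management Shell (the GUID is the one from the error above):

Get-SPTimerJob -Identity "7acb171d-5d7d-4601-ba08-85f24c0969b4" | Select-Object Name, DisplayName, LastRunTime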

Failed User Profile Incremental Synchronization Timer Job

It is possible to see this job fail every minute; one day it can run on APP02 and the next run can be on APP01. You may also see an ongoing synchronization in the Central Administration, but when you open miisclient.exe you see that there are no synchronization operations running; this, however, may prevent new synchronizations from starting.
But the User Profile Synchronization service is running on APP01, not on APP02!
Now, all over the forums folks are posting questions like "The synchronization timer job is running on the wrong server and it is failing". Actually this is not true!
Remember that we are running the User Profile Service instance on both APP01 and APP02. The "User Profile Incremental Synchronization Timer Job" can run on both servers and it should run successfully.
The timer job is executed by the Timer Service, and the Timer Service is running under the Farm account, as is the Sync Service. From the event log error it is obvious that the Farm account needs some permissions in order to execute the job successfully. When the account is in the local Administrators group it "has it all".

The first thing to check is the following MS article. There is a section called "Remove unnecessary permissions". The article explains how to grant the Remote Enable permission over the Microsoft FIM 2010 WMI namespace MicrosoftIdentityIntegrationServer (on the server running FIM). This permission is needed for the Farm account to be able to run the job successfully from a server that is not running the User Profile Synchronization Service (FIM). Remember to do an IISRESET if you are running the Sync service on a server that is also hosting Central Administration.
Of course I did the procedure above, but it did not work out!
Apparently the job needs to interact with the Microsoft FIM 2010 WMI namespace. I am not sure about this, but I think that the timer job is invoking the Execute method of the MIIS_ManagementAgent class.
Out of curiosity, below you can see how the MicrosoftIdentityIntegrationServer (MIIS) WMI namespace looks. I have based my User Profile Synchronization reporting script on MIIS WMI.

MIIS WMI Namespace

Now, under the hood WMI is using DCOM, so in order for a non-administrator account to interact with WMI remotely it needs permissions over DCOM on the server where the Sync service is running.

The second thing to check/change is the permissions over the DCOM server on the machine that is running the User Profile Synchronization Service. The Farm account needs to have Remote Access, Remote Launch and Remote Activation permissions. To do this, use the article Securing a Remote WMI Connection. Now this is a bit tricky and was confusing for me. There is a simple way to check whether the Farm account has remote access to WMI: by using the PowerShell command below with the Farm account credentials from a remote computer, you should be able to retrieve the synchronization operations history.

Get-WmiObject -Class MIIS_RunHistory -Namespace root/MicrosoftIdentityIntegrationServer -ComputerName <YourSyncServer> -Credential <Domain>\<FarmAccount>

If there is an issue with the permissions you will receive an Access Denied error like this.

WMI Access Denied

The DCOM permissions can be tricky; after making a change there, close and open a new PowerShell console and test the remote WMI again with the command above. If you still get an Access Denied error, locate the Windows Management and Instrumentation DCOM application and give the needed permissions to the Farm account.

As you can see there are lots of moving parts: timer jobs, WMI, DCOM and so on, and there are countless reasons for failure. Use the PowerShell line above; if there is an issue with WMI remoting it will show it. If there is a firewall issue it will show you something like an "RPC service unavailable" error. If you do not receive a result from the remote WMI test, you have an issue.

This is how I solved the issue. I am hoping that it is helpful for you!

Monday 16 February 2015

SharePoint Weekly Backup cycle script.

In this post I am going to share a script for SharePoint backup that I wrote for one of our customers.
Since we had no budget for a 3rd party backup solution, I decided to use the native SharePoint backup capabilities in a script.
For me, the SharePoint farm backup is a good solution if you do not have any advanced 3rd party tools, like DocAve for example, or your farm is not very big. Compared with DPM, SQL backups and file system backups, the SharePoint farm backup is the best solution in case of disaster when you have to restore the entire farm. You can see the following article for details on the different methods.
The drawback of the SharePoint farm backup is that it is not very granular and it can be a bit slow in large environments. For example, in web applications the lowest level object you can restore is the content database. However, you can restore a content database and then use "Recover data from an unattached content database" to restore a site, web or list. Still, you will not be able to directly restore an item, for example.
Be aware that if you try to restore a content database from a farm backup to a live farm, you may receive an error similar to the one below.

Restore failed for Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory.

And when you look at the restore log you will find something like this:

FatalError: Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory. SPException: Cannot attach database to Web application. Use the command line tool or Central Administration pages to attach the database manually to the proper Web Application.

In this case I am restoring the old content database 2K8R2_SP_Content_Portal to a database with the name 2K8R2_SP_Content_Portal_Bla. However, SharePoint will try to attach the restored content database to the web application and it will fail, because we already have a content database with that ID. The restored database itself is completely okay and you can see it in SQL Server Management Studio.
The customer required a backup once a day and to keep the backups for one month. We were also constrained on disk space.
There are many SharePoint backup scripts, but I wanted to have a different folder for every week and also to have one full backup and the rest of the backups differential, in order to save some space. Additionally, you can enable SQL Server backup compression to save more disk space.
The script I wrote does this, and it can also back up only the configuration.
The script does 7-day cycles; when a new cycle starts it will create a new folder whose name reflects the start, the end and the type of the backup. For example, if you start a full backup cycle on 14.02.2015 the script will create a folder with the name 14022015_20022015_F (dates are in EU format, ddMMyyyy). Since it is a new cycle, the script will start with a full farm backup. Then all the backups until 20.02.2015 will be differential. Here is how one weekly cycle looks.

SharePoint backup cycle


The script will also do a clean-up. You can specify how many cycles you want to maintain; the script will keep them and delete the older backups. Do not combine this script with full database backups, because that will break the full-differential sequence.
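To give an idea of the cycle logic described above, here is a hedged sketch (not the actual script, run from the SharePoint Management Shell); the backup root is a placeholder and the real script also handles the configuration-only mode and the clean-up:

# Folder names are ddMMyyyy_ddMMyyyy_F
$backupRoot = "E:\SPBackup"
$today = (Get-Date).Date

# Look for an existing cycle folder whose date range contains today
$current = Get-ChildItem $backupRoot | Where-Object {
    $_.PSIsContainer -and $_.Name -match '^(\d{8})_(\d{8})' -and
    $today -ge [datetime]::ParseExact($Matches[1], 'ddMMyyyy', $null) -and
    $today -le [datetime]::ParseExact($Matches[2], 'ddMMyyyy', $null)
} | Select-Object -First 1

if (-not $current) {
    # New 7-day cycle: create the folder and start with a full farm backup
    $name = '{0:ddMMyyyy}_{1:ddMMyyyy}_F' -f $today, $today.AddDays(6)
    $current = New-Item -Path (Join-Path $backupRoot $name) -ItemType Directory
    Backup-SPFarm -Directory $current.FullName -BackupMethod Full
}
else {
    # Inside an existing cycle: differential backups for the rest of the week
    Backup-SPFarm -Directory $current.FullName -BackupMethod Differential
}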


 Download the script from: Technet Gallery
(You may experience issue with Chrome, use Firefox/IE to open the link)