Wednesday 29 April 2015

Query the Usage and Health Data Collection database for slow requests

Here is another article dedicated to SharePoint performance monitoring.
For me, one of the most useful out-of-the-box tools for gathering information about farm performance is the usage and health data collection done by the Usage and Health Data Collection Service Application. This service application is easily provisioned and configured if you have installed your SharePoint farm with AutoSPInstaller.
I highly recommend that you plan, provision and use this service application in your SharePoint farm!
If you have not done this yet, you can see how to provision the application here. I recommend specifying at least the logging database name in the New-SPUsageApplication command. For further configuration see the following article, where you will find how to set various properties like the log location, what to monitor and the retention of the logs. Here is an official resource from TechNet.
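If you are provisioning it by hand, here is a minimal PowerShell sketch of that kind of setup; the application name, database server and name, log location, space limit and retention below are just example placeholder values, not recommendations:

New-SPUsageApplication -Name "Usage and Health Data Collection" `
    -DatabaseServer "SQL01" -DatabaseName "SP2013_UsageAndHealth"

# Where the .usage log files are written on each server and how much space they may use
Set-SPUsageService -LoggingEnabled $true -UsageLogLocation "D:\SPLogs\Usage" -UsageLogMaxSpaceGB 5

# What to monitor and for how long, e.g. keep page request data for 14 days
Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 14 -Enable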
Above I mentioned the logging database; this is the database used by the Usage and Health Data Collection Service Application to store all the logs. The logs are collected on each SharePoint server and stored in the log location you have set. These logs are then pushed into the logging database by a timer job called "Microsoft SharePoint Foundation Usage Data Import", which by default runs every 5 minutes.
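If you are testing and do not want to wait for the schedule, you can kick off the import manually; a small sketch that matches the job by the display name above:

# Push the local .usage files into the logging database right away
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data Import*" } | Start-SPTimerJob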
The logging database is the only SharePoint database against which running SQL queries is supported, so we can use it to analyze our logs. See the official guidance View data in the logging database in SharePoint 2013.
The easiest way to get some information out of the logging database is the Central Administration interface described here. There you can see statistics for the slowest page requests, and you can filter by server and web application, with a maximum of 100 rows and predefined timeframes of day, week or month, as you can see on the screenshot below.

View Health Report

We get essential information from this page, but in my case I needed to track occasional slow requests reported by the users. Here are some disadvantages of the results we get from the "View Health reports" page.

1. This is the major one, because it limits the amount of information we can receive. The results are aggregated averages for a certain request: whether we had one slow request or many, we only receive Average, Minimum and Maximum values for Duration, for example. Because the requests are grouped, we cannot see when a peak occurred.

2. The columns we can see are predefined. This is again because of the request grouping. If you do a simple SELECT TOP 100, for example, from one of the dbo.RequestUsage partitions (this is the table where the requests are logged), you will see that there are quite a lot of columns, and some of them can be really useful for the troubleshooting process.

3. The timeframes are predefined and the result is limited to 100 rows.

This is why I created a SQL query that outputs all requests for a timeframe configured by you; it can be less than a second or several days. The output shows all requests in that timeframe without grouping, you can add additional conditions and columns to be displayed and, most importantly, you can see the time each request was logged. I find this extremely helpful for troubleshooting: you can narrow the results down to a very small timeframe, filter by user login, for example, and track a request in the ULS logs by its Correlation Id. See how it looks in the screenshot below and compare it with the result from the Slowest Pages report.
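To give an idea of the shape of such a query, here is a minimal sketch (not the downloadable script) run from PowerShell with Invoke-Sqlcmd; it assumes the dbo.RequestUsage view and column names such as LogTime, UserLogin, Duration and CorrelationId, so adjust it to the schema you actually see in your logging database:

$sql = @'
SELECT   LogTime, ServerUrl, DocumentPath, UserLogin, Duration, CorrelationId
FROM     dbo.RequestUsage WITH (NOLOCK)
WHERE    LogTime BETWEEN '2015-04-29 08:00' AND '2015-04-29 08:15'  -- times are UTC
         AND Duration > 10                                          -- duration is in seconds
         -- AND UserLogin LIKE '%contoso\someuser%'
ORDER BY LogTime
'@
# Requires the SQL Server PowerShell module; run against the Usage and Health (logging) database
Invoke-Sqlcmd -ServerInstance "SQL01" -Database "SP2013_UsageAndHealth" -Query $sql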

[*UPDATE*]: I have published a PowerShell version; you can find it and read more HERE.

Slow Requests Query

The screenshot above is from an older version; in the current version, which you can download below, I have included filtering by UserLogin. As I mentioned, you can modify the script to show whatever columns you want (include them in the SELECT statement and the GROUP BY clause) and add additional filters. I have put example values for some of the variables (many of them can be NULL) and some useful comments as well.
A common use case is receiving a notification that SharePoint was "slow" for a certain user at a certain time. You can run this script and it will show you all requests for that user, which will help you investigate the potential performance issue. You can then export the result to CSV and play with it in Excel, for example.
Keep in mind that this query can put some load on your SQL Server, the times are in UTC and the duration is in seconds. You should run the script against your Usage and Health (logging) database. The script is tested on SharePoint 2013/SQL Server 2012.

 Download the script from: Technet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Thursday 9 April 2015

Script to get All Webs in SharePoint Online site collection

Have you ever wanted to have a SharePoint Online cmdlet that gets all webs in a site collection? Something like this can be useful if you want to review all webs and spot anything you do not want or need, or if you want to get all webs as their original type and do further operations with them. Well, as you know, there is no such out-of-the-box cmdlet in the SharePoint Online Management Shell.
However, we can use the DLLs from the SharePoint Server 2013 Client Components SDK and write our own PowerShell script to do that.
There are many scripts from the community described as "get all webs in a site collection". One of the most popular scripts that pops up if you do some googling is the one by Corey Roth (MVP). The script is nice, but actually getting all webs in a SharePoint site collection with CSOM is a bit tricky, since we do not have (at least I do not know of) a shortcut for getting really all webs, something like SPSite.AllWebs when you run PowerShell on-premises.
What Corey's script does is get the top-level web from the client context, then get its subwebs, and then return the Title and the Url of those subwebs, as I have shown in the screenshot below.


As you can see, we are not getting much. The site I am using for this demo has two subwebs; below them there are more subwebs, below those even more, and we cannot see them.
So I created a script that outputs all webs in a SharePoint Online site collection, very similar to what SPSite.AllWebs does on-premises, and the output objects are of type Microsoft.SharePoint.Client.Web. See how it looks below.

All SharePoint Online Webs

As you can see, the output is different. I have quite a lot of webs in my site; there is even an app web.
See part of my code below.


Param(
    # Site collection URL and SharePoint Online credentials used to build the client context
    [string]$SiteUrl,
    [string]$UserName,
    [string]$PassWord
)
BEGIN{
    # Returns every web below $RootWeb by calling itself for each subweb it finds
    function Get-SPOSubWebs{
        Param(
            [Microsoft.SharePoint.Client.ClientContext]$Context,
            [Microsoft.SharePoint.Client.Web]$RootWeb
        )

        $Webs = $RootWeb.Webs
        $Context.Load($Webs)
        $Context.ExecuteQuery()

        ForEach ($sWeb in $Webs)
        {
            Write-Output $sWeb
            Get-SPOSubWebs -RootWeb $sWeb -Context $Context
        }
    }
    # Load the CSOM assemblies; adjust the path to where the Client Components SDK put them
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll" | Out-Null
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll" | Out-Null
}
PROCESS{
    # Build the client context with SharePoint Online credentials
    $securePassword = ConvertTo-SecureString $PassWord -AsPlainText -Force
    $spoCred = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $securePassword)
    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $ctx.Credentials = $spoCred

    # Load and return the root web, then walk down the web hierarchy recursively
    $Web = $ctx.Web
    $ctx.Load($Web)
    $ctx.ExecuteQuery()

    Write-Output $Web

    Get-SPOSubWebs -RootWeb $Web -Context $ctx
}



All I have done is get the root web, return it as a result, get its subwebs, then get the subwebs of each subweb that is found (and so on), and return each web as a result. The key job is done by the Get-SPOSubWebs function defined in the BEGIN block (not very pretty, but it works). The function calls itself for each web it finds, so this can be defined as recursion.
In the end we have all webs in the site collection. In order to run the script you will need the SharePoint Server 2013 Client Components SDK (x64) installed and, optionally, if you want to see the data nicely formatted in the PowerShell console, the type display template loaded (included with the script). For more information see my article Display CSOM objects in PowerShell.
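Here is how a run might look; the script file name and the URL/account values below are just placeholders, and the parameter names match the snippet above:

# Output every web in the site collection and show a couple of the default-loaded properties
.\Get-SPOAllWebs.ps1 -SiteUrl "https://contoso.sharepoint.com/sites/Demo" `
    -UserName "admin@contoso.onmicrosoft.com" -PassWord "<password>" |
    Select-Object Title, Url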

 Download the script from: Technet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Sunday 5 April 2015

Load Testing SharePoint 2013 with Visual Studio 2013. Gotchas for SharePoint Administrators

Last week I had to do a load test of one of our customers' SharePoint 2013 environments, so I decided to share some gotchas that might not be obvious to SharePoint administrators who are new to Visual Studio (VS) and have no experience with load testing in VS, but want to try or use it.
If you happen to be a full-time developer or maybe a QA engineer, please do not laugh out loud, and feel free to leave your comments below this post.
This article is not a walkthrough on how to build a load test solution with VS 2013. There is a great SPC 2014 session from Todd Klindt and Shane Young; it is very informative and a lot of fun.



So, in the test I created for my customer I have three web tests: two with a read profile, browsing different sites and testing search, and one with a write profile. The write activity is to create an item in an out-of-the-box Contacts list and attach a small file to the item. The first thing that failed was in the "write" web test.

File Upload is Failing - In the newer versions of Visual Studio this should not happen, as the feature was introduced back in VS 2010. However, it happened to me the first time I did this. The entire web test failed because it was unable to find the file that should be attached to the list item. The strange thing is that the next day I recreated the test and everything worked fine. I tested this with Visual Studio 2013 Update 1 and Update 4. The normal behavior is that when you record your web test, Visual Studio is intelligent enough to know what file you are uploading: it copies that file to the load test project root folder and sets the POST request of the test to take the file from your project folder. However, this might not happen, and then your web test will fail, as I have shown in the picture below.

Visual Studio 2013 Failed Web Test Upload

[Solution 1] - This is the quick and dirty solution, and since I am a pro I went right for it :). When we add a new item to a list we have a form to fill in, and by clicking Save we send a POST request with the parameters of the form; the path of the file that should be uploaded is a parameter as well. The POST request parameters are recorded by Visual Studio and you can edit them. You just have to find the correct request, expand it and look into the "Form Post Parameters", as shown below.

Form Post Parameters

Next you have to locate the "File Upload Parameter" that has a value for FileName; the value is just the file name of the file you want to upload. It is possible to have empty File Upload Parameters or many different parameters. When you find the correct File Upload Parameter, you just put the literal path of the file in the FileName property, as in the picture below.

Edit FileName Property

Now, this is not very good, because when you move your load test project to a different computer you have to move the file and possibly edit the File Upload Parameter again. For a better solution keep reading/scrolling.

Masking Unwanted/Failed Dependent Requests - Dependent requests are the requests you make when you hit a page and it loads additional resources. For example, your test requests http://sharepoint.contoso.net, but the landing page refers to multiple files like CSS, JavaScript, pictures etc. The requests to these files are dependent requests, and they are taken into account in your web test results whether they succeed or fail.
This became kind of a big deal for me, because I was testing a highly customized site and several dependent requests for JavaScript files (.js) were failing with status 404, which does not look good in your load test results. After a short googling I found this free eBook; on page 189 exactly this issue is discussed. The solution for masking failed dependent requests is to use a web test plugin. Below you can see how a web test result looks with a failed dependent request. I simulated this in my lab by referencing a JavaScript file located in the same site and then deleting it. You can see that the missing file currentU.js even makes one of my primary requests fail, although everything else about it is fine.

Failed Dependent Request

The book contains the source code of a web test plugin that can filter dependent requests starting with a certain string. This would be fine if your dependent requests came from a different domain, like some CDN or caching solution, but if I used this plugin as it is, it would filter all dependent requests from my SharePoint site. This is why I modified the code to filter dependent requests that end with a certain string. You can read how to develop and use your own plugin here. See below how it looks.


I have highlighted some of the important elements in the picture (I hope you can see it well), but I want to stress two things. After you build your web test plugin project it is just a class library (DLL); in order to have your new plugin available in your web tests you have to reference the DLL in your load test project. The second thing is the configuration of the ending string used for filtering. This is configured when you add the plugin to the web test: you add the string as the value of the FilterDependentRequestsThatEndsWith property, in my case ".js". If you like my plugin you can download the project (the built DLL is in the bin/Debug folder) here.

The third gotcha I will share took me five minutes to figure out. That is not much, but this issue popped up right after I launched the entire load test for the first time. It was important because I was testing production (like a boss) and I was going to push the limits of the farm, so the timing was negotiated with the customer as a maintenance window with possible service degradation and/or outage.

Configure the Load Test run location - This setting specifies whether you will use Visual Studio Online or the computer you are running the test from. As I wrote above, I tested this with VS 2013 Update 1 and Update 4, so I will assume that from Visual Studio 2013 Update 1 onwards the default configuration is Visual Studio Online. You can check/change this in the test settings of your solution; it is in the General section, as shown below.

Load Test Local Settings


If you click on the Deployment section you will see that there is an option to add additional files that should be deployed with the solution.

[Solution 2 - File Upload is Failing] - Add the file as a deployment item in the test settings and, if needed, edit the POST request form parameter to refer to the file by file name only. This solution is way better.

Finally, this post became a bit too lengthy for a blog post, but I hope it was informative!