Monday, 11 May 2015

Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell. The Scripts!

Last year (1 December) I published an article called Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell. In that article I demonstrated the capabilities introduced with the latest CSOM SDK. There were new APIs that made it possible to work with User Profile properties in SharePoint Online (for now). I published PowerShell snippets for writing user profile properties and for retrieving them. At that time I was unable to figure out how to write multi-value properties, and the snippets were just for the demo, with a hard-coded property, so they were not very reusable.
Since this article became very popular and people were happy that they could do this from PowerShell, I decided to write a generalized script for writing any single or multi-value user profile property.
The Name (not the DisplayName) of the property to be edited is provided as a script parameter. As I said, the script is generalized and not tested with every out of the box property in SharePoint Online. Some of the properties have specific requirements for the value. For example, the About Me property should be HTML (<br> instead of a new line), the Birthday should be in a parsable datetime format, and so on. If the format is not correct, in most cases you will receive a meaningful exception from CSOM.
The script I will provide requires the SharePoint Server 2013 Client Components SDK (x64) or the SharePoint Online Client Components SDK (x64) released in September 2014 or later.
The script is written like a native cmdlet: it accepts pipeline values for the AccountName, so you can combine it with a CSV or a simple text file that contains a list of account names, include it in your own scripts, and modify it as you find useful. Test the script against an account that is not a C-level manager :-) and, if it works, use it for bulk property updates on "live" accounts, for example, as sketched below.
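Here is a minimal sketch of such a bulk update driven by a CSV file. The CSV file name and its AccountName column are assumptions for illustration; the script name and parameters are the same ones used in the examples further down.

########################## Bulk update from a CSV (sketch) ###############
## Assumes a file Accounts.csv with a column named AccountName
Import-Csv .\Accounts.csv | Select-Object -ExpandProperty AccountName |
.\Set-SPOUserProfileProperty.ps1 -PropertyName 'SPS-Skills' `
-Value "SharePoint","PowerShell" -MultiValue `
-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password *******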

To demonstrate how the script for writing user profile properties works, I am going to change the profile of Anne Wallace (AnneW@spyankulov.onmicrosoft.com). She is the President of Contoso. I am going to change her Work Phone (single value), her Skills (multi value) and her Birthday (short date, single value). See how her profile page looks before the changes.


Below are the PowerShell lines I used to change these properties; I have broken them over a couple of lines so they fit better in the blog. For more examples see the Help section of the scripts.


########################## Change Work Phone ###############
.\Set-SPOUserProfileProperty.ps1 -PropertyName WorkPhone `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "+44 555 555 55" `
-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password *******

########################## Change The Skills ###############

.\Set-SPOUserProfileProperty.ps1 -PropertyName 'SPS-Skills' `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "SharePoint","PowerShell","Tech Support"-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password ******* -MultiValue
######################### Change The Birthday ###############                                         
.\Set-SPOUserProfileProperty.ps1 -PropertyName 'SPS-Birthday' `
-AccountName 'AnneW@spyankulov.onmicrosoft.com' -Value "5/9/1988" `
-SPOAdminPortalUrl 'https://spyankulov-admin.sharepoint.com' `
-UserName ****@spyankulov.onmicrosoft.com -Password *****


Let's see how Anne's profile looks after the change.



We can see that the properties have changed accordingly. How cool is that :)

In the package you can download below I have also included a script that can retrieve the user profile properties in PowerShell. Please download the scripts, test them, use them and rate!
I write a fair amount of Help into my scripts, please read it.

 Download the script from: TechNet Gallery
(You may experience an issue with Chrome, use Firefox/IE to open the link)

Wednesday, 29 April 2015

Query Usage and Health Data collection database for slow requests execution

Here is another article I am dedicating to SharePoint performance monitoring.
For me, one of the most useful out of the box tools for gathering information about farm performance is the Usage and Health data collection done by the Usage and Health Data Collection Service Application. This service application is easily provisioned and configured if you have installed your SharePoint farm with AutoSPInstaller.
I highly recommend planning, provisioning and using this service application in your SharePoint farm!
If you have not done this yet, you can see how to provision the application here. I recommend specifying at least the logging database name in the New-SPUsageApplication command; for further configuration see the following article, where you will find how to set various properties like the log location, what to monitor and the retention of the logs. Here is an official resource from TechNet.
Above I mentioned the logging database; this is the database used by the Usage and Health Data Collection Service Application to store all the logs. The logs are collected on each SharePoint server and stored in the log location you have set. These logs are then pushed into the logging database by a timer job called "Microsoft SharePoint Foundation Usage Data Import", which by default runs every 5 minutes.
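If you are troubleshooting and do not want to wait for the next run, here is a minimal sketch (run on-premises in the SharePoint Management Shell) that finds the import job by the display name mentioned above and starts it immediately:

## Find the usage data import timer job by display name and run it now
Get-SPTimerJob |
    Where-Object { $_.DisplayName -eq "Microsoft SharePoint Foundation Usage Data Import" } |
    ForEach-Object { Start-SPTimerJob -Identity $_ }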
The logging database is the only SharePoint database against which running SQL queries is supported, so we can use it to analyze our logs. See the official guidance: View data in the logging database in SharePoint 2013.
The easiest way to get some information out of the logging database is the Central Administration interface described here. There you can see statistics for the slowest page requests, and you can filter by server and web application, with a maximum of 100 rows and a predefined timeframe of a day, week or month, as you can see on the screenshot below.

View Health Report

We get essential information from this page, but in my case I needed to track occasional slow requests reported by the users. Here are some disadvantages of the results we get from the "View Health reports" page.

1. This is the major one, because it limits the amount of information we can receive. The results are aggregated averages for a certain request. We may have one slow request or we may have many; either way we receive Average, Minimum and Maximum values for Duration, for example, and we cannot see when a peak occurred because the requests are grouped.

2. The columns we can see are predefined. This again is because of the request grouping. If you do a simple SELECT TOP 100, for example, from one of the dbo.RequestUsage partitions (this is the table where the requests are logged), you will see that there are quite a lot of columns, and some of them can be really useful for the troubleshooting process.

3. The timeframes are predefined and the result is limited to 100 rows.

This is why I created a SQL query that outputs all requests for a timeframe configured by you; it can be less than a second or several days. The output shows all requests in that timeframe, they are not grouped, you can add additional conditions and columns to be displayed and, most importantly, you can see the time each request was logged. I find this extremely helpful for troubleshooting: you can narrow down the results to a very small timeframe, filter by user login, for example, and track the request in the ULS logs by Correlation Id. See how it looks on the screenshot below and compare it with the result from the Slowest Pages report.

[*UPDATE*]: I have published a PowerShell version you can find it and read more HERE

Slow Requests Query

The screenshot above is from an older version; in the current version that you can download below I have included filtering by UserLogin. As I mentioned, you can modify the script to show whatever columns you want (include them in the SELECT statement and the GROUP BY clause) and to include additional filters. I have put example values for some of the variables (many of them can be Null) and some useful comments as well.
A common use case is when you receive a notification that SharePoint was "slow" for a certain user at a certain time. You can use this script and it will show you all requests for this user; this information will help you in your investigation of a potential performance issue. You can then export the result to CSV and play with it in Excel, for example.
Keep in mind that this query can put some load on your SQL Server, the timing is in UTC and the duration is in seconds. You should run the script against your Usage and Health (logging) database. The script is tested on SharePoint 2013/SQL Server 2012.
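To give an idea of the shape of such a query, here is a minimal sketch run from PowerShell with Invoke-Sqlcmd (from the SQLPS module that ships with SQL Server). This is not the downloadable script: the database name is a placeholder and the column names (LogTime, UserLogin, ServerUrl, Duration, CorrelationId) are my assumptions about the dbo.RequestUsage schema, so adjust them to your own logging database.

## A sketch only - all requests for one user in a given UTC window, slowest first
$query = @"
SELECT LogTime, UserLogin, ServerUrl, Duration, CorrelationId
FROM dbo.RequestUsage WITH (NOLOCK)
WHERE LogTime BETWEEN '2015-04-29 08:00:00' AND '2015-04-29 08:15:00'
  AND UserLogin LIKE '%anne%'
ORDER BY Duration DESC
"@
Invoke-Sqlcmd -ServerInstance "SQL01" -Database "SP_UsageAndHealth" -Query $query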

 Download the script from: Technet Gallery
(You may experience an issue with Chrome, use Firefox/IE to open the link)

Thursday, 9 April 2015

Script to get All Webs in SharePoint Online site collection

Have you ever wanted a SharePoint Online cmdlet that gets all webs in a site collection? Something like this can be useful if you want to review all webs and spot the ones you do not want or need, or if you want to get all webs as their original type and do further operations with them. Well, as you know, there is no such out of the box cmdlet in the SharePoint Online Management Shell.
We can, however, use the DLLs from the SharePoint Server 2013 Client Components SDK and write our own PowerShell script to do that.
There are, however, many scripts from the community described as "get all webs in a site collection". One of the most popular scripts that pops up if you do some googling is the one by Corey Roth (MVP). The script is nice, but actually getting all webs in a SharePoint site collection with CSOM is a bit tricky, since we do not have (at least I do not know of) a shortcut for getting really all webs, something like SPSite.AllWebs when you run PowerShell on-premises.
What Corey's script does is get the top level web from the client context, get its sub webs and then return the Title and the Url of those subwebs, as I have shown in the screenshot below.


As you can see, we are not getting much. The site I am doing this demo on has two subwebs, below them there are more subwebs, below them more still, and we cannot see them.
So I created a script that outputs all webs in a SharePoint Online site collection, very similar to what SPSite.AllWebs does on-premises, and the output is Microsoft.SharePoint.Client.Web. See how it looks.

All SharePoint Online Webs

As you can see, the output is different. I have quite a lot of webs in my site; there is even an AppWeb.
See part of my code.


BEGIN{
function Get-SPOSubWebs{
        Param(
        [Microsoft.SharePoint.Client.ClientContext]$Context,
        [Microsoft.SharePoint.Client.Web]$RootWeb
        )


        $Webs = $RootWeb.Webs
        $Context.Load($Webs)
        $Context.ExecuteQuery()

        ForEach ($sWeb in $Webs)
        {
            Write-Output $sWeb
            Get-SPOSubWebs -RootWeb $sWeb -Context $Context
        }
    }
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll" | Out-Null
    Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll" | Out-Null
}
PROCESS{

    $securePassword = ConvertTo-SecureString $PassWord -AsPlainText -Force
    $spoCred = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $securePassword)
    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $ctx.Credentials = $spoCred

    $Web = $ctx.Web
    $ctx.Load($Web)
    $ctx.ExecuteQuery()

    Write-Output $Web

    Get-SPOSubWebs -RootWeb $Web -Context $ctx
}



All I have done is get the root web, return it as a result, get its subwebs, then get the subwebs of each subweb that is found (and so on), and return each web as a result. The key job is done by the Get-SPOSubWebs function defined in the BEGIN block (not very pretty, but it works). The function calls itself for each web it finds, so it is recursive.
At the end we have all webs in the site collection. In order to run the script you will need the SharePoint Server 2013 Client Components SDK (x64) installed and, optionally, if you want to see the data nicely in the PowerShell console, the loaded type display template (included with the script). For more information see my article Display CSOM objects in PowerShell.
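For illustration, here is how a call might look. The script file name below is hypothetical (use the name of the script you downloaded), while the parameter names match the $SiteUrl, $UserName and $PassWord variables used in the code above.

## Hypothetical script name - replace the placeholders with your own site and account
.\Get-SPOAllWebs.ps1 -SiteUrl 'https://<yourtenant>.sharepoint.com/sites/<site>' `
-UserName ****@<yourtenant>.onmicrosoft.com -PassWord ******* |
    Select-Object Title, ServerRelativeUrl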

 Download the script from: Technet Gallery
(You may experience an issue with Chrome, use Firefox/IE to open the link)

Sunday, 5 April 2015

Load Testing SharePoint 2013 with Visual Studio 2013. Gotchas for SharePoint Administrators

Last week I had to do a load test of one of our customers' SharePoint 2013 environments, so I decided to share some gotchas that might not be obvious for SharePoint administrators who are new to Visual Studio (VS) and have no experience with load testing in VS, but want to try or use it.
If you happen to be a full-time developer or maybe a QA, please do not laugh out loud, and feel free to leave your comments below this post.
This article is not a walkthrough on how to build a load test solution with VS2013. There is a great SPC 2014 session on that from Todd Klindt and Shane Young; it is very informative and a lot of fun.



So, in the test I created for my customer I have three Web Tests: two with a read profile (browsing different sites and testing search) and one with a write profile. The write activity creates an item in an out of the box Contacts list and attaches a small file to the item. The first thing that failed was in the "write" Web Test.

File Upload is Failing - In recent versions of Visual Studio this should not happen; the feature was introduced back in VS2010. However, it happened to me the first time I did this. The entire web test failed because it was unable to find the file that should be attached to the list item. The strange thing is that the next day I recreated the test and everything worked fine. I tested this with Visual Studio 2013 Update 1 and Update 4. The normal behavior is that when you record your web test, Visual Studio is intelligent enough to know which file you are uploading: it copies that file to the load test project root folder and sets the post request of the test to take the file from your project folder. However, this might not happen, and your web test will fail, as I have shown in the picture below.

Visual Studio 2013 Failed Web Test Upload

[Solution 1] - This is the quick and dirty solution, and since I am a pro I went right for it :). When we add a new item to a list we fill in a form, and by clicking Save we send a POST request with the form parameters; the path of the file that should be uploaded is a parameter as well. The POST request parameters are recorded by Visual Studio and you can edit them. You just have to find the correct request, expand it and look into the "Form Post Parameters", as shown below.

Form Post Parameters

Next you have to locate the "File Upload Parameter" that has some value for FileName; the value is just the file name of the file you want to upload. It is possible to have empty File Upload Parameters or many different parameters. When you find the correct File Upload Parameter, just put the literal path of the file as the FileName property, as in the picture below.

Edit FileName Property

Now, this is not very good, because when you move your Load Test project to a different computer you have to move the file and (eventually) edit the File Upload Parameter again. For a better solution keep reading/scrolling.

Masking Unwanted/Failed dependent requests - Dependent requests are the requests made when you hit a page that loads additional resources. For example, you test a request to http://sharepoint.contoso.net, but the landing page refers to multiple files like CSS, JavaScript, pictures and so on. The requests for these files are dependent requests, and they are taken into account in your web test results whether they succeed or fail.
This became kind of a big deal for me, because I was testing a highly customized site and several dependent requests for JavaScript files (.js) were failing with status 404, which does not look good in your load test results. After a short googling I found this free eBook; page 189 discusses exactly this issue. The solution for masking failed dependent requests is to use a web test plugin. Below you can see how a web test result with a failed dependent request looks. I simulated this in my lab by referencing a JavaScript file located in the same site and then deleting it. You can see that the missing file currentU.js is even making one of my primary requests fail, although everything else about it is fine.

Failed Dependent Request

In the book there is source code for a web test plugin that can filter dependent requests starting with a certain string. This is fine if your dependent requests come from a different domain, like a CDN or a caching solution. If I used that plugin as it is, it would filter all dependent requests from my SharePoint site. This is why I modified the code to filter dependent requests that end with a certain string. You can read how to develop and use your own plugin here. See below how it looks.


I have highlighted some of the important elements in the picture (I hope you can see it well), but I want to stress two things. After you build your web test plugin project it is just a class library (dll); in order to have your new plugin available in your web tests you have to reference the dll in your Load Test project. The second thing is the configuration of the ending string used for filtering. This is configured when you add the plugin to the web test: you add the string as the value of the FilterDependentRequestsThatEndsWith property, in my case ".js". If you like my plugin you can download the project (the built dll is in the bin/debug folder) here.

The third gotcha I will share took me five minutes to figure out. That is not much, but the issue popped up right after I launched the entire load test for the first time. This was important because I was testing Production (like a boss) and I was going to push the limits of the farm, so the timing was negotiated with the customer as a maintenance window with possible service degradation and/or outage.

Configure Load Test run location - This setting specifies whether you will use Visual Studio Online or the computer you are running the test from. As I wrote above, I tested this with VS2013 Update 1 and Update 4, so I assume that from Visual Studio 2013 Update 1 the default configuration is Visual Studio Online. You can check/change this in the test settings of your solution, in the General section, as shown below.

Load Test Local Settings


If you click on the Deployment section you will see that there is an option to add additional files that should be deployed with the solution.

[Solution 2 - File Upload is Failing] - Add the file as a deployment item in the test settings and, if needed, edit the post request form parameter so it refers to the file by file name only. This solution is way better.

Finally, this post became a bit too lengthy for a blog post, but I hope it was informative!


Wednesday, 18 February 2015

User Profile Incremental Synchronization Timer Job - Access is denied Error

I stumbled on this issue in one of our customers' SharePoint 2013 environments, and I came across many forum posts and articles but did not find the complete solution, so I decided to share it with you.
The background story is as follows. We have a SharePoint Server 2013 environment hosted on Windows Server 2012 that is using User Profile Synchronization (leveraging FIM). We have one User Profile Service Application.
The User Profile Synchronization service is running on the server where the Central Administration is hosted.
So far so good. We also have the User Profile Service running on two servers in order to have redundancy for this service; for short I will call these servers APP01 (CA, Sync Service, User Profile Service) and APP02 (User Profile Service). The User Profile Synchronization service is successfully started and running on APP01.
The issue comes when you want to implement the principle of least privilege. This requires removing the Farm account from the local Administrators group on the server where the User Profile Synchronization service is running (FIM). You need the Farm account to be a local admin in order to successfully start User Profile Synchronization.
So you remove the Farm account from the local Administrators group.
Time goes by and you start to receive an error with Event ID 6398 on APP02 with the following description:

The Execute method of job definition Microsoft.Office.Server.UserProfiles.UserProfileImportJob (ID 7acb171d-5d7d-4601-ba08-85f24c0969b4) threw an exception. More information is included below.

Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))


This description tells us that the "User Profile Incremental Synchronization Timer Job" is failing on APP02. To be sure, you can use Get-SPTimerJob with the GUID from the error message as the Identity parameter. This is the timer job responsible for the User Profile Synchronization. If you check the timer job history for this job, you will see that it is running on APP02 and it is failing, as the screenshot and the quick check below show.

Failed User Profile Incremental Synchronization Timer Job
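To resolve the GUID from the error message to the actual timer job definition, a quick check like this works (the GUID is the one from the event log entry above and the selected properties are just examples):

## Resolve the job GUID from the event log entry to the timer job definition
Get-SPTimerJob -Identity 7acb171d-5d7d-4601-ba08-85f24c0969b4 |
    Select-Object Name, DisplayName, LastRunTime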

It is possible to see this job fail every minute; one day it can run on APP02 and the next run can be on APP01. You may also see an ongoing synchronization in Central Administration while miisclient.exe shows no synchronization operations running, and this may prevent new synchronizations from starting.
But the User Profile Synchronization service is running on APP01, not on APP02!
Now, all over the forums folks are posting questions like "The synchronization timer job is running on the wrong server and it is failing". Actually, this is not true!
Remember that we are running a User Profile Service instance on both APP01 and APP02. The "User Profile Incremental Synchronization Timer Job" can run on both servers and it should complete successfully on both.
The timer job is executed by the Timer Service, and the Timer Service runs under the Farm account, just like the Sync Service. From the event log error it is obvious that the Farm account needs some permissions in order to execute the job successfully. When the account is in the local Administrators group it "has it all".

The first thing to check is the following MS article. There is a section called "Remove unnecessary permissions" that explains how to grant the Remote Enable permission over the Microsoft FIM 2010 WMI namespace MicrosoftIdentityIntegrationServer (on the server running FIM). This permission is needed so the Farm account can run the job successfully from a server that is not running the User Profile Synchronization Service (FIM). Remember to do an IISRESET if you are running the Sync service on a server that is also hosting Central Administration.
Of course I did the procedure above, but it did not work out!
Apparently the job needs to interact with the Microsoft FIM 2010 WMI namespace. I am not sure about this, but I think the timer job is invoking the Execute method of the MIIS_ManagementAgent class.
Out of curiosity, below you can see how the MicrosoftIdentityIntegrationServer (MIIS) WMI namespace looks. I have based my User Profile Synchronization reporting script on MIIS WMI.

MIIS WMI Namespace

Now, under the hood WMI uses DCOM, so for a non-administrator account to interact with WMI remotely it needs permissions over DCOM on the server where the Sync service is running.

The second thing to check/change is the permissions over the DCOM server on the machine that is running the User Profile Synchronization Service. The Farm account needs the Remote Access, Remote Launch and Remote Activation permissions; to grant them, use the article Securing a Remote WMI Connection. Now, this was a bit tricky and confusing for me, but there is a simple way to check whether the Farm account has remote access to WMI: by running the PowerShell command below with the Farm account credentials from a remote computer, you should be able to retrieve the synchronization operations history.

Get-WmiObject -Class MIIS_RunHistory -Namespace root/MicrosoftIdentityIntegrationServer -ComputerName <YourSyncServer> -Credential <Domain>\<FarmAccount>

If there is an issue with the permissions you will receive an Access Denied error like this.

WMI Access Denied

The DCOM permissions can be tricky: after making a change there, close and reopen the PowerShell console and test the remote WMI with the command above again. If you still get an Access Denied error, locate the Windows Management and Instrumentation DCOM application and give the needed permissions to the Farm account.

As you can see, there are lots of moving parts: timer jobs, WMI, DCOM and so on, and there are countless reasons for a failure. Use the PowerShell line above: if there is an issue with WMI remoting it will show it. If there is a firewall issue it will show you something like an "RPC service unavailable" error. If you do not receive a result from the remote WMI test, you have an issue.

This is how I solved the issue. I am hoping that it is helpful for you!

Monday, 16 February 2015

SharePoint Weekly Backup cycle script.

In this post I am going to share a script for SharePoint backup I wrote for one of our customers.
Since we had no budget for a 3rd party backup solution I decided to use the native SharePoint backup capabilities in a script.
For me, the SharePoint farm backup is a good solution if you do not have advanced 3rd party tools like DocAve, for example, or if your farm is not very big. Among DPM, SQL backups and file system backups, the SharePoint farm backup is the best solution in case of a disaster where you have to restore the entire farm. You can see the following article for details on the different methods.
The drawback of the SharePoint farm backup is that it is not very granular and it can be a bit slow in large environments. For example, in web applications the lowest level object you can restore is a content database. However, you can restore a content database and then use Recover data from an unattached content database to restore a site, web or list. Still, you will not be able to directly restore an item, for example.
Be aware that if you try to restore a content database from a farm backup to a live farm, you may receive an error similar to the one below.

Restore failed for Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory.

And when you look at the restore log you will find something like this:

FatalError: Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory. SPException: Cannot attach database to Web application. Use the command line tool or Central Administration pages to attach the database manually to the proper Web Application.

In this case I am restoring the old content database 2K8R2_SP_Content_Portal to a database with the name 2K8R2_SP_Content_Portal_Bla. SharePoint tries to attach the restored content database to the web application, and it fails because we already have a content database with that ID. The restored database itself is completely okay, however, and you can see it in SQL Server Management Studio.
The customer required a backup once a day, with the backups kept for one month. We were also constrained on disk space.
There are many SharePoint backup scripts out there, but I wanted a different folder for every week and one full backup with the rest differential, in order to save some space. Additionally, you can enable SQL Server backup compression to save even more disk space.
The script I wrote does this, and it can also back up only the configuration.
The script works in 7-day cycles; when a new cycle starts it creates a new folder whose name reflects the start, the end and the type of the backup. For example, if you start a full backup cycle on 14.02.2015 the script will create a folder named 14022015_20022015_F (dates are in EU format, ddMMyyyy). Since it is a new cycle, the script starts with a full farm backup, and all the backups until 20.02.2015 will be differential. Here is how one weekly cycle looks.

SharePoint backup cycle


The script also does a clean-up. You can specify how many cycles you want to maintain; it will keep them and delete the older backups. Do not combine this script with full database backups, because that will break the full-differential sequence.
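For illustration, here is a minimal sketch of the idea behind the cycle logic, not the downloadable script itself: it looks for an active weekly folder (named as described above) and decides between a full and a differential Backup-SPFarm run. The backup root path is an assumption, and clean-up, logging and the config-only mode are left out.

## A simplified sketch of the weekly cycle logic (run in the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$backupRoot = "E:\SPBackup"   # assumption: your backup folder or share
$today = (Get-Date).Date

# An active cycle folder is named <start>_<end>_F (ddMMyyyy) and its end date is not in the past
$current = Get-ChildItem $backupRoot -Directory |
    Where-Object { $_.Name -match '^\d{8}_(\d{8})_F$' -and
                   [datetime]::ParseExact($Matches[1], 'ddMMyyyy', $null) -ge $today } |
    Select-Object -First 1

if (-not $current)
{
    # No active cycle: create a new weekly folder and take a full farm backup
    $name   = "{0:ddMMyyyy}_{1:ddMMyyyy}_F" -f $today, $today.AddDays(6)
    $folder = New-Item -Path (Join-Path $backupRoot $name) -ItemType Directory
    Backup-SPFarm -Directory $folder.FullName -BackupMethod Full
}
else
{
    # Active cycle found: take a differential backup into the same folder
    Backup-SPFarm -Directory $current.FullName -BackupMethod Differential
}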


 Download the script from: Technet Gallery
(You may experience an issue with Chrome, use Firefox/IE to open the link)

Tuesday, 3 February 2015

Start User Profile Synchronization with PowerShell [Tip]

This is something new I learned and decided to share it in this short tip.
So I was trying to help in a TechNet Forum question. The user was asking how to start Incremental User Profile Synchronization with PowerShell because she needed it to run two times a day. Now when you manually star Full or Incremental sync. a timer job is started named "User Profile Service Application - User Profile Incremental Synchronization". If you start this job manually from the UI or in PowerShell it will trigger Incremental synchronization. If you want to set the schedule for the Incremental sync. you go to "Configure Synchronization Timer Job" in the User Profile Service Application Management page. However you cannot set the job to run twice a day. Here comes the PowerShell.
Thanks to Trevor Seward I learned this alternative method to start User Profile Synchronization from PowerShell. It lets you choose which type the sync should be (Incremental or Full) by specifying $true or $false as the argument of the StartImport method. Here are the snippets for starting a Full or an Incremental synchronization.

$UPS= Get-SPServiceApplication | Where { $_.DisplayName -eq "<UPSA_DisplayName>"}
$UPS.StartImport($true)

## To Start Incremental Sync. use $false as argument for StartImport method

$UPS= Get-SPServiceApplication | Where { $_.DisplayName -eq "<UPSA_DisplayName>"}
$UPS.StartImport($false)


At the end of the day this just starts the timer job I mentioned above, but it may come in handy in some PowerShell scripts, for example to run the sync twice a day as the forum question asked; see the sketch below.
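Here is a minimal sketch of one way to do that with a scheduled job (PowerShell 3.0+, PSScheduledJob module). The job name, the times and the credential handling are assumptions; run it on a SharePoint server under an account that has the required permissions on the User Profile Service Application.

## A sketch, not a tested solution: trigger an incremental sync at 06:00 and 18:00 every day
$triggers = (New-JobTrigger -Daily -At "6:00 AM"), (New-JobTrigger -Daily -At "6:00 PM")
Register-ScheduledJob -Name "UPSIncrementalSync" -Trigger $triggers -Credential (Get-Credential) -ScriptBlock {
    Add-PSSnapin Microsoft.SharePoint.PowerShell
    $UPS = Get-SPServiceApplication | Where { $_.DisplayName -eq "<UPSA_DisplayName>" }
    $UPS.StartImport($false)   # $false = Incremental, $true = Full
}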

Sunday, 25 January 2015

Test and Monitor your SQL storage performance and some good practices for SharePoint.

In the last couple of weeks I have been busy analyzing and optimizing one of our customers' SharePoint 2013 deployments. This is a huge topic; SharePoint itself is not a slow application, but there can be countless reasons that make your SharePoint work and feel slow. In my case the source of the performance issues was SQL.
You should pay as much attention to SQL as to the other components when you plan and deploy your SharePoint solution. The SQL Server is referred to as the storage tier for SharePoint. Except for the binaries and the static application files, SharePoint stores almost everything (94%, to be accurate) in SQL; after all, a SharePoint farm is just a configuration database, and all your user content, sites and webs are stored in content databases.
Now, I am not a SQL Server DBA nor a storage expert, but for me the storage that hosts your databases is one of the most important components for the performance of your SQL Server. Even if your RAM is enough for your workload, you have enough CPU power and you have followed all the good practices for SQL that hosts SharePoint, if your storage needs too much time to access and deliver certain data (latency), or the throughput (IOPS) it can deliver is not high enough, SQL will be slow, and therefore your SharePoint will work and feel slow.
In this post I am going to give you links to, and short descriptions of, articles and tools (free/open source) you can use to test and monitor your SQL storage at the planning phase or, as it was in my case, after the system is deployed.
I mentioned the good practices for SQL that hosts SharePoint. Well, if you do not know what these practices are, you can learn about them from the following resources:

    1. Tuning SQL Server 2012 for SharePoint 2013 - These are MVA videos hosted by Bill Baer (Senior Technical Product Manager) and Brian Alderman. In these demo-rich videos they cover some of the best practices for integrating SQL Server 2012 and SharePoint 2013.

    2. Maximizing SQL 2012 Performance for SharePoint 2013 Whitepaper - This is a great whitepaper on the good practices that can maximize your SQL Server performance for SharePoint. The author is Vlad Catrinescu (SharePoint MVP); it covers most of the points from the videos above, plus many more.

I had to implement a lot of these practices before going on to investigate any storage bottlenecks.

Now I would like to stress one thing that has become more important with the development of cloud services: distribute your database files across as many disks as possible. Every cloud provider I have been working with limits the VHDs by IOPS. For example, in Azure A- and D-series VMs you have 500 IOPS per disk. However, with the different VM sizes you can attach a different number of disks; with an A4, for example, you can attach up to 16 data disks, which makes 16 x 500 IOPS. So why not attach as many disks as possible, if it does not hurt your budget (most of the providers charge storage per GB)?

One of the first things to check if you are experiencing SharePoint slowness and you suspect slow SQL is the Developer Dashboard. There you can find the SQL execution time and many more useful stats about your request execution.

Developer Dashboard

You can find information on how to enable and use the dashboard in SharePoint 2013 in the following article.
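As a quick reference, here is the commonly used PowerShell snippet for turning the SharePoint 2013 Developer Dashboard on; treat it as a sketch and see the article above for the prerequisites (the dashboard relies on the usage and health infrastructure).

## Run in the SharePoint Management Shell on a farm server
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$dds = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings
$dds.DisplayLevel = "On"   # set to "Off" to disable it again
$dds.Update()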

At the planning phase of your solution, before you have installed anything, you can use various tools to test the performance of your storage; here are some of them:

     1. SQLIO - In this article from Brent Ozar you will find a download link for the tool and instructions on how to use it. The tool is really straightforward and can show you basic performance information for your storage: IOPS, MB/s and latency. In the article you will also find links to other tools like SQLIOSim that can simulate SQL Server OLTP and OLAP workloads.

     2. DiskSpd - Now this is a more robust tool for testing your storage. With this tool you can test local disks and SMB shares as well. There is a great article from Jose Barreto (a member of the File Server team at Microsoft) where you will find everything about DiskSpd, and the tool also comes with very comprehensive documentation. However, this is a powerful tool and you should not run it against storage with live data, because it can destroy it.

Tools that you can use for monitoring after the system is deployed:

     1. PAL - This is a great universal tool for performance log analysis, written by Clint Huffman (PFE at Microsoft). It is a tool not just for SQL but for a number of products. If you do not know how to interpret the performance logs for a certain Microsoft product, or you need an aggregated performance data report, PAL is just the tool for you. PAL works with templates for the different products. If you do not know which PerfMon counters to capture, you can export the PAL template to a PerfMon template and then create a regular Data Collector Set in the Performance Monitor MMC. You can see an example of a PAL HTML report below.

PAL Report Sample


     2. SSMS Activity Monitor - This is a great tool that you can launch from SQL Server Management Studio: just open SSMS, right-click on the instance and open Activity Monitor. There you will find current stats for Overview, Active User Tasks, Resource Waits, Data File I/O and Recent Expensive Queries. If you do not know how the Activity Monitor looks, you can see it below:

Activity Monitor

     3. Triage Wait Stats in SQL Server - Here is another article from Brent Ozar where you can find an explanation of what resource waits are, a script that can capture wait stats for a period of time, and references to other articles on the subject. For me the waits are a very important thing to look at when it comes to SQL performance in general.

     4. Capturing IO latencies for a period of time - This article is last but not least. It is written by Paul Randal from www.sqlskills.com, and it contains a script that for me is the ultimate tool for identifying storage bottlenecks. Although it is not mentioned in the article, you can set the time period by changing the WAITFOR DELAY value. In the output you can see the average latency (read/write) per file, the average bytes per read/write operation, and more. You can see an example output below.



And here are the final notes. SharePoint is an application where we cannot edit the DB schema, for example; it is an enterprise product, not an in-house build. If you have followed the above-mentioned best practices, correctly scaled your deployment and you are sure there is no customisation that could cause issues, but you suddenly take a performance hit, it is most probably caused by other systems that are more likely to change, like storage, network, Windows and so on. And of course there is no magic number or silver bullet that fixes all issues.
If you are just a SharePoint administrator and deep diving into SQL and storage is not your responsibility, notify and push the SQL and storage guys to do their work and fix the issues. If you are not confident with the above-mentioned tools, test them in a non-production environment first. As far as I understand T-SQL, the above scripts do not touch any user database.

I am hoping this was useful for you.

Sunday, 18 January 2015

Unable to Update/Uninstall/Repair SharePoint binaries. Find out if files are missing from 'C:\Windows\Installer'

I guess that some of the things I am going to tell you in this post are well known to those of you who have been involved in Windows platform maintenance in recent years.
However, this can happen, it can happen in your SharePoint environment, it can happen in Production, and you should be prepared for it.
So, last week I was helping a colleague of mine who had an issue on one of our customer's production SharePoint 2013 servers. The farm has several language packs installed, but one of the servers was reporting that these language packs were not installed. There was no chance this server was actually missing binaries. We checked the "Product Version Job" timer job. This job runs by default every night on each SharePoint server and checks the installed products and versions. If there is a server where some binaries are not installed, or they are a lower version, you will see red alerts like the one below. However, it is possible that this job is unable to get the product data, because it runs under the Farm account identity and it needs local administrator privileges to retrieve the product data. In that case you will see event log warnings like the one described in this WiKi; you can put the account in the local admin group, restart the Timer Service and run the job, but this is not recommended. Instead, you can refresh the data under a different identity: log in to the server with a proper account and run the command Get-SPProduct -Local, and you will be able to update the data for the products installed on the server. This can help if PSConfig is blocked because of "missing products".

Missing Language Pack
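For reference, the product data refresh mentioned above is just this one-liner, run on the affected server itself:

## Run in an elevated SharePoint Management Shell, with an account that is a local administrator on this server
Get-SPProduct -Local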

Unfortunately, this did not help us in this case. We decided to run PSConfig on the server in question and, oh surprise, it was blocked because of missing products on the server.
Logically, we came to the idea of uninstalling/repairing and reinstalling the language packs.
However, we were unable to do so, either from the Control Panel or by running the installation packages.
This was really strange. Then we figured out that there might be some files missing from 'C:\Windows\Installer'.
This folder stores the Windows Installer cache. The folder is hidden by default and it contains some very important files for every piece of software that was deployed on the system using Windows Installer technology, which covers all products and updates coming from Microsoft. The files are unique for every machine and they are not interchangeable. If you are not aware of the importance of these files, it is possible to think that they are just temporary files that can be deleted.
You would be seriously wrong! If files are missing from this folder, it will block Update/Uninstall/Repair operations for the product, and your server is in an "out of support" state. The guidance from Microsoft for such cases is very general: you need a machine rebuild.
Apparently something had happened on our server. The support and maintenance of the Windows platform for this customer is delegated to a different company, so it is not possible that someone from our side deleted/removed/renamed some of the files in that folder.
This situation is described in KB2667628, where you will find guidance on a very nice Microsoft Support Diagnostics tool called Windows Installer Cache Diagnostic. Run the tool on the server and it will generate a CAB file that contains nice reports on the installer cache. If there are no missing files you will see this:

No Windows Installer Cache Issue

If you have missing files you will see something like this for every missing file. Notice the bolded table row Action Required; it says "This .msp appears to be missing. Machine needs to be rebuilt."

Missing files from Windows Installer Cache

With these reports you have a strong weapon against the platform support teams :).
However, you will need to rebuild this server; I am not sure whether restoring the files from a backup is supported. If you have not experienced such issues, I think it is a good idea to run the diagnostic tool against all your servers anyway. It is fairly fast and simple.

Tuesday, 13 January 2015

Change the URL of existing SharePoint List/Library[Tip]

Let's say that a site admin has created a SharePoint library (or a list) with the name CrappyOldUrl. Then the user decides that this is not an appropriate name for his library. The user can easily change the name of the library from the Library Settings and changes it to Nice New Library. However, the URL of the library is still /CrappyOldUrl/, and he/she calls you for help to change the URL to something like http://<yoursite>/Nice New Library/.

Crappy Library Url

There are many articles that will tell you to use SharePoint Designer, to save the library as a template and recreate it, or something else. But you are a lazy SharePoint administrator, so here is a way to do this in PowerShell.

$web = Get-SPWeb http://portal-2k8r2.ilabs.l/ ## the Url of the Web
$list = $web.Lists["Nice New Library"] ##Get by current Library Name
$list.RootFolder.MoveTo("/Nice New Library") ## Change the Url(relative to the Web)


And here is the result. Be careful, because the name of the library may be changed as well if it is not the same as the new URL.

New Library Url

In this case we are renaming the root folder of the library; the original URL of the library in this example is http(s)://<WebUrl>/<LibraryRootFolder>.
However, if you are renaming a list, it is likely that the URL of your list is http(s)://<WebUrl>/Lists/<ListRootFolder>. In this case, if you do not want to put the list in the root folder and want to keep the .../Lists/... part, the code will be as follows:


$web = Get-SPWeb http://portal-2k8r2.ilabs.l/ ## the Url of the Web
$list = $web.Lists["Nice New List"] ##Get by current List Name
$list.RootFolder.MoveTo("/List/Nice New List") ## Change the Url


You can see the current RootFolder with something like this:


$web = Get-SPWeb http://portal-2k8r2.ilabs.l/ ## the Url of the Web
$list = $web.Lists["Nice New List"] ##Get by current List Name
$list.RootFolder

See the value of the ServerRelativeUrl property.

[*UPDATE*]: Check out a Video Tutorial of this solution HERE

Monday, 1 December 2014

Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell

A couple of weeks ago Vesa Juvonen wrote a post about the new capability in CSOM that lets us write user profile properties. This feature is already enabled in all Office 365 tenants.
The features are available in the Microsoft.SharePoint.Client.UserProfiles.dll assembly, version 16. This is the moment to mention that if you download the latest CSOM package you will receive client DLLs in version 15 and version 16. In my earlier post about using CSOM in PowerShell I mentioned that the client DLLs are located in "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"; those are the version 15 DLLs that can be used against SharePoint 2013 on-premises. If you want to use the latest and greatest features against SharePoint Online, you should load the version 16 DLLs located in "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI".
The new features come from two new methods of the PeopleManager class: SetSingleValueProfileProperty() and SetMultiValuedProfileProperty().
In his post Vesa gives example code that demonstrates the capability in a SharePoint app. I found this interesting and, since I am not a developer, I will show you how it works in PowerShell. Unfortunately, I was unable to get SetMultiValuedProfileProperty() working from PowerShell; I think this is because of the issue with generic lists in PowerShell. However, I will show you a function that edits the About Me user profile property and a function that gets all user properties. I will authenticate against the admin portal of SharePoint Online, and I will be able to play with the properties of other users.
Below you can see both functions (for editing About Me and for getting profile properties) and everything I do before that.

[*UPDATE*]: I have published "universal" scripts for editing any User Profile property read more HERE

$cDLLS = Get-ChildItem -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI" | Where {$_.Name -notlike "*.Portable.dll"}
ForEach ($dll in $cDLLS)
{
    Add-Type -Path $dll.FullName
}


$username = "******@yankulovdemo.onmicrosoft.com" 
$password = "*******" 
$url = "https://yankulovdemo-admin.sharepoint.com"
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force

$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($url)
$Context.Credentials = $credentials
$Global:oContext = $Context

Function Update-AboutText{
Param(
    [string]$AboutText,
    [string]$UserLogin
)
Process{
    $logIn = ("i:0#.f|membership|" + $UserLogin)
    $ctx = $Global:oContext
    $aboutTextHtml = $AboutText.Replace(([System.Environment]::NewLine), "<br />")
    $peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
    $Properties = $peopleManager.GetPropertiesFor($logIn)
    $ctx.Load($Properties)
    $ctx.ExecuteQuery()
    $peopleManager.SetSingleValueProfileProperty($logIn, "AboutMe", $aboutTextHtml) ## use the HTML version built above
    $ctx.ExecuteQuery()
}
}

Function Get-UserProperties{
Param(
    [string]$UserLogin
)
Process{
    $logIn = ("i:0#.f|membership|" + $UserLogin)
    $ctx = $Global:oContext
    $peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
    $user = $peopleManager.GetPropertiesFor($logIn)
    $ctx.Load($user)
    $ctx.ExecuteQuery()
    Write-Output $user.UserProfileProperties

}
}
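For reference, here is how the two functions above can be called once the context is created; the account name is just a placeholder.

## Example calls - replace the login with a real account from your tenant
Update-AboutText -UserLogin "someuser@yankulovdemo.onmicrosoft.com" -AboutText "President of Contoso"
Get-UserProperties -UserLogin "someuser@yankulovdemo.onmicrosoft.com"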


The first thing I do is load the client DLLs. Second, I get the client context, as I said against the admin portal, and store it in a global variable for further reuse.
Once we are ready we can start editing the properties, and here is the outcome:

Write and Get User Profile Property with CSOM

It seems that this worked out. The updated About Me property is shown in Gencho's profile as well.

He is the master of SharePoint