
Monday, 23 April 2018

Closing, Opening and Unlocking SharePoint site collections

The way to implement some sort of site collection life cycle in SharePoint Server and in classic SharePoint Online sites is the Site Policy.
With a site policy you can set when to close the site and how long to wait after closure before deleting it.
Closing and opening a site can be done very easily using server-side or client-side code if the site already has a policy assigned.
Below is an example of server-side PowerShell code for closing and opening site collections. Note that the code should be executed within an elevated-privilege context.


Add-PSSnapin Microsoft.SharePoint.PowerShell
 
## Close Site Collection
[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
 {
  $spSite = Get-SPSite http://portal.azdev.l/sites/TestSite/
            
  [Microsoft.Office.RecordsManagement.InformationPolicy.ProjectPolicy]::`
   CloseProject($spSite.OpenWeb())
 }
)
 
## Open Site Collection
[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
 {
  $spSite = Get-SPSite http://portal.azdev.l/sites/TestSite/
            
  [Microsoft.Office.RecordsManagement.InformationPolicy.ProjectPolicy]::`
   OpenProject($spSite.OpenWeb())
 }
)
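
For the client side, CSOM exposes a ProjectPolicy class with equivalent CloseProject/OpenProject methods. Below is a minimal client-side sketch; I am assuming the class lives in Microsoft.SharePoint.Client.Publishing.dll (verify this on your setup), and the DLL paths, URL, account and $securePassword are placeholders.

## Hedged CSOM sketch for closing a classic SharePoint Online site;
## DLL paths, URL and credentials are placeholders
Add-Type -Path "C:\CSOM\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\CSOM\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\CSOM\Microsoft.SharePoint.Client.Publishing.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant.sharepoint.com/sites/TestSite")
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("admin@tenant.onmicrosoft.com", $securePassword)

## Close the site (use OpenProject to reopen it)
[Microsoft.SharePoint.Client.InformationPolicy.ProjectPolicy]::CloseProject($ctx, $ctx.Web)
$ctx.ExecuteQuery()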


Reopening the site will also update the "Project Expiration date". This is the date when the site will be deleted according to the applied site policy.
You might need to reopen a site if, for example, you have to do some administrative task over the site collection, like disabling a feature, removing an event receiver or something similar.
However, changing the deletion date might not be acceptable.
When a site is closed, a special read-only lock is applied. If you check the site collection in Central Administration, you will see the screen below.

Archived Site
If you have to make a change in a couple of closed sites, you can remove the lock from the Central Administration UI. Doing this for hundreds or thousands of closed sites, however, will not be very practical.
The issue is that the command below will not unlock the site if it was closed by a site policy or by using the ProjectPolicy class.


Set-SPSite -Identity http://portal.contoso.net/sites/TestSite -LockState Unlock


The key thing to notice in the picture above is the term "Archived". The SPSite object has an Archived Boolean property: if it is true, the site is "archived" and read-only; if it is false and there is no other lock type applied, the site will be read-write. You can just change the value of that property with PowerShell, and setting the value to false will not alter the project expiration date.


$spSite = Get-SPSite http://portal.contoso.net/sites/TestSite 
## Unlock the site
$spSite.Archived = $false
 
## DO YOUR THING
 
## Lock back the site
$spSite.Archived = $true


There is no client-side analog for this property that I am aware of. I hope it was helpful!

Thursday, 6 April 2017

Nintex Workflow UDA Usage report script

Yesterday I worked with a client that has many published Nintex workflows with heavy usage of UDAs (User Defined Actions). I wanted to get a detailed report on the UDA usage. Unfortunately, I am not aware of any out-of-the-box Nintex tool that can do that. The Analyze button can give you some information, but you need to click on each workflow to find out where it is located, you need to be in the scope where the UDA is published, you can get information for only one UDA at a time, and the information is not really "exportable".
This is why I created a PowerShell script that will give you information on the UDA usage across the farm on all levels. It returns useful information like UDA Name, Workflow Name, Defined At, List, Web, Site, WebApplication, WorkflowType, Author, UDA Version Used and Workflow Id.
There are two "modes" of the script. The default will give you just the GUIDs of the lists, webs, sites and web applications. If you want the name of the list and the URLs, you need to use the second mode, which requires more time to complete but gives you nice-looking URLs instead of GUIDs; to get the URLs, just use the switch parameter GetUrls. The result can be saved in CSV format or output in PowerShell. If you give a value for CSVPath, the Grid View will open at the end to visualize the data. The main source of information is the Nintex Configuration database, and you can use the script with SQL authentication if you have an account with enough permissions and your SQL Server supports it.
I have tested the script with SharePoint 2016, 2013 and 2010, and the oldest Nintex Workflow version I tested was 2.3.7.0.
You can see the code and the output examples below. I hope you find it useful!

Output with URLs retrieved:
Nintex Workflow UDA Usage report with URLs

Quick output with GUIDs:
Nintex Workflow UDA Usage report with GUIDs

Monday, 14 November 2016

Trust failed error when browsing the Central Administration

In this quick post I am going to share an issue that I recently hit in one SharePoint Server deployment. While browsing the Central Administration, I got the error below when clicking on the Manage service applications page.


The key thing with this error is to know the background story, something I was missing.
The story is that this farm was migrated from one domain to another.
Everything was working fine and the new farm was in production when we started to get this error.
There was one small detail that we were not aware of: there was a domain trust between the new and the old domain during the migration. This is why everything was working fine until the network link between the old and the new domain was cut.
With this small detail the error below started to make sense. You will see this error in different .NET apps if the app is trying to do something with an identity from a trusted domain but no domain controller from the trusted domain can be reached.

The trust relationship between the primary domain and the trusted domain failed.

By looking at the "Delegated Administrators" I concluded that there are accounts from the old domain that have permissions over some of the service applications. I was even unable to get the service applications using Get-SPServiceApplication in PowerShell. It seems that there is some identity checking when we access the service application management page, and it fails because the trusted domain cannot be reached. The same exception can be reproduced if you try to translate a username from the trusted domain to a SID. The lines below are a good test to check for an issue with a trusted domain from PowerShell.

## Translating an account name to a SID requires contacting a domain controller,
## so it will throw if the account's (trusted) domain cannot be reached
$userName = "DOMAIN\User"
$objUser = New-Object System.Security.Principal.NTAccount($userName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$strSID.Value

Here are some of the things that might not work if you are in this situation:

- You will not be able to access the service application management page
- You will not be able to enumerate the service applications in PowerShell
- In my case Search and the UPA were the service applications with administrators from the trusted domain, and you will not be able to restart their service instances
- The UPA might stop working completely
- If you clear the configuration cache, the Timer Service will fall into a loop of rebuild attempts and crashes, and no timer jobs will be executed.

As you can see, this situation might not be a good place to be :).
The solution is to restore the connection to the trusted domain, and I am talking about physical availability of a domain controller from the trusted domain, or to simply remove the trust from the current domain. Sometimes the second option might be the only possible one; maybe this relationship was simply not removed by the domain admins when the connection was cut.
If you remove the domain trust, the error will be fixed and the translate method in PowerShell will fail with "Some or all identity references could not be translated.", which it seems is handled better. Then you will be able to do some proper cleanup, if you wish.
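
For example, a minimal cleanup sketch might look like the lines below; the service application display name and the OLDDOMAIN account are placeholders, so adapt before use.

## Hedged sketch: remove a delegated administrator from the old domain
## ("User Profile Service Application" and OLDDOMAIN\olduser are placeholders)
$sa = Get-SPServiceApplication | Where-Object { $_.DisplayName -eq "User Profile Service Application" }
$security = Get-SPServiceApplicationSecurity $sa -Admin
$principal = New-SPClaimsPrincipal -Identity "OLDDOMAIN\olduser" -IdentityType WindowsSamAccountName
Revoke-SPObjectSecurity -Identity $security -Principal $principal
Set-SPServiceApplicationSecurity $sa -Admin -ObjectSecurity $security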
I hope that this post was helpful! 

Friday, 14 October 2016

Build slider bar graph date time search refiner with custom intervals

A couple of weeks ago I worked with a client that had a requirement for their search center in SharePoint Online. They had a repository with different research documents, and these documents had a Publishing Date date/time field with values up to 30 years back.
The client wanted to build a result page for these documents and have a slider bar refiner with custom intervals going up to 10 years ago.
If we have a numeric-based managed property, we can specify a custom refiner interval like the one below.
Unfortunately, the Custom option is missing for the date and time datatype. We have predefined intervals that cover only up to one year ago, and "Defined in search schema", which I am not sure what it is supposed to mean, but below is the error you will get if you select this option.

For this Display Template you must specify custom intervals for the values that will be shown. Please change the refinement settings to use custom intervals.

It really does not tell us much when there is no option to specify a custom interval in the UI.
Luckily, if you export the Refinement web part you can see more refiner settings. All selected refiners are represented as JSON, and below are the settings of our Publication Date refiner (formatted).



There are two settings that grab the attention, and they are highlighted in the picture above. They are "useDefaultDateIntervals", which obviously controls whether the default intervals that cover only one year should be used, and "intervals", which should represent the custom intervals. After some research on the web I found that the intervals value should be an array of integers representing the intervals in days. I came up with these intervals for my client: Ten Years Ago, Five Years Ago, Three Years Ago, One Year Ago, Six Months Ago, Three Months Ago, One Month Ago, 7 Days Ago and Today. This is achieved with the following intervals value:

[-3650,-1825,-1095,-365,-180,-90,-30,-7,0]

The first step is to update the values for "useDefaultDateIntervals" and "intervals". Set "useDefaultDateIntervals" to false and for "intervals" use your interval array, like in the picture below.


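In text form, the relevant part of the refiner JSON then looks roughly like the sketch below. Only "useDefaultDateIntervals" and "intervals" are the settings discussed here; the property name is a placeholder and the remaining properties of the exported JSON are omitted.

{
    "propertyName": "PublicationDate",
    "useDefaultDateIntervals": false,
    "intervals": [-3650,-1825,-1095,-365,-180,-90,-30,-7,0]
}
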
Then you will need to import the web part back and use it in your page. The result is below.


We have our custom intervals and they are working as expected (at least for me). However, we can see one big issue: the intervals are not labeled appropriately. This should be fixed in the refiner display template.
As it is not a good practice, nor practical in this case, to edit the out-of-the-box display template, I created a new display template based on the out-of-the-box "Slider with bar graph".
In the new template I have specified values for the Label and the NextIntervalLabel of all "filter boundaries". In this example we are going to have 10 boundaries. NextIntervalLabel is used when you move the mouse over the bar, and Label is used for the boundary label in the slider. You can see the entire template below.

On line 104 of the template you can see how to get all boundaries and their values for Label and NextIntervalLabel.
After deploying and setting the new display template, we can see that the labels are much more accurate.


There is one small detail left to update: the start and end labels of the bar graph.
Unfortunately, my solution is to change the text by selecting the elements by class name, which is not the most elegant approach if you have more than one slider bar refiner; in that case you will need to change the index number to get the correct elements. You can see the code below.

With this final touch, here is how our custom slider bar graph refiner looks.



It looks really cool and useful. If you check the refiner settings in the UI now, you will see that "Defined in search schema" is selected. I find this misleading, since I have done nothing special in the search schema.
I hope that this was helpful!

Monday, 20 June 2016

Get All Items in 5000+ large list with CSOM in PowerShell

Last week I had to write a script that needed to get all items in a large SharePoint Online list.
By large I mean above 5000 items. This means that the list is above the list view threshold in SharePoint Online, which is 5000, and we cannot change that. The way to get all items in SharePoint Online is to use a CAML query. However, if it is just an empty query without any filtering it will fail; if you use an unindexed column for filtering or ordering, the query will fail; and if you filter/order by an indexed column and the query returns more than 5000 items, it will fail again. The error in these and other scenarios is similar to the one below.

Exception calling "ExecuteQuery" with "0" argument(s): "The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

The way to work around this is pagination of the view. This means setting a row limit on the result that the query can return, which should be less than or equal to 5000. Once we get the first 5000 items, we can issue another query for the next 5000, starting from the position where the first result (page) ends. This is the same as scrolling forward through the list view in the UI. Below is an example PowerShell snippet that will get all items from a list using a 5000-item page size, ordering the items by ID.

## Assumes an authenticated CSOM client context in $ctx, e.g.:
## $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$list = $ctx.Web.Lists.GetByTitle($DocLibName)
$ctx.Load($list)
$ctx.ExecuteQuery()
## View XML
$qCommand = @"
<View Scope="RecursiveAll">
    <Query>
        <OrderBy><FieldRef Name='ID' Ascending='TRUE'/></OrderBy>
    </Query>
    <RowLimit Paged="TRUE">5000</RowLimit>
</View>
"@
## Page Position
$position = $null
 
## All Items
$allItems = @()
Do{
    $camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
    $camlQuery.ListItemCollectionPosition = $position
    $camlQuery.ViewXml = $qCommand
 ## Executing the query
    $currentCollection = $list.GetItems($camlQuery)
    $ctx.Load($currentCollection)
    $ctx.ExecuteQuery()
 
 ## Getting the position of the previous page
    $position = $currentCollection.ListItemCollectionPosition
 
 # Adding current collection to the allItems collection
    $allItems += $currentCollection
}
# the position of the last page will be Null
Until($position -eq $null) 

A few words about the query: I am using RecursiveAll because I ran it against a library and wanted to get all items in all folders; the size of the page is 5000, just at the edge of the threshold; and I am ordering the result by ID because this column is always indexed.
I am using a Do-Until loop to get all pages, setting the position to the position of the last item collection that was retrieved.
This is a really powerful and quick way to work around the annoying 5000 list view threshold. I hope you find it useful!

Monday, 15 February 2016

Copy List Views in SharePoint and SharePoint Online with PowerShell

In the last couple of weeks I have been working with a customer that mainly uses SharePoint Server as a DMS (Document Management System). I had to move a large number of documents from one library to another due to corruption in some of the files, caused by excessive use of unique permissions (~32,000).
In this post I will not talk about why you should limit the usage of unique permissions, especially in big libraries; it's a long story.
As part of the work I had to copy many list views to the new library. I am not a fan of the "Save as Template" approach, so my solution was to use a PowerShell script and copy the views programmatically.
The script did its job, and I thought it would be nice to have something like this for SharePoint Online.
I was unable to find any ready script that can do this, so I wrote one.
Both scripts do basically the same thing: get the source and destination webs, get the source and destination lists, get the source and destination view collections, and create new views in the destination using properties from the source.
It was a bit tricky to load what I need without making the script extremely slow with CSOM, since we cannot use the simple lambda expression syntax in PowerShell. My solution was to use a function written by the SharePoint automation superstar Gary Lapointe. You can check it out in his article for ItUnity, where he explains how to work around the limitation in PowerShell concerning lambda expressions. I highly recommend reading the entire series dedicated to using PowerShell against SharePoint Online.
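
To give you an idea, the core of the on-premises version boils down to something like the sketch below; the URLs and list titles are placeholders, and it skips the edge cases the full script handles.

## Hedged server-side sketch of copying views between two libraries;
## URLs and list titles are placeholders
$srcList = (Get-SPWeb http://portal/sites/source).Lists["Documents"]
$dstList = (Get-SPWeb http://portal/sites/destination).Lists["Documents"]

foreach ($view in ($srcList.Views | Where-Object { -not $_.Hidden })) {
    ## SPViewCollection.Add expects the view fields as a StringCollection
    $fields = New-Object System.Collections.Specialized.StringCollection
    $view.ViewFields | ForEach-Object { [void]$fields.Add($_) }
    [void]$dstList.Views.Add($view.Title, $fields, $view.Query,
        $view.RowLimit, $view.Paged, $view.DefaultView)
}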
You can download both scripts (on premises and online) below. Please test, rate and use the Q&A section in the Gallery!


 Download On Premises script from: TechNet Gallery

Download SharePoint Online script from: TechNet Gallery

Monday, 21 September 2015

Get the slowest request in SharePoint - The PowerShell script

In April this year I published a post called "Query Usage and Health Data collection database for slow requests execution". In that post I explained the limitations of the "Slowest Pages" report page in SharePoint and provided a SQL query that can overcome these limitations and give you valuable information about the requests logged in the Usage and Health database.
After I created this script I used it on multiple occasions, and it turned out to be extremely useful! However, there is one big disadvantage (in my opinion) to this script: it is a T-SQL script.
This is why I started to think about how to transform it into a PowerShell script, so I would not need to know which the logging database is and on which server it is, go to SSMS, copy to CSV and so on.
Now I might be wrong, but there is no SharePoint API that can get me such a result in PowerShell. I also did some googling on the subject and was unable to find anything remotely close to what I was looking for; maybe there is some chance to get something like this in SharePoint 2010, where we have the Web Analytics Service application.
If I am correct, this can be an idea for improvement in the SharePoint web analytics field, something that I think was announced to be better in SharePoint Server 2016. The Usage and Health database captures a lot of useful information, and it is a shame that SharePoint admins cannot take advantage of it without being SQL masters or BI gurus. My solution was just to embed the SQL query in the script.
In the script I put some sweet stuff like: support for SQL authentication; output in PowerShell as System.Data.DataRow or to a CSV file/Grid View; the ability to pass the start and end times as DateTime objects; and many more.
If you run the script from a SharePoint server, it will automatically determine the SQL server and the Usage and Health database, connect to the SQL Server by impersonating the identity of the user that is running the script, and let you filter by web application by passing a URL, GUID or WebApplication object.
Of course, you can successfully run the script when you are not on a SharePoint or SQL server; you will just need to enter more parameters, use a GUID for the web application and have connectivity to the SQL server itself.
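
Stripped of all the parameter handling, the core approach of the script is essentially the snippet below; the server, database and the SELECT statement are placeholders (the real script embeds the full query from the original post).

## Hedged sketch of the core approach: query the logging database with
## integrated security and get DataRow objects back; names are placeholders
$connString = "Server=SQLSRV01;Database=SP_UsageAndHealth;Integrated Security=SSPI"
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter("SELECT TOP 10 * FROM dbo.RequestUsage", $connString)
$table = New-Object System.Data.DataTable
[void]$adapter.Fill($table)
$table | Out-GridView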
I have successfully tested the script with SharePoint 2010/SQL 2012, SharePoint 2013/SQL 2012 and SharePoint 2016 TP/SQL 2014. You can check the output and download the script below; if you have issues, read the Help section, ask a question in the Q&A section in the TechNet Gallery, and if you like it, rate it.
Finally, I would like to take a minute for shameless self-promotion and say that last week I passed the 3000 downloads mark in the Gallery and received a Gold medal for my Gallery contributions. Thank you!

SharePoint Slow Request Grid View Output


 Download the script from: TechNet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Thursday, 20 August 2015

Unable to set custom Data Refresh credentials for PowerPivot workbook in SharePoint

Last week one of our customers had an interesting issue: they were not able to set alternative Windows and data source credentials for the PowerPivot Scheduled Data Refresh in SharePoint.
The deployment is a typical BI farm based on SharePoint 2013 Enterprise SP1 and SQL Server 2012 Enterprise SP1.
Since I was unable to find information about this, and I also think that the authentication and refresh of external data sources is always a bit tricky in the SharePoint BI world, I decided to share my solution.
In PowerPivot for SharePoint you can configure Scheduled Data Refresh, including the option to set alternative credentials. This empowers BI experts and BI site administrators to set connection credentials without having access to the Secure Store Service (SSS) or even knowing what SSS is. See how this looks below.


PowerPivot Scheduled Refresh Settings
There is a good article that explains how to configure Data Refresh in PowerPivot; you can find it here.
The issue with the customer was that when they tried to set alternative credentials, they received a generic SharePoint error page and the changes were not applied.
If you read the article I referred to above, you will see that every set of alternative credentials is actually saved as a target application in SSS.
If you search the ULS logs for the correlation ID you get, you will find something similar to the Unexpected error below:

System.ServiceModel.FaultException`1[[System.ServiceModel.ExceptionDetail, System.ServiceModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]: Access is denied to the Secure Store Service. Server stack trace: at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood .....
.......


This error message actually indicates that the PowerPivot application pool account does not have permissions in SSS to create a target application for the new credentials.
My solution was to add the account as an administrator of the SSS with the Create Target Application and Manage Target Application permissions. After doing this it worked like a charm!
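
If you prefer to script the permission change instead of using the Central Administration UI, a hedged sketch is below; the account name is a placeholder and the rights strings should match the ones shown in your SSS administrators dialog.

## Hedged sketch: grant the PowerPivot app pool account SSS admin rights
## (account name is a placeholder; verify the rights strings in your farm)
$sss = Get-SPServiceApplication | Where-Object { $_.TypeName -like "*Secure Store*" }
$security = Get-SPServiceApplicationSecurity $sss -Admin
$principal = New-SPClaimsPrincipal -Identity "CONTOSO\svcPowerPivot" -IdentityType WindowsSamAccountName
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Create Target Application","Manage Target Application"
Set-SPServiceApplicationSecurity $sss -Admin -ObjectSecurity $security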


Tuesday, 11 August 2015

Discover what permissions user has in SharePoint Farm with PowerShell

Permissions and access control are an essential part of information management and governance in a successful SharePoint implementation. As with everything in SharePoint, user permissions and access require significant planning in order to prevent future headaches.
However, very often, for one reason or another, there is no significant permissions planning; or if there is, it was not implemented correctly, or the business users, site administrators and content authors haven't received the proper training required to maintain a well-organised permission structure. With the introduction of the "SHARE" button all over SharePoint 2013 (most probably this will be valid for SharePoint 2016 as well), it became even easier for users to break permission inheritance and grant another user or group "Edit" permission to a site, list or item when only Read access was needed.
Time goes by, you onboard a new customer that has no clear concept of permission management in SharePoint, everything is great, the sun is shining, and then you receive a query from the customer asking you for information on what permissions a user has in their SharePoint. As mentioned, the customer has no concept of permission management, and they have a nice mix of SharePoint groups, AD groups, permission levels, many objects with unique permissions shared with individual users, and so on. You try to figure out what permissions the user has, you dig deeper into the content, and eventually you end up in the situation below.

SharePoint Admin Mind Blown

This is why I did something I should have done a long time ago: I wrote a PowerShell script that can get all permissions granted to a Windows user. This includes Farm level, Web Application User Policies, Site level (Owners/Admins), Web, List, Folder and Item. The script creates a CSV file with all permissions, and at the end it loads the file in a Grid View for ad-hoc review. See how it looks below.

Permission Report Gridview

The main goal of the script is to cover as many scenarios as possible. As you can see, the script covers different levels, not only permissions over securable objects. It works and was tested on SharePoint 2010/PowerShell 2.0 and SharePoint 2013/PowerShell 3.0. It works with Windows Classic authentication and Windows Claims. You can select the lowest object level to scan for permissions, from Web Application down to Item. The script shows whether the permissions are granted directly or inherited from a domain group, and if they are inherited, it will show you the name of the group.
To achieve this I had to spend a lot of my free time. In the end this became one of the most complex scripts I have ever written and am able to share. I hit a lot of rocks and did a lot of testing until I decided it was good enough to be shared. There are many similar scripts, but so far I haven't found any that covers this many scenarios. This is also a call for everyone that downloads and uses the script to give me feedback if there are any issues in different setups; I highly doubt that the script can break anything.
With this post I also want to share some of the interesting PowerShell techniques I have used in the script.
Above I mentioned that the script does not get permissions from securable objects only. Securable objects are the objects that can be secured with permissions and permission levels: SPWeb, SPList and SPItem. Getting the permission information for securable objects is a big part of my script; you can read how to get permission information for a securable object in a post by Gary Lapointe. In Gary's post you can find a script very similar to mine; unfortunately, I found it when my script was almost ready.
There is one issue with the permission info for securable objects. In a scenario where the user is a member of an AD group and this group is a member of a SharePoint group, there is no way to find from which AD group the user is inheriting the SharePoint group membership. Also consider a scenario where an AD group is granted permissions in a web application with a user policy; you will need the login/claim of the group to see if there is a policy. This is why my script gets the user's AD group membership: if there is a permission coming from a SharePoint group, the script will get the members of the SharePoint group, check whether the user, or one of the user's AD groups, is a member, and give you the AD group name.
Getting AD user/group information is easy when you are on a domain controller, and it is actually not very hard to do from any domain-joined computer. I have done it by using the type accelerator [adsisearcher]; you can read about it and how it works in this article.
Here is another issue I dealt with: the primary domain group is not listed in the memberof attribute. Imagine that the primary domain group is not "Domain Users"; you will need to know it in case some permissions are granted to it in SharePoint. There is a KB with an example of how to do this with a nasty old VB script :). Doing it in PowerShell is a bit easier: every AD user has an attribute "primarygroupid" that matches exactly the "primarygrouptoken" attribute of the primary group object, and this ID is also the last digits of the group's SID. A sketch is shown below.
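
Here is a minimal sketch of that trick with [adsisearcher]; the sAMAccountName is a placeholder and it assumes you run it from a domain-joined machine.

## Hedged sketch: resolve a user's primary domain group via primarygroupid
## (jdoe is a placeholder account)
$result = ([adsisearcher]"(&(objectCategory=user)(samAccountName=jdoe))").FindOne()
$primaryGroupRid = $result.Properties["primarygroupid"][0]
## The primary group SID = the user's domain SID + the primary group RID
$userSid = New-Object System.Security.Principal.SecurityIdentifier($result.Properties["objectsid"][0], 0)
$groupSid = New-Object System.Security.Principal.SecurityIdentifier("$($userSid.AccountDomainSid)-$primaryGroupRid")
$groupSid.Translate([System.Security.Principal.NTAccount]).Value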
There are some more techniques and workarounds I used in the script, but I risk making this post very lengthy and boring, so download the script to see them.
You can download the script from the link below. Please test, use and rate it in the TechNet Gallery. For examples and prerequisites see the script's Help section!



 Download the script from: TechNet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Friday, 31 July 2015

Change the URL of Existing SharePoint List or Library - Video tutorial by Webucator

In January this year I published an article called Change the URL of existing SharePoint List/Library [Tip]. Some time later I was contacted by Webucator; they liked the solution from my article and asked for permission to create a video tutorial based on it. If you don't know them, Webucator provides onsite and online technical and business training.
I was notified today that the video is published, and it is really well explained; you can check it out below. Webucator provides a wide variety of technical training, including SharePoint. If you are looking for SharePoint training, be sure to check what SharePoint classes are available at Webucator; you can also find many free tutorials on their site.

Monday, 16 February 2015

SharePoint Weekly Backup cycle script.

In this post I am going to share a script for SharePoint backup that I wrote for one of our customers.
Since we had no budget for a 3rd party backup solution, I decided to use the native SharePoint backup capabilities in a script.
For me the SharePoint farm backup is a good solution if you do not have any advanced 3rd party tools like DocAve, for example, or if your farm is not very big. Among DPM, SQL backups and file system backups, the SharePoint farm backup is the best solution in case of disaster, when you have to restore the entire farm. You can see the following article for details on the different methods.
The drawback of the SharePoint farm backup is that it is not very granular and it can be a bit slow in large environments. For example, in web applications the lowest-level object you can restore is a content database. However, you can restore a content database and then use "Recover data from an unattached content database" to restore a site, web or list. Still, you will not be able to directly restore a single item, for example.
Be aware that if you try to restore a content database from a farm backup to a live farm, you may receive an error similar to the one below.

Restore failed for Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory.

And when you look at the restore log you will find something like this:

FatalError: Object 2K8R2_SP_Content_Portal_Bla (previous name: 2K8R2_SP_Content_Portal) failed in event OnPostRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory. SPException: Cannot attach database to Web application. Use the command line tool or Central Administration pages to attach the database manually to the proper Web Application.

In this case I am restoring the old content database 2K8R2_SP_Content_Portal to a database with the name 2K8R2_SP_Content_Portal_Bla. However, SharePoint will try to attach the restored content database to the web application, and it will fail because we already have a content database with that ID. The restored database itself is completely okay, and you can see it in SQL Server Management Studio.
The customer required a backup once a day and the backups to be kept for one month. We were also disk space constrained.
There are many SharePoint backup scripts, but I wanted to have a different folder for every week, with one full backup and the rest differential, in order to save some space. Additionally, you can enable SQL Server backup compression to save more disk space.
The script I wrote does this, and it can also back up only the configuration.
The script runs in 7-day cycles. When a new cycle starts, it creates a new folder whose name reflects the start, the end and the type of the backup. For example, if you start a full backup cycle on 14.02.2015, the script will create a folder named 14022015_20022015_F (dates are in EU format, ddMMyyyy). Since it is a new cycle, the script starts with a full farm backup; all backups until 20.02.2015 will then be differential. Here is how one weekly cycle looks.

SharePoint backup cycle


The script will also do a cleanup. You can specify how many cycles you want to keep; older backups will be deleted. Do not combine this script with full database backups, because that will break the full-differential sequence. A simplified sketch of the cycle logic is shown below.
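
Stripped of logging and the configuration-only mode, the weekly cycle boils down to roughly this; the backup root path is a placeholder.

## Hedged sketch of the weekly cycle logic; $backupRoot is a placeholder
$backupRoot = "\\BACKUPSRV\SPBackups"
$today = (Get-Date).Date

## Find a cycle folder (ddMMyyyy_ddMMyyyy_F) whose end date has not passed yet
$current = Get-ChildItem $backupRoot | Where-Object {
    $_.PSIsContainer -and $_.Name -match '^\d{8}_(\d{8})_F$' -and
    $today -le [datetime]::ParseExact($Matches[1], 'ddMMyyyy', $null)
} | Select-Object -First 1

if ($null -eq $current) {
    ## New 7-day cycle: create the folder and start with a full farm backup
    $name = '{0:ddMMyyyy}_{1:ddMMyyyy}_F' -f $today, $today.AddDays(6)
    $current = New-Item -ItemType Directory -Path (Join-Path $backupRoot $name)
    Backup-SPFarm -Directory $current.FullName -BackupMethod Full
}
else {
    ## Inside an existing cycle: take a differential backup into the same folder
    Backup-SPFarm -Directory $current.FullName -BackupMethod Differential
}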


 Download the script from: Technet Gallery
(You may experience issues with Chrome; use Firefox/IE to open the link)

Sunday, 25 January 2015

Test and Monitor your SQL storage performance and some good practices for SharePoint.

In the last couple of weeks I've been busy analyzing and optimising one of our customers' SharePoint 2013 deployments. This is a huge topic to discuss. SharePoint itself is not a slow application; however, there can be countless reasons that make your SharePoint work and feel slow. In my case the source of the performance issues was the SQL Server.
You should pay as much attention to your SQL Server when you plan and deploy your SharePoint solution as to the other components. The SQL Server is referred to as the storage tier for SharePoint. Except for the binaries and the static application files, SharePoint stores almost everything (94%, to be accurate) in SQL; after all, a SharePoint farm is just a configuration database plus content databases that store all your user content, sites and webs.
Now, I am not a SQL Server DBA nor a storage expert, but for me the storage that hosts your databases is one of the most important components for the performance of your SQL Server. Your RAM may be enough for your workload, you may have enough CPU power, and you may have followed all the good practices for SQL that hosts SharePoint, but if your storage needs too much time to access and deliver certain data (latency) or cannot deliver enough throughput (IOPS), the SQL Server will work slowly, and therefore your SharePoint will work and feel slow.
In this post I am going to give you links to, and short descriptions of, articles and tools (free/open source) that you can use to test and monitor your SQL storage at the planning phase or, as in my case, after the system is deployed.
I mentioned the good practices for SQL that hosts SharePoint. If you do not know what these practices are, you can learn about them from the following resources:

    1. Tuning SQL Server 2012 for SharePoint 2013 - These are MVA videos hosted by Bill Baer (Senior Technical Product Manager) and Brian Alderman. In these demo-rich videos they cover some of the best practices for integrating SQL Server 2012 and SharePoint 2013.

    2. Maximizing SQL 2012 Performance for SharePoint 2013 Whitepaper - This is a great whitepaper on the good practices that can maximize your SQL Server performance for SharePoint. The author is Vlad Catrinescu (SharePoint MVP); it covers most of the points from the videos above, plus many more.

I had to implement a lot of these practices before going on to investigate any storage bottlenecks.

Now I would like to highlight one thing that has become more important with the development of cloud services: distribute your database files across as many disks as possible. Every cloud provider I've been working with limits the VHDs by IOPS. For example, in Azure A- and D-series VMs you have 500 IOPS per disk. However, the different VM sizes can attach different numbers of disks; with an A4, for example, you can attach up to 16 data disks, which makes 16x500 IOPS. So why not attach as many disks as possible, if it will not affect your budget (most of the providers charge storage per GB)?

One of the first things to check if you are experiencing SharePoint slowness and you suspect slow SQL is the Developer Dashboard. There you can find the SQL execution time and many more useful stats about your request execution.

Developer Dashboard

You can find information on how to enable and use the dashboard for SharePoint 2013 in the following article; a minimal way to turn it on is sketched below.
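
## Minimal sketch: turn on the Developer Dashboard (SharePoint 2013,
## where "On" makes it available on demand via the ribbon icon)
$svc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$dds = $svc.DeveloperDashboardSettings
$dds.DisplayLevel = [Microsoft.SharePoint.Administration.SPDeveloperDashboardLevel]::On
$dds.Update()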

At the planning phase of your solution, before you have installed anything, you can use various tools to test the performance of your storage. Here are some of them:

     1. SQLIO - In this article from Brent Ozar you will find a download link for the tool and instructions on how to use it. The tool is really straightforward and can show you basic performance information for your storage: IOs/s, MB/s and latency. In the article you will also find links to other tools, like SQLIOSim, that can simulate SQL Server OLTP and OLAP workloads.

     2. DiskSpd - This is a more robust tool for testing your storage; with it you can test local disks and SMB shares as well. There is a great article from Jose Barreto (a member of the File Server team at Microsoft) where you will find everything about DiskSpd, and the tool also comes with very comprehensive documentation. However, it is a powerful tool and you should not run it against storage with live data, because it can destroy it. A typical invocation is sketched below.
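
For illustration only, a run with 8 KB blocks, 60 seconds, 30% writes and random IO against a 2 GB test file might look like this; double-check every flag against the documentation and never point it at a volume with live data.

## Illustrative DiskSpd run; flags, drive and file name are examples only
.\diskspd.exe -b8K -d60 -o8 -t4 -r -w30 -L -c2G T:\diskspd-test.dat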

Tools that you can use to monitor after system deployment:

     1. PAL - This is a great universal tool for Performance Log Analysis. It is written by Clint Huffman (a PFE at Microsoft). It is a tool not just for SQL, but for a number of products. If you do not know how to interpret the performance logs for a certain Microsoft product, or you need an aggregated performance data report, PAL is just the tool for you. PAL works with templates for the different products; if you do not know which PerfMon counters to capture, you can export the PAL template to a PerfMon template and then create a regular Data Collector Set in the Performance Monitor MMC. You can see an example of a PAL HTML report below.

PAL Report Sample


     2. SSMS Activity Monitor - This is a great tool that you can launch from SQL Server Management Studio: just open SSMS, right-click on the instance and open Activity Monitor. There you will find current stats for Overview, Active User Tasks, Resource Waits, Data File I/O and Recent Expensive Queries. If you do not know how the Activity Monitor looks, you can see it below:

Activity Monitor

     3. Triage Wait Stats in SQL Server - Here is another article from Brent Ozar where you can find an explanation of what resource waits are, a script that can capture stats for a period of time, and references to other articles on the subject. For me the waits are a very important thing to look at when it comes to SQL performance in general.

     4. Capturing IO latencies for a period of time - This article is last but not least. It is written by Paul Randal from www.sqlskills.com. There you can find a script that for me is the ultimate tool to identify storage bottlenecks. Although it is not mentioned in the article, you can set the time period by changing the WAITFOR DELAY. In the output you can see the average latency (read/write) per file, the average bytes per read/write operation, and more. You can see example output below.



And here are the final notes. SharePoint is an application where we cannot, for example, edit the DB schema. It is an enterprise product, not some in-house build. If you have followed the best practices mentioned above, you have correctly scaled your deployment, and you are sure that there is no customisation that can cause issues, but you suddenly take a performance hit, it is most probably caused by the other systems that are more likely to change, like storage, network, Windows, etc. And of course, there is no magic number or silver bullet that fixes all issues.
If you are just a SharePoint administrator and deep diving into SQL and storage is not your responsibility, notify and push the SQL and storage guys to do their work and fix the issues. If you are not confident with the above-mentioned tools, test them in a non-production environment first. As far as I understand T-SQL, the above scripts do not touch any user database.

I hope this was useful for you.

Sunday, 18 January 2015

Unable to Update/Uninstall/Repair SharePoint binaries. Find out if files are missing from 'C:\Windows\Installer'

I guess that some of the things I am going to tell you in this post are well known to those of you who have been involved in Windows platform maintenance in recent years.
However, this can happen, it can happen in your SharePoint environment, it can happen in Production, and you should be prepared for it.
So, last week I was helping a colleague of mine with an issue on one of our customers' production SharePoint 2013 servers. The farm has several language packs installed, but one of the servers was showing that these language packs are not installed. There was no chance that this server was missing binaries. We checked the "Product Version Job" timer job. This job runs by default every night on each SharePoint server and checks the installed products and versions. If there is a server where some binaries are not installed or are a lower version, you will see red alerts like the one below. However, it is possible that this job is unable to get the product data, because it runs under the Farm account identity and it needs local administrator privileges to retrieve the product data. In this case you will see event log warnings like the one described in this WiKi. You can put the account in the local admin group, restart the Timer Service and run the job, but this is not recommended. Instead, you can run the check under a different identity: log in to the server with a proper account and run Get-SPProduct -Local, and you will be able to update the data for the products installed on the server. This can help if PSConfig is blocked because of "missing products".

Missing Language Pack

Unfortunately, this did not help us in this case. We decided to run PSConfig on the server in question and, oh surprise, it was blocked by missing products on the server.
Logically, we came to the idea to uninstall/repair and reinstall the language packs.
However, we were unable to do so from the Control Panel or by running the installation packages.
This was really strange. Then we figured out that there might be some files missing from 'C:\Windows\Installer'.
This folder stores the Windows Installer cache. The folder is hidden by default, and it contains some very important files for every piece of software that was deployed on the system using Windows Installer technology, which covers all products and updates coming from Microsoft. The files are unique for every machine and they are not interchangeable. If you are not aware of the importance of these files, it is possible to think that they are some temporary files that can be deleted.
You would be seriously wrong! If there are files missing from this folder, it will block the update/uninstall/repair of the product, and your server is in an "out of support" state. The guidance from Microsoft for such cases is very general: you need a machine rebuild.
Apparently something had happened on our server. The support and maintenance of the Windows platform for this customer is delegated to a different company, so it was not possible for someone from us to have deleted/removed/renamed some of the files in that folder.
This situation is described in KB2667628, where you will find guidance for a very nice Microsoft Support Diagnostics tool called Windows Installer Cache Diagnostic. Run the tool on the server and it will generate a CAB file that contains nice reports on the installer cache. If there are no missing files you will see this:

No Windows Installer Cache Issue

If you have missing files you will see something like this for every missing file. Notice the bolded table row Action Required; it says "This .msp appears to be missing. Machine needs to be rebuilt."

Missing files from Windows Installer Cache

With these reports you have a strong weapon against the platform support teams :).
However, you will need to rebuild the server. I am not sure whether restoring the files from backup is supported. If you have not experienced such issues, I think it is a good idea to run the diagnostic tool to check all your servers; it is fairly fast and simple. As a quick first check, you can also use a sketch like the one below.
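
This hedged sketch only covers the .msp packages registered for the machine, which is far less than what the diagnostic tool verifies, but it can give you a fast first signal.

## Hedged sketch: report .msp files registered for the machine (S-1-5-18)
## that are missing from the Windows Installer cache
$patchKey = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18\Patches"
Get-ChildItem $patchKey | ForEach-Object {
    $local = (Get-ItemProperty $_.PSPath -ErrorAction SilentlyContinue).LocalPackage
    if ($local -and -not (Test-Path $local)) {
        Write-Warning "Missing from installer cache: $local"
    }
}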

Friday, 18 July 2014

Get Hidden Service Instances in SharePoint [Tip]

I am not sure if you know, but there are service instances in SharePoint that you cannot see in the UI, no matter whether you filter the "Services on Server" view by Configurable or All. You cannot see them even with the PowerShell command Get-SPServiceInstance. You can get "all" service instances from the ServiceInstances property of the SPServer object. Here is the PowerShell line to get them:

(Get-SPFarm).Servers['<ServerName>'].ServiceInstances

Whether a service instance is visible or not depends on a property of the service instance object that can be True or False: the 'Hidden' property. Here is the result from my SharePoint Server 2013 dev farm when I look for hidden service instances.
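
The lookup is just a filter on that property; the server name below is a placeholder.

## Only the hidden service instances on a given server (name is a placeholder)
(Get-SPFarm).Servers['SP2013-APP'].ServiceInstances |
    Where-Object { $_.Hidden } |
    Select-Object TypeName, Status, Id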


In the result you can see some services that are exceptions to the general rule and are visible in the UI, like SharePoint Server Search. There are also services like the SSP Job Control Service; this is a very important service that is in charge of the configuration synchronization on the different servers in the farm, and it is related to the Application Server Administration Service timer job, but that is a long story for our "short tip" post.

Tuesday, 10 June 2014

ULS viewer is gone...

If you have lately tried to download our favorite ULS log viewer, yes, the one from the picture below, you have noticed that it is not available anymore!

As Todd Klindt told us in his netcast, with the retirement of the MSDN Archive Gallery we lost the ULS log viewer tool. Every SharePoint professional knows how precious this tool is for troubleshooting SharePoint.
There is a site called bringbackulsviewer.com where you can fill in a survey with only two questions:
Do you want Microsoft to re-publish the ULS viewer, and would you like Microsoft to publish its source code?

Meanwhile, if you do not have the ULS viewer, you can download it from the Todd Klindt mirror Here (just remove the _ when you download it), or you can download it from my Link.