Sunday 31 August 2014

Display CSOM objects in PowerShell

Microsoft's strategy for SharePoint 2013, and for its hosted SharePoint-as-a-service offering SharePoint Online, is to bring the power of full trust code to the client side. Since we are unable to deploy full trust code to SharePoint Online via the good old solution packages, Microsoft came up with the App model in SharePoint 2013 in order to make SharePoint customisation less dependent on full trust code executed on the server. The underlying technologies that allow SharePoint Apps and clients to interact with the SharePoint platform "remotely" are the client-side object model (CSOM), the JavaScript object model (JSOM) and REST. With these technologies you don't need to be on the SharePoint server to read properties or execute methods on SharePoint objects like Webs, Sites, Lists, Users and so on. You can work with these objects from your browser, from a compiled application, or, in my case, from PowerShell on my Windows 8.1 workstation.
As a SharePoint Administrator I have had a few interactions with SharePoint Online, and I was disappointed by the limited functionality and control offered by the few commands in the native SharePoint Online Management Shell. For me, the answer to automating some repetitive tasks, or to getting a clear view of the objects in order to create reports for example, was to use one of the client-side technologies.
As an Administrator the JavaScript object model is not my primary interest, and while I can invoke REST requests from PowerShell, the technology most similar to the server-side code I am used to is the client-side object model, or CSOM.
In this post I am not going to write an in-depth guide on how to write PowerShell scripts with CSOM. Instead I am going to show you a significant difference between CSOM and server-side code in PowerShell that can demotivate you from digging further into CSOM if you are new to it. I am also going to show you how to correct this to a certain extent and make CSOM less "abstract" when you work with it in PowerShell.
For the demonstration I am going to use an on-premises installation of SharePoint 2013 and the sample function below, which lists all webs (the subwebs) under the root web.

Function Get-SPClientSiteSubWebs{
    [CmdletBinding()]
    Param(
        [string]$SiteUrl
    )
    PROCESS{
        # Create a client context for the target site collection
        $clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)

        # Request the root web and execute the query against the server
        $rootWeb = $clientContext.Web
        $clientContext.Load($rootWeb)
        $clientContext.ExecuteQuery()

        # Request the subwebs of the root web
        $webs = $rootWeb.Webs
        $clientContext.Load($webs)
        $clientContext.ExecuteQuery()

        Write-Output $webs
    }
}


I will be running the function from a client VM that is joined to the domain of my test SharePoint 2013 environment. In order to use CSOM from a computer that is not a SharePoint server you will need the client assemblies. You can find the DLLs on every SharePoint 2013 server in the folder "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"; they all start with Microsoft.SharePoint.Client*.dll. Alternatively, you can download and install the SharePoint 2013 Client Components SDK; this way you will have the DLLs in the \15\ISAPI folder, and I think most of them will also be registered in the GAC.
After you get the client DLLs you will need to load them into the PowerShell session where you will run your scripts. I have already done this, so let's see what the result of my sample function will be. I will need to pass a site collection URL as a parameter. Since we are running the function against an on-premises SharePoint farm, with the site collection hosted in a Web Application that uses Windows Integrated authentication, we do not need additional credentials or a change of the authentication mode for the client context. This, however, will not be the case if we write functions for SharePoint Online, or if we use authentication other than Windows on-premises. So it seems that we are ready to test.
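Loading the assemblies is a one-liner per DLL. A minimal sketch, assuming the DLLs are in the default \15\ISAPI location (adjust the paths if you copied them elsewhere):

Add-Type -Path 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll'
Add-Type -Path 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll'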


Error:

format-default : The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.

We receive no useful result and a couple of red lines, which almost every time means failure in PowerShell.
But when I pipe the command to Measure-Object the number of objects is different from 0, and here is one of the big differences with CSOM.
Let's see how the function works. First we get the client context from the URL we have passed. Then we get the root web, which is the Web property, and then we should get all the subwebs of the root web from the Webs property. But in CSOM you need to use the generic method Load() and then call ExecuteQuery(); we cannot get objects from other objects on the fly as we can with server-side code. Apparently our function is working fine, but the Web objects are returned with only part of their properties, and for the rest we need to load them with Load() and ExecuteQuery(). You can see this by piping to Select-Object and selecting the Title and Url properties.
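As a quick check, something like this (the URL is a placeholder for your own site collection) shows that the webs are there, because Title and Url happen to be among the properties that come back loaded:

Get-SPClientSiteSubWebs -SiteUrl 'http://portal.contoso.com' | Select-Object Title, Url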


So our function is working fine, but the first time we ran it PowerShell failed to display the webs. This is because PowerShell used the default type display template to display the Microsoft.SharePoint.Client.Web type, including some properties that are not loaded. SiteUsers, for example: if we want to see the users we will need to load them. If we try to get this property from one of the webs we are going to receive the error below:

An error occurred while enumerating through a collection: The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested..
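The collection has to be requested explicitly before it can be enumerated. A minimal sketch, assuming $clientContext is the client context and $web is one of the returned Web objects:

$clientContext.Load($web.SiteUsers)
$clientContext.ExecuteQuery()
$web.SiteUsers | Select-Object Title, LoginName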

PowerShell uses the default template because there are no display templates for the Microsoft.SharePoint.Client types.
PowerShell display templates are XML files with the extension .ps1xml. The display templates for the different types can be configured as a list or a table. Some of the files are stored in the "C:\Windows\System32\WindowsPowerShell\v1.0" directory. If you open one of these files you will see a warning that it should not be edited, but also a reference to the cmdlet Update-FormatData.
With this cmdlet we can load additional display templates for the types we want.
I have created such a .ps1xml file with templates for the most popular client objects. I can load it with the command below.

Update-FormatData -AppendPath "C:\SPO\SPClient.Format.ps1xml"
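To give an idea of what such a file contains, here is a simplified sketch (not the full file from the download) of a table view for the Web type that shows only properties we know are loaded:

<Configuration>
  <ViewDefinitions>
    <View>
      <Name>ClientWeb</Name>
      <ViewSelectedBy>
        <TypeName>Microsoft.SharePoint.Client.Web</TypeName>
      </ViewSelectedBy>
      <TableControl>
        <TableHeaders>
          <TableColumnHeader><Label>Title</Label></TableColumnHeader>
          <TableColumnHeader><Label>Url</Label></TableColumnHeader>
        </TableHeaders>
        <TableRowEntries>
          <TableRowEntry>
            <TableColumnItems>
              <TableColumnItem><PropertyName>Title</PropertyName></TableColumnItem>
              <TableColumnItem><PropertyName>Url</PropertyName></TableColumnItem>
            </TableColumnItems>
          </TableRowEntry>
        </TableRowEntries>
      </TableControl>
    </View>
  </ViewDefinitions>
</Configuration>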

Now that we have loaded our custom display templates, let's see what the output from our function will be.


I think it is looking better now. You can download the template file below; you can change and expand it to suit your needs. The display templates are loaded only for the current session, so if you close your PowerShell, the next time you or another user tries to use CSOM you will need to load the templates again.

Monday 25 August 2014

Bulk Upload and Update User Profile Photos in SharePoint 2013

This post will be about a script I wrote and some interesting things I learned while writing it. The script lets you upload and update profile photos for multiple user profiles in SharePoint 2013. There are many posts on how to rewrite the PictureURL property of the user profile object and assign it a custom value, but my script does not work that way. It emulates what happens in one popular out-of-the-box scenario: importing user profile pictures from AD.
The inspiration for this script came from an issue (of course) that one of my colleagues had with one of our customers. The customer has a complex domain structure: around 16 domains, and even federation. The customer of course wants their SharePoint users to be able to have profile photos in their SharePoint profiles, so we have User Profile Synchronization in place, synchronizing the Picture user property from the AD attribute thumbnailPhoto. The users, however, can also edit their profile picture and upload new photos.
The issue is that since some CU installation this year (I am not sure, but I think it was April 2014), when a full synchronization is launched in the User Profile Application, the pictures uploaded by the users are lost. My colleague ran the synchronization first on a Staging environment that is almost identical to Production, with a similar configuration of the User Profile Service Application, and of course we lost the pictures the users had uploaded. The Production environment was not touched, but sooner or later we had to apply some property mapping change and run a full synchronization on Production too. We needed a way to get the profile pictures from Production, re-upload them to Staging for the correct profiles, then run the full synchronization on Production and re-upload the user profile pictures again.
And here I come with my PowerShell skills and some free time. As I said above, I came across many scripts for editing the PictureURL property of the UserProfile object, but this option was not acceptable for me because in SharePoint we have three sizes of user profile photo thumbnails.
I first created a script to download the biggest available picture thumbnail and save it under a unique name that would later help me determine which account the picture belongs to. I did this by getting the values of the PictureURL and LoginName properties of the profiles and then simply downloading the largest thumbnail of the photo. You can get all the profiles in a User Profile Application with the snippet below. You will need a site collection URL to determine the service context, and you should run it under the Farm account. Of course things are a bit different in a scenario with multiple User Profile Applications, or if the application is partitioned (for multi-tenant deployment).

Add-PSSnapin Microsoft.SharePoint.PowerShell
$SiteURL = 'http://mysite.contoso.com/'
# Resolve the service context from a site collection in the farm
$site = Get-SPSite -Identity $SiteURL
$context = Get-SPServiceContext -Site $site
# Connect to the User Profile Application and enumerate all profiles
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$AllProfiles = $upm.GetEnumerator()
ForEach($profile in $AllProfiles)
{
    $profile
}
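The download part can then be sketched roughly like this. This is a hypothetical fragment, not the final script: the thumbnail-name substitution assumes the PictureURL points at the midsize thumbnail, and the destination folder is a placeholder:

$destFolder = 'C:\ProfilePhotos'
$webClient = New-Object System.Net.WebClient
$webClient.UseDefaultCredentials = $true
ForEach($profile in $upm.GetEnumerator())
{
    $pictureUrl = $profile['PictureURL'].Value
    if($pictureUrl)
    {
        # Swap the midsize thumbnail for the large one
        $largeUrl = $pictureUrl -replace '_MThumb\.jpg$','_LThumb.jpg'
        # Make the account name file-system safe and use it as the file name
        $fileName = ($profile['AccountName'].Value -replace '[\\|:]','_') + '.jpg'
        $webClient.DownloadFile($largeUrl, (Join-Path $destFolder $fileName))
    }
}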

And now I had all the profile pictures from Production, and we had to figure out how to upload them to Staging and "connect" them to the correct accounts.
And here comes the interesting part. Normally the pictures for the user profiles are imported from thumbnailPhoto; the PictureURL property holds a URL to a picture, while thumbnailPhoto contains the picture itself as binary data. As we know, after synchronization we need to run the out-of-the-box command Update-SPProfilePhotoStore.
So I looked into what happens between the import and the thumbnail generation.
When the import/sync from AD happens, the thumbnailPhoto value is saved as a .jpg image in the same picture library that also contains the thumbnail versions of the profile pictures. For example, if you have a My Site host http://mysite.contoso.com, the URL of the library in SharePoint 2013 (I forgot to mention that we are working on an SP2013 farm) will be http://mysite.contoso.com/User Photos/, and the pictures are in the folder Profile Pictures. The thumbnails are named Account_LThumb.jpg for large, Account_MThumb.jpg for midsize and Account_SThumb.jpg for small. The midsize picture is used as the profile photo.
The initial pictures are also saved in this library, but under a special name like 0c37852b-34d0-418e-91c6-2ac25af4be5b_RecordID.jpg. Here is the moment to say that we have only one, non-partitioned User Profile Service Application. The first part of the picture name, the GUID (exactly this GUID), is the GUID of the default partition of a non-partitioned UPSA. It is the same on all SharePoint 2013 farms I have seen. You can check it by running an SQL query against the Profile DB, but only on a non-production farm, because querying this DB is not supported. It is a column of every profile entry in the Profile DB.


The second part, the RecordID, is a unique ID for every user in the partition. It is just a number, not a GUID. You can see it in the picture above, and you can also get it with PowerShell as a property of the user profile object. So far, however, I haven't found a way to see the PartitionID of the default partition of a non-partitioned UPSA in PowerShell.
When you run Update-SPProfilePhotoStore with the corresponding parameters, it reads the available .JPGs that have not yet been converted to thumbnails, reads the users' RecordIDs, creates the thumbnails, adds the URL of the picture to the user profile and, in the general case, deletes the original (depending on the parameters).
So this was our answer to how to upload the pictures from Production to Staging, get all three thumbnail variants and map the PictureURL property. One additional feature of this method is that the PictureURL is mapped to the URL of the biggest thumbnail version (not the midsize one, as it would be if imported from AD), so you will have pretty profile pictures with good resolution.
And here came the idea for this script. What if there are SharePoint Admins/customers who do not want to get the pictures from AD, but have all the profile pictures and want to mass-upload them as if the users had uploaded them themselves?
So I came up with this script. You need to run it two times (as Administrator, under the Farm account). The first run generates a CSV file with the RecordIDs, LoginNames and picture paths. The picture path will be empty; you should fill it with the local or network path to the picture. The picture can be of any resolution and almost any format (JPG, PNG, GIF, BMP, TIFF); the script will convert it to JPEG and upload it under the corresponding name. When you have filled in the picture paths you want (it is not mandatory to fill in the path for all accounts), run the script again with the appropriate parameter for upload and update.
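The conversion step itself is simple .NET. A rough sketch of how an arbitrary image can be re-saved as JPEG (the function name is mine, not from the script):

Add-Type -AssemblyName System.Drawing
Function ConvertTo-JpegFile
{
    Param([string]$SourcePath, [string]$TargetPath)
    # Load the source image (JPG, PNG, GIF, BMP, TIFF...)
    $image = [System.Drawing.Image]::FromFile($SourcePath)
    try
    {
        # Re-encode it as JPEG regardless of the source format
        $image.Save($TargetPath, [System.Drawing.Imaging.ImageFormat]::Jpeg)
    }
    finally
    {
        $image.Dispose()
    }
}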
You can download the script from the link below. For instructions and examples see the help of the script, like this: Get-Help .\Upload-ProfilePhotos.ps1 -Full . I always put a fair amount of help topics in my scripts.

Download From: TechNet Gallery

Thursday 14 August 2014

Scripts to Add, Remove and View SharePoint 2013 Maintenance Windows

My colleague shared an MSDN blog post about a new class in SharePoint 2013. With this class you can configure a message that will appear on the sites, informing the users of an upcoming maintenance activity; this is done at the Content Database level. The message can look like the picture below. The messages are predefined for several scenarios; you can find samples in the mentioned MSDN post. You can specify the Maintenance Start Time and End Time, the Notification Start Time and End Time, a link with information about the maintenance, and a read-only period. This is a very cool feature, but I don't like the script that is shown in the post, so I wrote two scripts for adding, removing and viewing maintenance windows. You can choose an individual Content Database, Site, Web App, or all Content Databases in the farm. You can use the pipeline to pass objects from other cmdlets, and the script will ask you for confirmation before adding or removing the maintenance windows. For more information see the help in the scripts; there are many examples of how to use them. To run the scripts you will need the Microsoft.SharePoint.PowerShell PS snap-in loaded.
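To show the class in action, here is a sketch (based on my reading of the MSDN post; the dates, database name and link are placeholders) of adding a single maintenance window to one Content Database:

Add-PSSnapin Microsoft.SharePoint.PowerShell
$db = Get-SPContentDatabase -Identity 'WSS_Content'
$window = New-Object Microsoft.SharePoint.Administration.SPMaintenanceWindow
$window.MaintenanceType = 'MaintenancePlanned'
# Show the notification from the 20th until the maintenance ends
$window.NotificationStartDate = [DateTime]'2014-08-20 08:00'
$window.NotificationEndDate = [DateTime]'2014-08-23 06:00'
$window.MaintenanceStartDate = [DateTime]'2014-08-22 22:00'
$window.MaintenanceEndDate = [DateTime]'2014-08-23 06:00'
$window.Duration = New-TimeSpan -Hours 8
$window.MaintenanceLink = 'http://intranet.contoso.com/maintenance'
# Maintenance windows live on the content database object
$db.MaintenanceWindows.Add($window)
$db.Update()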

Download the scripts from: TechNet Gallery



Tuesday 12 August 2014

Site collection Term Set Groups and how to reconnect a Term Set Group to a site collection.

When we go to the Term Store Manager in SharePoint 2013 Central Administration we can see a bunch of term set groups with various term sets in them. These term set groups are visible in the entire farm; so far so good.
But we can also have a term set group that, out of the box, will be visible/usable only by the site collection where it was created; these groups are sometimes referred to as site collection term set groups or local term set groups. We get such a term set group when we activate the "SharePoint Server Publishing Infrastructure" site feature, or by some manual method like in this Article. As I said, this term set group and the term sets and terms in it will initially be visible/usable only for the site collection, and this was related to the problem I had this week, and of course to the solution.
As you have noticed, in most of my posts I use PowerShell and the SharePoint object model quite a lot, so I will show you how this site collection term set group looks in PowerShell and what we can do with it.
For demo purposes in this post I am going to use two host-named site collections (HNSC): one based on a Team Site (http://termsetT.auto.l) with no term set group, and a second one (http://termsetP.auto.l) with the publishing features activated, so the second one comes with a site collection term set group.
First we will see how to find the site collection term set group. This can be achieved with the snippet below.

Add-PSSnapin microsoft.sharepoint.powershell
$site = Get-SPSite http://termsetp.auto.l
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
$pubSiteGroup = $termStore.GetSiteCollectionGroup($site)

First we get an instance of our publishing site. Then we get an instance of SPTaxonomySession; this is done with the out-of-the-box cmdlet Get-SPTaxonomySession. This object gives us access to the term stores of the web application where our site is deployed, or to the site subscription in a multi-tenant scenario. In our case this is our Managed Metadata Service application. The term store has a method called GetSiteCollectionGroup; we pass our site as the argument, and here is the output:


We have a few interesting properties to look at. The first one is IsSiteCollectionGroup; it has the value True, so this group should be a site collection term set group for some site. The second one is SiteCollectionAccessIds; this tells us the ID of the site collection that can see and use the term sets, and edit them. This is the ID of our publishing site. The third one is SiteCollectionReadOnlyAccessUrls; in my case this is the URL of my Team Site that has no site collection term set group. I will now be able to see the group from my Team Site and use the terms, but I will not be able to edit the content of the group from the Term Store Management Tool in the Team Site: read-only access. You can actually grant read-only access from the Term Store Management Tool of the site collection that has full access: when you select the term set group, at the bottom you can see a section called Site Collection Access, where you can add the URLs of the sites you want to grant read-only access.
Last week I received an email from one of our customers. They needed to restore one of their sites using the Restore-SPSite cmdlet, and afterwards their site collection term set group disappeared from the term store. They rely heavily on managed metadata to tag documents and to navigate in huge libraries. They wanted to get back the old group with the old terms, because a big number of documents were tagged, and if they created new tags or imported them somehow, the tags would have different IDs, the tagged documents would have invalid tags, and navigation and filtering would not work.
The issue here was that they had overwritten the old site with the restore, and now the site had a different ID. The old term set group was still there, but it was not associated with the site, or, to put it more correctly, the new site ID had no access to the old term set group. This can be solved by granting access to the new site ID. I did the same restore operation on my publishing site to reproduce the issue, and below is the snippet I used to grant access to the new ID. If you do this with a publishing site you may receive an error like "error to load the navigation because the term set was not attached correctly...". You can fix this by switching from Managed to Structured navigation and then back to Managed, associating it with the correct term set. This, however, was not the case with the customer; here is the snippet that granted access to the new site ID.

$site = Get-SPSite http://termsetp.auto.l
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
# Get the orphaned group by name and grant access to the new site collection ID
$group1 = $termStore.Groups['Site Collection - termsetp.auto.l-1']
$group1.AddSiteCollectionAccess($site.ID)
# Persist the change to the term store database
$termStore.CommitAll()

The issue was clearly solved. However, I had a similar case with a different customer: the customer claimed that the site collection term set group had simply disappeared. I was able to get the old group in PowerShell, but nothing helped me "bring it back to life", so I created a new group, and instead of trying to do something with the old group I just moved the term sets that were in it to the new group via PowerShell, and everything worked fine. I created the new site collection term set group in PowerShell. Let's look at the method we used to get the group for a certain site in the first snippet: GetSiteCollectionGroup. If you pass a second Boolean argument (True/False), the method will check whether a site collection term set group exists; if you pass True and there is no such group, it will create one with the out-of-the-box naming. If somewhere in the term store you already have a group with the same name, it will put a number at the end of the name. Remember that after making changes in the term store you should use the CommitAll() method to save the changes to the database.

Add-PSSnapin microsoft.sharepoint.powershell
$site = Get-SPSite 'http://termsetT.auto.l'
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
# Passing $true creates the site collection term set group if it does not exist
$pubSiteGroup = $termStore.GetSiteCollectionGroup($site, $true)
$termStore.CommitAll()