Monday 1 December 2014

Write and Get User Profile Properties in SharePoint Online with CSOM in PowerShell

A couple of weeks ago Vesa Juvonen wrote a post about the new capability in CSOM that lets us write user profile properties. This feature is already enabled in all Office 365 tenants.
The new capability is available in the Microsoft.SharePoint.Client.UserProfiles.dll assembly, version 16. This is the moment to mention that if you download the latest CSOM package you will receive client DLLs in both version 15 and version 16. In an earlier post about using CSOM in PowerShell I mentioned that the client DLLs are located in "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"; those are the version 15 DLLs that can be used against SharePoint 2013 on-premises. If you want to use the latest and greatest features against SharePoint Online, you should load the version 16 DLLs located in "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI".
The new capability comes from two new methods on the PeopleManager class: SetSingleValueProfileProperty() and SetMultiValuedProfileProperty().
In his post Vesa gives example code that demonstrates the capability in a SharePoint app. I found this interesting and, since I am not a developer, I will show you how it works in PowerShell. Unfortunately, I was unable to get SetMultiValuedProfileProperty() working from PowerShell; I think this is because of the issue with generic lists in PowerShell. However, I will show you a function that edits the AboutMe user profile property and a function that gets all user profile properties. I will authenticate against the admin portal of SharePoint Online, so I will be able to play with the properties of other users.
Below you can see both functions (for editing AboutMe and for getting profile properties) and everything I do before that.

[*UPDATE*]: I have published "universal" scripts for editing any user profile property; read more HERE

$cDLLS = Get-ChildItem -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI" | Where {$_.Name -notlike "*.Portable.dll"}
ForEach ($dll in $cDLLS)
{
    Add-Type -Path $dll.FullName
}


$username = "******@yankulovdemo.onmicrosoft.com" 
$password = "*******" 
$url = "https://yankulovdemo-admin.sharepoint.com"
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force

$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($url)
$Context.Credentials = $credentials
$Global:oContext = $Context

Function Update-AboutText{
Param(
    [string]$AboutText,
    [string]$UserLogin
)
Process{
    $logIn = ("i:0#.f|membership|" + $UserLogin)
    $ctx = $Global:oContext
    $aboutTextHtml = $AboutText.Replace(([System.Environment]::NewLine), "<br />")
    $peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
    $Properties = $peopleManager.GetPropertiesFor($logIn)
    $ctx.Load($Properties)
    $ctx.ExecuteQuery()
    $peopleManager.SetSingleValueProfileProperty($logIn, "AboutMe", $aboutTextHtml)
    $ctx.ExecuteQuery()
}
}

Function Get-UserProperties{
Param(
    [string]$UserLogin
)
Process{
    $logIn = ("i:0#.f|membership|" + $UserLogin)
    $ctx = $Global:oContext
    $peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
    $user = $peopleManager.GetPropertiesFor($logIn)
    $ctx.Load($user)
    $ctx.ExecuteQuery()
    Write-Output $user.UserProfileProperties

}
}


The first thing I do is load the client DLLs. Second, I get the client context, as I said against the admin portal, and store it in a global variable for further reuse.
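Once everything above is loaded, calling the functions might look like this (the account shown is just a hypothetical example):

# The login used here is an example account, not a real one
Update-AboutText -UserLogin "gencho@yankulovdemo.onmicrosoft.com" -AboutText "He is the master of SharePoint"
Get-UserProperties -UserLogin "gencho@yankulovdemo.onmicrosoft.com"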
And here is the outcome:

Write and Get User Profile Property with CSOM

It seems that this worked out. The updated About Me property is shown in Gencho's profile as well.

He is the master of SharePoint



Friday 28 November 2014

Unable to change the password of Managed Service Account from SharePoint

Since SharePoint 2010 we have the concept of managed service accounts in SharePoint. In order to use an account in SharePoint 2010/2013 you first need to register it as managed. After you register the account as managed, you can change the account password from SharePoint (UI and PowerShell), and you can even set up an automatic password change in order to comply with the security policies of the company. It is actually recommended to change the password of a managed account only from SharePoint; this way the password is changed and SharePoint is aware of the change. Before we had managed accounts it was a real pain and a risky operation to change the passwords of the accounts used by SharePoint.
However, I have seen multiple deployments (including mine done in the early days) where service accounts are created with the attributes "Password never expires" and "User cannot change password". These attributes are self-descriptive enough. Some time after the environment is deployed, when everything is working fine and we are all happy, comes the moment when you need to change the password for the security reasons described above. To do this you go to Central Administration or PowerShell and perform just a routine change of a managed account password, but you hit the "Access is Denied" errors below:

Access is Denied

In PowerShell:

PS C:\> Set-SPManagedAccount -identity ILABS\SP_PortalApppool -NewPassword `
(ConvertTo-SecureString "SomeNewPass1234" -AsPlainText -Force) -SetNewPassword `
-ConfirmPassword (ConvertTo-SecureString "SomeNewPass1234" -AsPlainText -Force)
Set-SPManagedAccount : Access is denied
At line:1 char:1
+ Set-SPManagedAccount -identity ILABS\SP_PortalApppool -NewPassword (ConvertTo-Se ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (Microsoft.Share...tManagedAccount:SPCmdletSetManagedAccount) [Set-SPManagedAccount], Win32Exception
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletSetManagedAccount


Again the "Access is denied" is not telling us what is the reason behind this.
The explanation is that the password change should be done by the user that is subject of this change and if he does not have the permission to do so, the operation will fail. The permission users to change their own password is denied by the attribute "User cannot change password". So if you do not have solid reason to do so, do not turn on the "User cannot change password" attribute of accounts that are used in SharePoint.
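If you are not sure whether the flag is set, here is one way to check it (and clear it, if your security policy allows it) with the ActiveDirectory module; the account name is taken from the example above and is only an illustration:

# Requires the ActiveDirectory module (RSAT); the account name is an example
Import-Module ActiveDirectory
Get-ADUser -Identity SP_PortalApppool -Properties CannotChangePassword | Select-Object Name, CannotChangePassword
# Clear the flag only if this is acceptable in your environment
Set-ADUser -Identity SP_PortalApppool -CannotChangePassword $false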
The second error you may hit when you enter a new password, or decide to use a SharePoint-generated one, is the following:

Error when changing the password

In PowerShell:

Set-SPManagedAccount : The password does not meet the password policy requirements. Check the minimum password length,
password complexity and password history requirements
At line:1 char:1
+ Set-SPManagedAccount -identity ILABS\SP_PortalApppool -NewPassword (ConvertTo-Se ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (Microsoft.Share...tManagedAccount:SPCmdletSetManagedAccount) [Set-SPManagedAccount], Win32Exception
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletSetManagedAccount

This error is more descriptive; it tells us what is wrong.
Unfortunately, it can also be a bit misleading, because you are sure you are compliant with your password policy and you are still getting the error. There is one not so obvious condition that can lead to it: the password of the account was changed recently. We all know about the group policy setting that prompts users for a password change after a certain time; it is a good security practice to keep it enabled, and it is called "Maximum password age".
However, there is another setting that defines the minimum password age. As you can guess it is called "Minimum password age", and it is enabled by default with a value of 1 day. This means that if a user changes the password, it cannot be changed again within 24 hours of the last change. You can find the password policy in your Default Domain Policy here: Default Domain Policy -> Computer Configuration -> Windows Settings -> Security Settings -> Account Policy -> Password Policy.
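You can also check what your domain currently enforces from PowerShell, for example:

# Shows the effective default domain password policy, including the minimum password age
Import-Module ActiveDirectory
Get-ADDefaultDomainPasswordPolicy | Select-Object MinPasswordAge, MaxPasswordAge, MinPasswordLength, ComplexityEnabled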
"Minimum Password age" and it is enabled by default with value 1 day. This means that if a user change the password, he will not be able to change it again in the next 24 hours from the time of the last password change. If you open your Default domain policy you can find the password policy like this: Default Domain Policy -> Computer Configuration -> Windows Settings -> Security Settings -> Account Policy -> Password Policy.
The third error you may hit really tells it all:

Error when changing the password 2

In PowerShell:

Set-SPManagedAccount : The password for the account ILABS\sp_portalapppool, as currently stored in SharePoint, is not
the same as the current password for the account within Active Directory. Change the password to match the existing
password within Active Directory in order to continue.
At line:1 char:1
+ Set-SPManagedAccount -identity ILABS\SP_PortalApppool -NewPassword (ConvertTo-Se ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (Microsoft.Share...tManagedAccount:SPCmdletSetManagedAccount) [Set-SPManagedAccount], InvalidOperationException
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletSetManagedAccount

In this case the password of the managed account stored in SharePoint is different from the current password in Active Directory, most probably due to a recent password change in AD.
To correct this you first need to know the current password in AD, or if you do not know it, reset it to a value that you do know.
Then you go to Central Administration, select Change password, mark Use existing password, enter it and you are ready to proceed as shown below.

Use Existing Password

Or use PowerShell:

Set-SPManagedAccount -identity "ILABS\SP_PortalAppPool" -ExistingPassword (Convertto-Securestring "demo!234" -AsPlainText -Force)

Tuesday 28 October 2014

Microsoft Dynamics CRM 11 and SharePoint 2010 "Alternate Access Mapping" error

Last week a local firm contacted us about an issue they had with Dynamics CRM 11 and SharePoint 2010 integration, but first I will give you a brief overview of the integration between those two products.
Dynamics CRM 11 can be integrated with SharePoint in order to have centralized document management functionality based on SharePoint.
CRM 11 On-Premise can be combined with MOSS 2007, SharePoint 2010 and SharePoint 2013 (all editions, including Online). The integration gives us one location to store the documents for the CRM entities and records; users can work with the documents from the CRM as well as from SharePoint, and no documents are stored in the CRM itself.
At this time we do not have standard support from Microsoft for Microsoft Office SharePoint Server 2007 (MOSS 2007), so this article is mainly about the integration with SharePoint 2010; the issue that is the reason for this article is also with SharePoint 2010.
The integration with SharePoint 2010 is a fairly simple process, since the CRM uses SharePoint as a document storage. As this is only an overview of the integration, I will not go into details.
The process is very straightforward and can be done from the CRM interface. You can check the following article for the detailed procedure for integrating these products, Here.
The important part (though not mandatory) is the List Component that is deployed on the SharePoint side; it gives us an integrated look of the file "grid" and the ability to create the libraries and folders automatically from the CRM. Below you can see how it looks with the list component installed and without it.

With the list component installed:

CRM11 List Component

Without the list component, the SharePoint page is shown as it is, in an iframe:

CRM11 No List Component

The customer's issue started when they migrated the SQL instance of the SharePoint farm to a different server; after that they started to receive the error below when a new record was created in the CRM.

CRM11 Alternate Access mapping error

Error:

An error occurred while loading the page. The URL may not have been mapped in the SharePoint Server. Ask your system administrator to check the Configure alternate access mappings settings in the SharePoint Central Administration.

It is clear that the CRM was trying to create a new folder for the record in the Accounts library (in my case). If the users open a record with an existing folder they are able to view the content correctly, and they can upload a new document, but the file view fails to refresh.
My first thought was "Well, there must be something wrong with the AAM on the SharePoint side."
Unfortunately, after a couple of hours of troubleshooting and testing there was nothing wrong with the Alternate Access Mapping.
According to the customer nothing was changed except the migration of the SQL instance. I am not sure what exactly the versions of SharePoint and the CRM were, because I had limited time to access the systems.
However, I decided that there must be something wrong with the list component. I downloaded the latest available package and deployed it to the document location site.
Can you guess what happened? Yes, everything started to work like a charm!
I cannot confirm the exact versions of the products or the conditions that led to this behaviour, but I think that if you get such errors it is worth trying to upgrade to the latest list component solution available.
At this time the latest package version is 05.00.9690.4159 from 7/1/2014.

You can download the solutions for SharePoint 2010 and 2013 HERE

Monday 27 October 2014

SPDeletedSite and SharePoint Deleted Site Collections alert script

A couple of weeks ago I received a question from one of our customers: is there a way to get an alert when a site collection in their SharePoint 2013 farm is deleted? This is important for them because the power to create and delete site collections is delegated to project owners, team leaders and so on. The customer often receives requests for restoring a site that was deleted by mistake, or they suddenly notice that a site with important information has just disappeared (deleted by the owner). The fastest way to restore such a deleted site is the Restore-SPDeletedSite cmdlet, but often the request for restore comes after 30 days and the site is already permanently deleted.
As far as I know there is no out-of-the-box feature that can notify you about deleted site collections, so my proposal to the customer was to write a PowerShell script that tracks the deleted site collections and sends notifications. The script is fairly simple: it gets the currently deleted sites with Get-SPDeletedSite -WebApplication <WebAppUrl>, compares the result with the result from the last run, and if there are new deleted sites it sends a mail with details like the one below:

Alert for Deleted SharePoint site collection
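The core of the comparison logic looks roughly like the sketch below; the state file path, the web application URL and the mail settings are assumptions, not the exact values from the published script:

# Compare the current deleted sites with the ones recorded on the last run (paths and addresses are examples)
$stateFile = 'C:\Scripts\DeletedSites.xml'
$current = @(Get-SPDeletedSite -WebApplication 'http://portal.contoso.com')
$previous = @()
If (Test-Path $stateFile) { $previous = @(Import-Clixml $stateFile) }
# On the very first run every currently deleted site will be reported
$newlyDeleted = $current | Where-Object { $previous.SiteId -notcontains $_.SiteId }
If ($newlyDeleted) {
    $body = $newlyDeleted | Select-Object Path, SiteId, DeletionTime | Out-String
    Send-MailMessage -SmtpServer 'smtp.contoso.com' -From 'sharepoint@contoso.com' -To 'spadmins@contoso.com' `
        -Subject 'Deleted SharePoint site collection detected' -Body $body
}
If ($current) { $current | Export-Clixml $stateFile }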

And since the key point in this script is the retrieval of SPDeletedSite, I would like to explain a bit more about this feature and share its history.
I guess that every SharePoint administrator knows about this feature in SharePoint 2013, and a big part of us are very happy to have it.
The story starts with the release of SharePoint 2010 and a capability called "Gradual Site Delete". This capability was designed to mitigate performance degradation, or even service interruption, caused by the so-called lock escalation that may happen in the content database if we attempt to delete a very large site collection. This was an issue in WSS 3.0 and Microsoft Office SharePoint Server 2007 (MOSS 2007).
The short story around "Gradual Site Delete" is that when a user deletes a site collection, the site collection is not instantly deleted; it is marked for deletion and the content becomes unavailable. The actual deletion is the responsibility of a timer job definition called "Gradual Site Delete" that is executed on a daily schedule by default (configurable).

Gradual Site Delete Timer Job

This timer job deletes the site collection from the content database in small enough portions to prevent lock escalation. If you want to read more on the subject, please check the post from Bill Baer.

Then, with SharePoint 2010 SP1, we received 3 additional cmdlets that allow us to work with site collections that are marked for deletion but not deleted yet. They are the same as in SharePoint 2013 - Get-SPDeletedSite, Restore-SPDeletedSite and Remove-SPDeletedSite; SharePoint 2013 adds Move-SPDeletedSite.
With the cmdlet Restore-SPDeletedSite we can restore a site collection that was deleted by the end user.
As I said, eventually these 'deleted' sites are permanently deleted by the "Gradual Site Delete" timer job. How many days a site collection is kept in the "Gradual Site Delete" phase as an SPDeletedSite object (marked for deletion) before permanent deletion depends on the recycle bin settings that are set at web application level.

SharePoint 2013 Recycle Bin Settings

By default the Recycle Bin is enabled and the retention period for deleted items is 30 days. This means that when a site collection is deleted it stays as a "restorable" SPDeletedSite object for 30 days.
Be aware that when you delete a site collection in PowerShell with Remove-SPSite, you additionally need to specify the -GradualDelete parameter in order to use Gradual Site Delete.
If this parameter is not used the site collection will be deleted instantly; it is strongly recommended to use Gradual Site Delete for large site collections.
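For example (the site URL is just an illustration):

Remove-SPSite -Identity "http://portal.contoso.com/sites/oldproject" -GradualDelete -Confirm:$false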

Download the script from: Technet Gallery

Monday 20 October 2014

Office 365 Demo Tenant available for Microsoft Partners

In this post I am going to tell you a bit more about the Office 365 demo tenants available for Microsoft Partners. This is not something new, but I think it is very useful and not popular enough.
Last week I needed to do a SharePoint Online demo. This is not my primary work, so I was not prepared with any Office 365 demo tenant and content, and this came in very useful.
If your company is a Microsoft Partner and your Live account is associated with your company, you can take advantage of a free Office 365 demo tenant for 90 days in order to do your shiny Office demos. It includes the full Office 365 stack - Exchange, Lync, SharePoint Online, Yammer, Power BI - demo content to demonstrate all the good things, and 25 users. There is an option, if you already have a spare Office 365 tenant, to receive only the demo content, but I haven't tested it.
The first thing you need to do is go to https://www.microsoftofficedemos.com/ and log in as a Partner.

Microsoft Office Demo Site


After the successful login you go to GET DEMO in order to provision a new Demo Tenant.
There you will see 4 options:

1. Create new demo environment - This is the "full" package (demo tenant + content).

2. Refresh a demo environment created with this tool - Do not expect this option to refresh your expired demo tenant after 90 days; once a demo tenant reaches 90 days you need to create a new one if you still need it. The explanation under the link of this option is descriptive enough (never tested it).

3. Just get demo content - I have my own O365 tenant! - As I said above, this should provision only the demo content. I have never tested it, because I do not have a non-production O365 tenant where I can try it.

4. Create an empty Office 365 tenant - I guess that this will create a demo tenant without the content.

Microsoft Office Demo Tenant Options


If you choose the first option, a new tenant with content, you will need to complete two more steps where you specify the type of the content (Standard, Productivity Solution, Industry-Specific). In the second step you specify the information for the tenant, like the tenant name and your sign-up information (your mail, phone, etc.).
The next thing you will have to do is wait. You will be redirected to the page below, where you can follow the deployment status of the different components.

Microsoft Office Demo Tenant Provisioning Status

Be aware that if the provisioning process lasts for more than 48 hours there is a chance it will be cancelled and you will need to start all over again; you will receive regular status updates via email. You can also check the status in the Microsoft Office Demo site - CHECK DEMO.
So, if you need to prepare an important demo/presentation, do not leave this for the last moment.
As I said above, you will receive 25 demo users to play with, using the scripts provided in the Microsoft Office Demo site Resources. There will be scripts available somewhere in SharePoint; in my case they were in https://<MyTenant>.sharepoint.com/Shared Documents/Forms/AllItems.aspx.
Once the demo is provisioned you will receive the tenant admin credentials and you are ready to explore and demonstrate Office 365 to your future customers.
While you are deploying the demo you will see a warning that you should not share credentials from this demo with your customers. Please do not share them!

SharePoint Online Demo Sites

Friday 10 October 2014

Create ZIP Archive in PowerShell and SharePoint IIS and ULS archiving scripts

In my career as a Windows web hosting administrator and now as a dedicated SharePoint administrator I have always needed a way to clean up IIS logs, since IIS does not provide such functionality.
I guess this is an issue for everyone involved in any sort of application management where the application is hosted on IIS.
Most of us leave the Log Rollover settings at Schedule, Daily and end up with a bunch of files in the log directory. The size of the logs depends on how many hits the application is receiving. If the application is a public website, the logs can become quite big and the volume where the logs are stored can run out of space (in any case you do not want the IIS logs stored on the system drive or on a drive with application files).
The most recent experience I had was with one of our customers that has their public websites hosted on SharePoint 2013. The sites are quite busy and we have huge 1 GB logs almost every day; even with NTFS compression the files are still around 300 MB on disk.
Since it is a good practice to keep these log files as long as we can, we have two options: zip all old log files by hand (with our archiving tool of choice), put them away and delete the originals to save some space, or write or find a script that does this for us.
I found many scripts for archiving IIS logs. Most of them use 7-Zip to create the archives or the Write-Zip cmdlet that comes from the PowerShell Community Extensions. Both tools are third-party compiled code that I do not want to run on the servers hosting the public sites of our high profile customer.
At the moment we don't have a version of PowerShell with native cmdlets for working with .zip files. The new PowerShell version 5 will have native cmdlets that can create and extract zip files, but it is still a preview version and is not recommended for production environments. If you want to check it out, there is a preview of the Windows Management Framework V5 available for download.
So, I needed to write my own script to do the work without using any third-party tools.
With some research I found the classes added to the System.IO.Compression namespace in .NET 4.5 that can do the work. The .NET Framework 4.5 is a prerequisite for every SharePoint 2013 installation, so I do not need to install anything additional.
Below you can see the function that I have used in the IIS Log Archiving script.


Function New-LogArch{
    [CmdletBinding()]
    Param (
        [parameter(Mandatory=$true)]
        [System.IO.FileInfo]$file,
        [parameter(Mandatory=$true)]
        [string]$Location
    )
    PROCESS{
        # Load the .NET 4.5 compression types
        Add-Type -Assembly System.IO.Compression.FileSystem
        # Temporary empty folder, used only to create an empty zip archive
        $tmpFolder = $Location + '\' + $File.BaseName
        New-Item -ItemType directory -Path $tmpFolder | Out-Null
        $destfile = $Location + "\" + $File.BaseName + ".zip"
        $logPath = $file.FullName
        $zippedName = $file.Name
        # Optimal gives the smallest possible archive size
        $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
        # Create an empty archive from the empty folder, then open it for update
        [System.IO.Compression.ZipFile]::CreateFromDirectory($tmpFolder,$destfile,$compressionLevel, $false )
        $archive = [System.IO.Compression.ZipFile]::Open($destfile, [System.IO.Compression.ZipArchiveMode]::Update )
        # Add the log file to the archive under its original name
        [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($archive, $logPath, $zippedName, $compressionLevel) | Out-Null
        # Dispose releases the lock that the Update mode holds on the zip file
        $archive.Dispose()
        Remove-Item $tmpFolder -Force

    }
}


The function takes two parameters: the FileInfo object of the log file, from which it gets the literal path, and the location where the archive will be saved.
The first thing we do with the System.IO.Compression classes is define the compression level in the variable $compressionLevel; I am setting Optimal in order to get the smallest file size possible.
The next step is to create an empty zip archive from a temporary, empty directory created earlier; I was unable to find a way to create the archive directly from the log file.
When we have the zip file created with the correct location, name and compression level, I "open" it in the variable $archive.
Once the archive file is open we use the CreateEntryFromFile method of the System.IO.Compression.ZipFileExtensions class; with this method we add the original log file to the archive.
Finally we invoke the Dispose() method on $archive, because when the zip archive is opened with the Update argument the file is locked.
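Called on its own, the function can be used along these lines; the folders and the 30-day retention value are assumptions for the example:

# Archive IIS logs older than 30 days from one site's log folder, then remove the originals
$logFolder = 'C:\inetpub\logs\LogFiles\W3SVC1'
$archiveFolder = 'D:\IISLogArchive'
Get-ChildItem -Path $logFolder -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    ForEach-Object {
        New-LogArch -file $_ -Location $archiveFolder
        Remove-Item $_.FullName -Force
    }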
Now that I have the key component that lets me zip the log files, I created the IIS Log Archive and Cleanup script.
It finds the log location of every IIS site, checks when the logs were modified, and if they are older than the retention period you have defined as a parameter it archives them in the destination you have specified and deletes the original files. You end up with one zip file, named after the original log file, containing that single log. The script also checks for old archive files and deletes the zip files that are older than the value you have specified.
I also wrote a script that can archive the SharePoint ULS logs, because it is a good practice to keep them in case you need to do performance analysis or troubleshooting, for example.



Download "IIS Log Archive and CleanUp Script" from: Technet Gallery

Download "ULS Log Archive Script" from: Technet Gallery

Sunday 31 August 2014

Display CSOM objects in PowerShell

Microsoft's strategy for SharePoint 2013 and for its hosted SharePoint offering, SharePoint Online, is to bring the power of full trust code to the client side. Since we are unable to deploy full trust code in SharePoint Online via the good old solution packages, Microsoft came up with the app model in SharePoint 2013 in order to make SharePoint customisation less dependent on full trust code executed on the server. The underlying technologies that allow SharePoint apps and clients to interact with the SharePoint platform "remotely" are the client-side object model (CSOM), the JavaScript object model (JSOM) and REST. With these technologies you do not need to be on the SharePoint server to read properties or execute methods on SharePoint objects like webs, sites, lists, users and so on. You can work with these objects from your browser, from a compiled application, or, in my case, from the PowerShell console of my Windows 8.1 workstation.
As a SharePoint administrator I have had a few interactions with SharePoint Online and I was disappointed by the limited functionality and control offered by the few commands in the native SharePoint Online Management Shell. The answer for me, in order to automate some repetitive tasks or to get a clear view on the objects and create some reports for example, was to use a client-side technology.
As an administrator the JavaScript object model is not my primary interest, and although I can invoke REST requests from PowerShell, the closest thing to the server-side code I am used to is the client-side object model, CSOM.
In this post I am not going to do an in-depth guide on how to write PowerShell scripts with CSOM. Instead I am going to show you a significant difference between CSOM and server-side code in PowerShell that can demotivate you from digging further into CSOM if you are new to it, and how to correct it to a certain extent and make CSOM less "abstract" when you work with it in PowerShell.
For the demonstration I am going to use an on-premises installation of SharePoint 2013 and the sample function below that lists all webs (the subwebs) under the root web.

Function Get-SPClientSiteSubWebs{
 [CmdletBinding()]
 Param(
    [string]$SiteUrl
 )
 PROCESS{
    $clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)

    $rootWeb = $clientContext.Web
    $clientContext.Load($rootWeb)
    $clientContext.ExecuteQuery()

    $webs = $rootWeb.Webs
    $clientContext.Load($webs)
    $clientContext.ExecuteQuery()

    Write-Output $webs
 }
}


I will be running the function from a client VM that is joined to the domain of my test SharePoint 2013 environment. In order to use CSOM from a computer that is not a SharePoint server you will need the client assemblies. You can find the DLLs on every SharePoint 2013 server in the folder "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"; they all start with Microsoft.SharePoint.Client*.dll. Or you can download and install the SharePoint 2013 Client Components SDK; this way you will have the DLLs in the \15\ISAPI folder, and I think most of them will also be registered in the GAC.
After you get the client DLLs you will need to load them into the PowerShell session where you will run your scripts. I have already done this, so let's see what the result from my sample function will be. I need to give a site collection as a parameter, and since we are running the function against on-premises SharePoint, with a site collection hosted in a web application that uses Windows integrated authentication, we do not need additional credentials or a change of the authentication mode for the client context. This would not be the case if we were writing functions for SharePoint Online or using an authentication method different from Windows on-premises. So it seems that we are ready to test.


Error:

format-default : The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.

We receive no useful result and a couple of red lines, which almost every time means failure in PowerShell.
But when I pipe the command to Measure-Object the number of objects is different from 0, and here is one of the big differences with CSOM.
Let's see how the function works. First we get the client context from the URL we have passed. Then we get the root web, which is the Web property, and then we should get all the subwebs of the root web from the Webs property. But in CSOM you need to use the generic Load() method and then call ExecuteQuery(); we cannot get objects from other objects on the fly as we can with the server-side code. Apparently our function is working fine, but the Web objects are returned with only part of their properties, and for the rest we need to load them with Load() and ExecuteQuery(). You can see this by piping to Select-Object and selecting the Title and Url properties.
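For example, selecting only the loaded properties displays correctly (the site URL here is just a placeholder for my test site collection):

Get-SPClientSiteSubWebs -SiteUrl "http://sharepoint.contoso.local" | Select-Object Title, Url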


So our function is working fine, but the first time we ran it PowerShell failed to display the webs. This is because PowerShell used the default display template for the Microsoft.SharePoint.Client.Web type, which includes some properties that are not loaded. SiteUsers, for example: if we want to see the users we need to load them. If we try to get this property from one of the webs we receive the error below:

An error occurred while enumerating through a collection: The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested..

PowerShell uses the default template because there are no display templates for the Microsoft.SharePoint.Client types.
PowerShell display templates are XML files with the extension .ps1xml. The display templates for the different types can be configured as a list or a table. Some of these files are stored in the "C:\Windows\System32\WindowsPowerShell\v1.0" directory. If you open one of them you will see a warning that the file should not be edited, but also a reference to the Update-FormatData cmdlet.
With this cmdlet we can load additional display templates for the types that we want.
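To give an idea of the format, a minimal table view for the Microsoft.SharePoint.Client.Web type might look like this (a stripped-down sketch, not the exact content of my file):

<Configuration>
  <ViewDefinitions>
    <View>
      <Name>ClientWebTable</Name>
      <ViewSelectedBy>
        <TypeName>Microsoft.SharePoint.Client.Web</TypeName>
      </ViewSelectedBy>
      <TableControl>
        <TableHeaders>
          <TableColumnHeader><Label>Title</Label></TableColumnHeader>
          <TableColumnHeader><Label>Url</Label></TableColumnHeader>
        </TableHeaders>
        <TableRowEntries>
          <TableRowEntry>
            <TableColumnItems>
              <TableColumnItem><PropertyName>Title</PropertyName></TableColumnItem>
              <TableColumnItem><PropertyName>Url</PropertyName></TableColumnItem>
            </TableColumnItems>
          </TableRowEntry>
        </TableRowEntries>
      </TableControl>
    </View>
  </ViewDefinitions>
</Configuration>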
I have created such a .ps1xml file with templates for the most popular client objects. I can load it with the command below.

Update-FormatData -AppendPath "C:\SPO\SPClient.Format.ps1xml"

Now that we have loaded our custom display templates, let's see the output from our function.


I think it is looking better now. You can download the template file below; you can change and expand it to suit your needs. The display templates are loaded only for the current session, so after you close PowerShell, the next time you or another user wants to use CSOM the templates will need to be loaded again.

Monday 25 August 2014

Bulk Upload and Update User Profile Photos in SharePoint 2013

This post is about a script I wrote and some interesting things I learned while writing it. The script lets you upload and update profile photos for multiple user profiles in SharePoint 2013. There are many posts on how to rewrite the PictureURL property of the user profile object and assign it a custom value, but my script does not work that way. It emulates what happens in one popular out-of-the-box scenario for importing user profile pictures: the import from AD.
The inspiration for this script came (of course) from an issue that one of my colleagues had with one of our customers. The customer has a complex domain structure - around 16 domains and even federation. The customer of course wants their SharePoint users to have profile photos in their SharePoint profiles, so we have User Profile Synchronization in place and it synchronizes the Picture user property from the AD attribute thumbnailPhoto. The users, however, can edit their profile picture and they upload new photos.
The issue is that since some CU installation this year (I am not sure, but I think it was April 2014), when a full synchronization is launched in the User Profile Application, the pictures that were uploaded by the users are lost. My colleague ran the synchronization first on the Staging environment, which is almost identical to Production and has a similar configuration of the User Profile Service Application, and of course we lost the pictures that the users had uploaded. The Production environment was not touched, but sooner or later we would have to apply a property mapping change and run a full synchronization on Production too. We needed a way to get the profile pictures from Production, re-upload them to Staging for the correct profiles, then run the full synchronization on Production and re-upload the user profile pictures there again.
And here I come with my PowerShell skills and some free time. As I said above, I came across many scripts for editing the PictureURL property of the UserProfile object, but this option was not acceptable for me because in SharePoint we have 3 sizes of user profile photo thumbnails.
I first created a script to download the biggest available picture thumbnail and save it under a unique name that can later help me determine which account the picture belongs to. I did this by getting the values of the PictureURL and the LoginName of the profiles and then simply downloading the largest thumbnail of the photo. You can get all the profiles in a User Profile Application with the snippet below. You will need a site collection URL to determine the service context, and you should run it under the Farm account. Of course, things are a bit different in a scenario with multiple User Profile Applications or if the application is partitioned (for multi-tenant deployments).

# Get the service context from an existing site collection and enumerate all user profiles in the UPA
$SiteURL = 'http://mysite.contoso.com/'
Add-PSSnapin Microsoft.SharePoint.PowerShell
$Site = Get-SPSite -Identity $SiteURL
$context = Get-SPServiceContext -Site $Site
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$AllProfiles = $upm.GetEnumerator()
ForEach($profile in $AllProfiles)
{
    $profile
}

Now I had all the profile pictures from Production, and we had to figure out how to upload them to Staging and "connect" them to the correct accounts.
And here comes the interesting part. Normally the pictures for the user profiles are imported from thumbnailPhoto, but PictureURL holds a URL to a picture, while thumbnailPhoto contains just a picture as binary data. As we know, after synchronization we need to run the out-of-the-box command Update-SPProfilePhotoStore.
So I looked at what happens between the import and the thumbnail generation.
When the import/sync from AD happens, the thumbnailPhoto value is saved as a .jpg image in the same picture library that also contains the thumbnail versions of the profile pictures. For example, if you have the My Site host http://mysite.contoso.com, the URL of the library in SharePoint 2013 (I forgot to mention that we are working on a SP2013 farm) will be http://mysite.contoso.com/User Photos/ and the pictures are in the folder Profile Pictures. The thumbnails are named Account_LThumb.jpg for large, Account_MThumb.jpg for midsize and Account_SThumb.jpg for small. The midsize picture is used as the profile photo.
The original pictures are also saved in this library, but under a special name like 0c37852b-34d0-418e-91c6-2ac25af4be5b_RecordID.jpg. Here is the moment to say that we have only one, non-partitioned User Profile Service Application. The first part of the picture name, the GUID (exactly this GUID), is the GUID of the default partition of a non-partitioned UPSA. It is the same on all SharePoint 2013 farms I have seen. You can check it by doing a SQL query on the Profile DB, but only on a non-production farm, because querying this database is not supported. It is a column of every profile entry in the Profile DB.


The second part, the RecordID, is a unique ID for every user in the partition. It is just a number, not a GUID. You can see it in the picture above, and you can also get it with PowerShell as a property of the user profile object. So far, however, I have not found a way to see the PartitionID of the default partition of a non-partitioned UPSA in PowerShell.
When you run Update-SPProfilePhotoStore with the corresponding parameters, it reads the available .JPGs that have not yet been converted to thumbnails, reads the user's RecordID, creates the thumbnails, adds the URL of the picture to the user profile and, in the general case, deletes the original (depending on the parameters).
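The call itself is short; against the My Site host from the example above it looks like this:

Update-SPProfilePhotoStore -MySiteHostLocation "http://mysite.contoso.com" -CreateThumbnailsForImportedPhotos $true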
So this was our answer on how to upload the pictures from Production to Staging, get all three thumbnail variants and map the PictureURL property. One additional bonus of this method is that PictureURL is mapped to the URL of the biggest thumbnail version (not the midsize one, as it would be if imported from AD), so you end up with pretty profile pictures with good resolution.
And here came the idea for this script: what if there are SharePoint admins/customers who do not want to get the pictures from AD, but have all the profile pictures and want to mass upload them as if the users had uploaded them themselves?
So I came up with this script. You need to run it twice (as Administrator, under the Farm account). The first run generates a CSV file with the RecordIDs, login names and picture paths. The picture path will be empty; you should fill it with the local or network path to the picture. The picture can be any resolution and almost any format (JPG, PNG, GIF, BMP, TIFF); the script will convert it to JPEG and upload it under the corresponding name. When you have filled in the picture paths you want (it is not mandatory to fill in a path for every account), run the script again with the appropriate parameter for upload and update.
You can download the script from the link below; for instructions and examples see the help of the script, like this: Get-Help .\Upload-ProfilePhotos.ps1 -Full. I always put a fair amount of help topics in my scripts.

Download From: TechNet Gallery

Thursday 14 August 2014

Scripts to Add, Remove and View SharePoint 2013 Maintenance Windows

My colleague shared an MSDN blog post about a new class in SharePoint 2013. With this class you can configure a message that will appear on the sites, informing the users about an upcoming maintenance activity; this is done at content database level. The message can look like the picture below. The messages are predefined for several scenarios; you can find samples in the mentioned MSDN post. You can specify the maintenance start time and end time, the notification start time and end time, a link with information about the maintenance and a read-only period. This is a very cool feature, but I don't like the script that is shown in the post, so I wrote two scripts for adding, removing and viewing maintenance windows. You can choose an individual content database, site, web application or all content databases in the farm. You can use the pipeline to pass objects from other cmdlets, and the script will ask you for confirmation before adding or removing the maintenance windows. For more information see the help in the script; there are many examples of how to use it. To run the scripts you will need the Microsoft.SharePoint.PowerShell snap-in loaded.
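For reference, adding a window by hand follows roughly this pattern. Treat it as a sketch and check the exact property names against the MSDN post; the dates and the content database name here are just placeholders:

# Sketch only - verify the property names against the MSDN post before using
$window = New-Object Microsoft.SharePoint.Administration.SPMaintenanceWindow
$window.MaintenanceType = "MaintenancePlanned"
$window.NotificationStartDate = "2014-08-20 00:00"
$window.NotificationEndDate = "2014-08-23 00:00"
$window.MaintenanceStartDate = "2014-08-22 22:00"
$window.MaintenanceEndDate = "2014-08-23 06:00"
$window.MaintenanceLink = "http://intranet.contoso.com/maintenance"

$contentDb = Get-SPContentDatabase "WSS_Content"
$contentDb.MaintenanceWindows.Add($window)
$contentDb.Update()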

                            Download the scripts from: TechNet Gallery



Tuesday 12 August 2014

Site collection term set groups and how to reconnect a term set group to a site collection

When we go to the Term Store manager in SharePoint 2013 Central Administration we can see a bunch of term set groups with various term sets in them. These term set groups are visible in the entire farm - so far so good.
But we can also have a term set group that, out of the box, is visible/usable only by the site collection where it was created; these groups are sometimes referred to as site collection term set groups or local term set groups. We get such a term set group when we activate the "SharePoint Server Publishing Infrastructure" site feature, or by some manual method like in this Article. As I said, this term set group and the term sets and terms in it are initially visible/usable only for the site collection, and this is related to the problem I had this week and, of course, its solution.
As you have noticed, in most of my posts I use PowerShell and the SharePoint object model quite a lot, so I will show you how this site collection term set group looks in PowerShell and what we can do with it.
For demo purposes in this post I am going to use two HNSC (host-named site collections): one based on a team site (http://termsetT.auto.l) with no term set group, and a second one (http://termsetP.auto.l) with the publishing features activated, so it comes with a site collection term set group.
First we will see how to find the site collection term set group. This can be achieved with the snippet below.

Add-PSSnapin microsoft.sharepoint.powershell
$site = Get-SPSite http://termsetp.auto.l
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
$pubSiteGroup = $termStore.GetSiteCollectionGroup($site)

First we get an instance of our publishing site. Then we get an instance of SPTaxonomySession; this is done with the out-of-the-box cmdlet Get-SPTaxonomySession. This object gives us access to the term stores of the web application where our site is deployed, or to the site subscription in a multi-tenant scenario. In our case this is our Managed Metadata Service application. The term store has a method called GetSiteCollectionGroup; we give our site as the argument and here is the output:


There are a few interesting properties to look at. The first one is IsSiteCollectionGroup; it has the value True, so this group should be a site collection term set group for some site. The second one is SiteCollectionAccessIds; this tells us the ID of the site collection that can see and use the term sets and edit them - it is the ID of our publishing site. The third one is SiteCollectionReadOnlyAccessUrls; in my case this is the URL of my team site, which has no site collection term set group of its own. Now I am able to see the group from my team site and use the terms, but I am not able to edit the content of the group from the Term Store Management Tool in the team site - read-only access. You can actually grant read-only access from the Term Store Management Tool of the site collection that has full access: when you select the term set group, at the bottom you can see a section called Site Collection Access, where you can add the URLs of the sites you want to grant read-only access to.
Last week I received an email from one of our customers. They needed to restore one of their sites using the Restore-SPSite cmdlet, and afterwards their site collection term set group disappeared from the term store; they rely heavily on managed metadata to tag documents and to navigate in huge libraries. They wanted to get back the old group with the old terms, because a large number of documents were already tagged, and if they created new terms or imported them somehow, the terms would have different IDs, the tagged documents would have invalid tags, and navigation and filtering would not work.
The issue here was that they had overwritten the old site with the restore and now the site had a different ID. The old term set group was still there, but it was not associated with the site, or, to say it the correct way, the new site ID had no access to the old term set group. This can be solved by granting access to the new site ID. I did the same restore operation on my publishing site to reproduce the issue, and below is the snippet I used to grant access to the new ID. Note that if you do this with a publishing site you may receive an error like "failed to load the navigation because the term set was not attached correctly..."; you can fix it by switching from Managed to Structured navigation and then back to Managed, associating it with the correct term set. This however was not the case with the customer. Here is the snippet that grants access to the new site ID.

$site = Get-SPSite http://termsetp.auto.l
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
$group1 = $termStore.Groups['Site Collection - termsetp.auto.l-1']
$group1.AddSiteCollectionAccess($($site.ID))
$termStore.CommitAll()

The issue was clearly solved. However, I had a similar case with a different customer; the customer claimed that the site collection term set group just disappeared. I was able to get the old group in PowerShell, but nothing helped me to "bring it back to life", so I created a new group and, instead of trying to do something with the old group, I just moved the term sets that were in it to the new group via PowerShell, and everything worked fine. I created the new site collection term set group in PowerShell. Let's look at the method we used to get the group for a certain site in the first snippet, GetSiteCollectionGroup. If you give it a second argument that is a boolean (True/False), the method will check whether there is a site collection term set group, and if you pass True and there is no such group, it will create one with the out-of-the-box naming. If somewhere in the term store there is already a group with the same name, it will put a number at the end of the name. Remember that after doing changes in the term store you should use the CommitAll() method to save the changes to the database, as in the snippet below; a sketch for moving the term sets follows after it.

Add-PSSnapin microsoft.sharepoint.powershell
$site = Get-SPSite 'http://termsetT.auto.l'
$taxonomySession = Get-SPTaxonomySession -Site $site
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
$pubSiteGroup = $termStore.GetSiteCollectionGroup($site, $true)
$termStore.CommitAll()
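And, for completeness, moving an existing term set into the newly created group can be done roughly like this (the group and term set names are assumptions for the example):

$oldGroup = $termStore.Groups['Site Collection - termsett.auto.l - Old']
$newGroup = $termStore.GetSiteCollectionGroup($site)
$termSet = $oldGroup.TermSets['Document Tags']
$termSet.Move($newGroup)
$termStore.CommitAll()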


Sunday 27 July 2014

SharePoint 2013 Service Account creation script and how I write scripts.

In this post I am going to share a PowerShell script I wrote, called Create-SPServiceAccounts.ps1, for creating the accounts needed for a SharePoint Server 2013 installation. There are many similar scripts, but I don't like how they are written. I have also added an option for an input file with the data for the accounts (account name, description and password); it can read CSV, XML and Excel files as input. In addition, the accounts can be created remotely, running the script from a computer that is in the domain or not. I think the script is well written and contains some useful patterns and techniques for writing readable, reusable PowerShell scripts. In this post I will give an overview of how I write scripts and will try to explain some of the techniques I used in the service account creation script.

What tool am I using for writing scripts?

Most of the time I am using a free third-party ISE (Integrated Scripting Environment) tool called PowerShell Plus Professional by Idera. It has some great features like code highlighting and it is easy to run and debug scripts in it. A great feature is that you can copy the code as HTML; I find this very useful because I don't use any code highlighters in this blog - I just copy my code from the tool and paste it into the HTML source of the posts. The tool comes with a big script library for Windows, AD, SharePoint, SQL and many more. But it has some limitations, so I also use the native PowerShell ISE and, of course, the PowerShell console to run and debug.

How am I reading three different types of input files?

A bit of background: one of the best practices for PowerShell scripting is modularity. Every piece of code with distinct functionality that will be reused is organised into functions with their own parameters and scope. The code in this script that does the initial checks and logic is only 25 rows; the functions are 222 rows.
So I have three different functions for reading the different types of input files. The script checks the file extension and decides which function to use for reading the input file.

The function for CSV:

function Read-CSV{
 [CmdletBinding()]
Param (
    [parameter(Mandatory=$true)]
    [string]$Path,
    [parameter(Mandatory=$false)]
    [string]$DefaultPass
)
$csvInput = Import-Csv -Path $Path
$OutputAll = @()
ForEach($row in $csvInput)
{
    $Output = @{
        "AccountName" = $row.AccountName
        "Description" = $row.Description
        "Password" = $row.Password
    }
    If($DefaultPass){
        $Output['Password'] = $DefaultPass
    }
    If (Test-AccountData -Data $Output){
        $OutputAll += $Output
    }

}
Write-Output $OutputAll
}

The key for reading CSV files is the native cmdlet Import-Csv. It reads the file and creates an object for each row (excluding the header row) with properties named after the CSV columns and holding the corresponding values. Then I build a hash table for each object, and the function returns a collection of hash tables, one for every account that needs to be created.

The function for XML:

function Read-XML{
 [CmdletBinding()]
Param (
    [parameter(Mandatory=$true)]
    [string]$Path,
    [parameter(Mandatory=$false)]
    [string]$DefaultPass
)
[xml]$xmlInput = Get-Content $Path
$OutputAll = @()
ForEach ($xElement in ($xmlInput.ServiceAccounts.Account))
{
    $Output = @{
        "AccountName" = $xElement.AccountName
        "Description" = $xElement.Description
        "Password" = $xElement.Password
    }
    If($DefaultPass){
        $Output['Password'] = $DefaultPass
    }
    If (Test-AccountData -Data $Output){
        $OutputAll += $Output
    }

}
Write-Output $OutputAll
}

For reading XML I create a new XmlDocument object by reading the content of the XML file. Then we can work with every XmlElement that describes an account and take its properties. Again, as in the other functions, we create a hash table for every account, and when all the information is read the function returns a collection of hash tables.

The function for Excel files:

function Read-Excel{
 [CmdletBinding()]
Param (
    [parameter(Mandatory=$true)]
    [string]$Path,
    [parameter(Mandatory=$false)]
    [string]$DefaultPass
)
$objExcel = New-Object -ComObject Excel.Application
$objExcel.Visible = $false
$WorkBook = $objExcel.Workbooks.Open($Path)
$WorkSheet = $WorkBook.sheets | Where {$_.Index -eq '1'}
$intRowMax = ($WorkSheet.UsedRange.Rows).count
$OutputAll = @()
for($intRow = 2 ; $intRow -le $intRowMax ; $intRow++)
{
    $Output = @{
        "AccountName" = $WorkSheet.Range("A$($intRow)").Text
        "Description" = $WorkSheet.Range("B$($intRow)").Text
        "Password" = $WorkSheet.Range("C$($intRow )").Text
    }
    If($DefaultPass){
        $Output['Password'] = $DefaultPass
    }
    If (Test-AccountData -Data $Output){
        $OutputAll += $Output
    }
}
Write-Output $OutputAll
$objExcel.Quit()
(Get-Process -name excel -ErrorAction SilentlyContinue | Sort-Object StartTime)[-1] | Stop-Process -ErrorAction SilentlyContinue
Remove-Variable objExcel
}

For reading an Excel file (.xlsx) we need the Excel application installed. The script creates a COM Excel application instance, opens the file and chooses the first worksheet. Then, unfortunately, we need to read the content of each cell; as far as I know there is no way to treat every row as an object and take its content as we do with the CSV. So we take the count of the used rows, skip the first one, read every cell and create a hash table. This is done in a for loop driven by the number of the row we are reading. Again, the function returns a collection of hash tables. All three functions output the same kind of objects, so after the input file is processed we do not care what type it was; we just process every hash table in our collection.
For the record, killing the Excel.exe process is not the correct way of removing a COM object instance, but I had some weird behavior when I used the code below to get rid of the COM instance.

[System.Runtime.Interopservices.Marshal]::ReleaseComObject($x)

All three functions are very similar; they have similar parameters and outputs, so we do not have to care about input/output differences when we read the different input files.
For more information on PowerShell functions you can read about_Functions, about_Functions_Advanced and the other "about" topics in the PowerShell help.

Why so many Try/Catch constructions?

Try/Catch/Finally constructions are the preferred way of error handling in PowerShell. These statements were introduced in v2.0; before that only the Trap statement was available. Error handling is a big subject in PowerShell; I will try to explain how Try/Catch/Finally works with an example for creating a new AD user.


Try{
    New-ADUser -SamAccountName "JohnD1111111111111111111" -Name "John Doe" -AccountPassword $secPass -DisplayName "John Doe" -ErrorAction Stop
}
Catch [System.Exception]{
    Write-Host "Unable to create Account, following exception occurred: $($_.Exception.Message)" -ForegroundColor Red
}

In this sample we run the block in the Try statement. If there is an error/exception, the block in the Catch statement is executed. I have not included a Finally statement - it is not required - but if there was one, its code would be executed regardless of whether we had an exception or not. In our example New-ADUser will fail because the value for SamAccountName is longer than 20 characters. However, the Catch block is triggered only if the error produced in the Try block is terminating. Most PowerShell cmdlets produce non-terminating errors, which means that if the command fails the script/loop/pipeline will continue. Our cmdlet for creating a new AD user also produces a non-terminating error; this is why we explicitly set ErrorAction to Stop, which makes the error terminating. The whole idea of error handling is to identify that we have an error and do something about it. In this sample we will not see the error that the cmdlet produces; instead the Catch block shows a custom message that contains the message of the original exception and tells us what is wrong, without displaying the entire exception from PowerShell. And because the error is handled in a Try/Catch construction the whole script does not stop; it continues.

How is a locally defined function executed on a remote computer?

You can launch the account creation script from the DC where you want to create the accounts, or you can run it from a completely different computer and create the accounts on the target DC remotely.
This is achieved very easily: I execute the locally defined function that creates the accounts on the remote DC and also pass along the parameter values it will use. The line looks like this:

Invoke-Command -Session $session -ScriptBlock ${function:Create-UsersLocal} -ArgumentList $Hash,$OUnit

Our function for creating accounts when the script runs on the DC is Create-UsersLocal; I am just passing its definition via the ScriptBlock parameter and its parameter values via the ArgumentList.
And now, explained. Let's say we have a function called Hello-World - a very simple function that just says hello from the computer it runs on, in a different color. The color of the text can be set as a parameter. When we load the function it goes to a PSDrive called 'Function'; we can see on the screenshot below that this PSDrive contains our function. If we supplied just the path to the function, it would be executed on the remote computer as-is and would fail, because the function is not defined in the remote session. Our goal is to get the definition of the FunctionInfo object. This is done by adding a dollar sign when supplying the script block for the remote session; it forces reading the definition of the function and converts it to a script block usable in the remote session. We also transfer a variable with the same name as the parameter of the function, and this is how the magic happens. The sketch below shows the pattern in one place, and you can also see it on the screenshot further down.
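Put together, it looks roughly like this (the DC name is an assumption):

Function Hello-World{
    Param(
        [string]$Color = 'Green'
    )
    Write-Host "Hello World from $env:COMPUTERNAME" -ForegroundColor $Color
}

$session = New-PSSession -ComputerName "DC01.ilabs.local"
$Color = 'Yellow'
# ${function:Hello-World} reads the function definition so it can run in the remote session
Invoke-Command -Session $session -ScriptBlock ${function:Hello-World} -ArgumentList $Color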



For more information on PowerShell scripting I highly recommend visiting http://powershell.org/, especially their free eBooks section.

You can download my script from the    Technet Gallery