Monday 30 June 2014

RSS Feeds Web Part page in SharePoint

In this post I will not show anything new. Some of you may already know or use this technique, but I think it is a cool idea and decided to share it; it also serves as a proof of concept for my colleagues.
So, we have an internal project for rebuilding our corporate intranet. We are holding meetings to gather requirements from all departments across the company, and of course we had a meeting about our own SharePoint team site.
My idea was to create a place in our site where we can follow all the interesting posts and articles from the SharePoint community, because the SharePoint community is (luckily) a big part of our business. Here is a way to do this.
Let's say you have a site based on the default "Team Site" template. With this template you get a "Site Pages" wiki library. Go to the library via Site contents --> Site Pages, then go to FILES --> New Document and choose Web Part Page. Give the page an appropriate name and choose the desired layout.



You will then get a page with a zoned work space where you can place the key element: the RSS Viewer Web Part.
You can find the web part under Content Rollup --> RSS Viewer. Open the web part tool pane and set the RSS feed URL, refresh interval, feed limit and any other settings you want.


After you populate the zones you like, you can link the page in your navigation. You can see how it looks in the picture below. Of course you can use a different arrangement, add some Community feature web parts, or edit the display template of the RSS Viewer web part.
Note that your servers will need internet connectivity for the feeds to work.

Friday 27 June 2014

Get User Profile Synchronization connections in PowerShell and Advanced SharePoint functions in PowerShell

Maybe you have noticed that in SharePoint 2013 and SharePoint 2010 SP1 we have two cmdlets for profile synchronization connections: Add-SPProfileSyncConnection and Remove-SPProfileSyncConnection.
Contrary to the usual PowerShell verb pattern, however, there is no Get-SPProfileSyncConnection cmdlet to list the current connections. In this post I am going to show you a short PowerShell function I wrote that does this, and explain some good practices for developing advanced PowerShell functions for SharePoint. You can see the function below; it is really short.

function Get-SPProfileSyncConnection
{
    [CmdletBinding(DefaultParameterSetName="PSet1")]
    Param(
        [parameter(Mandatory=$true,
            ParameterSetName="PSet1",
            ValueFromPipeline=$true)]
        [Microsoft.SharePoint.PowerShell.SPServiceApplicationPipeBind]$UPA,
        [parameter(Mandatory=$true,
            ParameterSetName="PSet2",
            ValueFromPipeline=$false)]
        [System.String]$UPAName
    )
    PROCESS
    {
        # Resolve the User Profile Service Application from whichever parameter set was used
        switch ($PsCmdlet.ParameterSetName)
        {
            "PSet1" { $UPApp = $UPA.Read(); break }
            "PSet2" { $UPApp = Get-SPServiceApplication | Where {$_.Name -eq $UPAName}; break }
        }
        # Build a service context from the UPA's proxy group and get the connection manager
        $context = [Microsoft.SharePoint.SPServiceContext]::GetContext($UPApp.ServiceApplicationProxyGroup,[Microsoft.SharePoint.SPSiteSubscriptionIdentifier]::Default)
        $upcMan = New-Object Microsoft.Office.Server.UserProfiles.UserProfileConfigManager($context)
        Write-Output $upcMan.ConnectionManager
    }
}

Because of the specifics of the User Profile Service Application (UPA), the function must run under the farm account identity.
Now let's have a look at what is going on. In all cases we need a User Profile Service Application from which we can get the synchronization connection(s), or another service application whose proxy is in the same proxy group as our target UPA, so that we can get the service context; this should be supplied by the user as a parameter. In the parameter section of the function we define two parameters in two parameter sets: UPA in parameter set PSet1 and UPAName in parameter set PSet2. The script can get the desired User Profile Service Application from either parameter. If the user has supplied a value for UPAName, PSet2 is in use, and the script decides accordingly how to assign a value to $UPApp, the variable that represents our User Profile Service Application object. This logic is implemented with the PowerShell switch statement: if PSet2 is used, the script assigns the service application whose Name property is equal to the UPAName supplied by the user.
The interesting part is when the UPA parameter is used (PSet1). This parameter is set to receive its value from the pipeline via the parameter argument ValueFromPipeline=$true, meaning it takes an object from the pipeline if it has the same type as the parameter (binding byValue). Note that the type of the parameter is not, for example, Microsoft.Office.Server.Administration.UserProfileApplication; it is Microsoft.SharePoint.PowerShell.SPServiceApplicationPipeBind.
In the native SharePoint cmdlets the 'Identity' parameter can be supplied in multiple ways. For example, to get an SPSite with the Get-SPSite cmdlet, you can supply a URL, a GUID, or an entire SPSite object. If you need to get all sites from a web application, you use "Get-SPSite -WebApplication", and for the WebApplication parameter you can supply a URL, an ID, an entire SPWebApplication object, or use the pipeline, because the expected type is SPWebApplicationPipeBind. To avoid the need for multiple parameters and multiple parameter sets, Microsoft introduced the PipeBind objects. After you supply a valid object (one the PipeBind can convert: GUID, String, etc.), you convert it to the original object by calling the Read() method. You can have a look at the available PipeBind classes in the MSDN Library.
You may have issues with the above function in advanced scenarios with multiple proxies/proxy groups or in a partitioned (multi-tenant) environment (I haven't tested it there); it may be better to take the service context from an SPSite. Still, I think it is a good example of creating a SharePoint-style cmdlet, and it works well in my case.
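To illustrate both parameter sets, here is a usage sketch. The service application display name is a placeholder for whatever your farm uses; remember to run this under the farm account, as noted above:

$ Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# PSet2: pass the service application display name as a string (name is an assumption)
Get-SPProfileSyncConnection -UPAName "User Profile Service Application"

# PSet1: pipe the service application object; it binds byValue to the
# SPServiceApplicationPipeBind parameter
Get-SPServiceApplication | Where {$_.TypeName -like "*Profile*"} |
    Get-SPProfileSyncConnection

Either call should output the ConnectionManager, which lists the current synchronization connections.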


Monday 23 June 2014

SharePoint 2013 BCS Service Application Throttle Management

As we know, the BCS Service Application is the out-of-the-box way to connect SharePoint to external data sources. Once the external data connection is configured, we can do Create, Read, Update, Delete and Query (CRUDQ) operations over the data, crawl it, present it as a SharePoint list, and so on.
In this post I will not explain how to set up the service application or how to use it; instead I will show you how to configure BCS throttling in an on-premises, non-multi-tenant SharePoint 2013 environment. Since these settings work at the service application proxy level, I am not sure how this throttling works (if it exists) in SharePoint Online, and I was unable to find any official information for multi-tenant SharePoint 2013. In a multi-tenant SharePoint 2010 environment you cannot configure BCS throttling per tenant; if you want to do so, you can set up a unique proxy for each tenant and associate those proxies with their respective web applications.
BCS throttling is enabled by default to protect against denial-of-service attacks, or simply to limit the performance impact of a poorly designed solution on SharePoint or on the external data provider system, for example by stopping huge transactions. It can be triggered by other conditions too, as we will see below. You may, however, have a business need to change the values of the throttling rules, or you may consider turning them off. These rules are different from the "Resource Throttling" options in Central Administration, from search crawl throttling, and from the Office 2013 client application limits. Before editing the default throttling rules, please consider defining filters when you create the External Content Type, if applicable in your case; for more information on the subject see the following MSDN article.
There are four throttle types and five throttle scopes. The throttle type is the 'metric' that triggers the rule, and the throttle scope is the connector type the rule applies to. In the tables below you can see the types, scopes and the applied rules.

Throttle Types:

- Items: the number of records returned
- Size: the amount of data returned, in bytes
- Connections: the number of connections opened to the database, web service, or .NET assembly
- Timeout: the time until an open connection is terminated, in milliseconds

Throttle Scopes:

Scope Meaning
Global Applies to Database, Web Service, WCF, and .NET Assembly Connectors (not to Custom Connectors)
Database Applies to Database Connectors
WebService Applies to Web Service Connectors
Wcf Applies to WCF Connectors
Custom Applies to Custom Connectors

Throttle Rules:

[table image: the default throttle rules, shown as a matrix of throttle scopes (Global, Database, WebService, Wcf) against throttle types (Items, Size, Connections, Timeout); not every combination has a rule]

You can see that there are no rules for the "Custom" scope. This is because this scope is reserved for advanced scenarios with custom connectors.
This may sound a bit confusing (it was confusing to me the first time), but I will try to make it clearer.
We cannot manage the BCS throttling rules from the web interface or SharePoint Designer; we can only do this in PowerShell or via the public APIs, and it should be done from one of the SharePoint servers. As mentioned earlier, these settings work at the proxy level, so we first need the BCS Service Application proxy in order to see or change the throttling settings.
With the lines below you can see the rule with Type: Items in Scope: Database:

$bcsProxy = Get-SPServiceApplicationProxy | where {$_.GetType().FullName -eq ('Microsoft.SharePoint.BusinessData.SharedService.' + 'BdcServiceApplicationProxy')}

Get-SPBusinessDataCatalogThrottleConfig -Scope Database -ThrottleType Items -ServiceApplicationProxy $bcsProxy

And here is the outcome of this command:


The Enforced property shows whether the rule is enabled. The Default value is the limit applied to external lists; it can be overridden by custom web parts, but those are still limited by the Max value.

Now, what will happen if we keep the "Database" scope but choose ThrottleType "Connections"?


We receive an error telling us that no such rule exists for the combination of Scope: Database and Type: Connections.
We have only two cmdlets, Get-SPBusinessDataCatalogThrottleConfig and Set-SPBusinessDataCatalogThrottleConfig, so there is no cmdlet that can create a new configuration. For the Database connector scope we can use only throttling by Items (rows returned by the database query) and Timeout. We can use the "Connections" throttle type, but only in combination with the "Global" scope, and it will then apply to the rest of the connector types as well (see the scope table).

Below you can see the default values of the 'Default' and 'Maximum' settings of the throttling rules:



In the following example we have an External Content Type (ECT) that reads a table from a Microsoft SQL Server database. The ECT has no filtering applied and the table contains 2517 rows. We create an External List app to visualize the data from the ECT. The BCS throttling rules have their default values, and we receive the message below (you can see it in the ULS log as well):



Here comes the Set-SPBusinessDataCatalogThrottleConfig cmdlet; you can use the snippets below to manipulate the throttling rule:


$bcsProxy = Get-SPServiceApplicationProxy | where {$_.GetType().FullName -eq ('Microsoft.SharePoint.BusinessData.SharedService.' + 'BdcServiceApplicationProxy')}

$dbRule = Get-SPBusinessDataCatalogThrottleConfig -Scope Database -ThrottleType Items -ServiceApplicationProxy $bcsProxy

#Default and Maximum must be provided together. This increases the limit for external lists to 3000. 
Set-SPBusinessDataCatalogThrottleConfig -Identity $dbRule -Maximum 1000000 -Default 3000

#This disables a throttling rule. 
Set-SPBusinessDataCatalogThrottleConfig -Identity $dbRule -Enforced:$false

#This enables a throttling rule. 
Set-SPBusinessDataCatalogThrottleConfig -Identity $dbRule -Enforced:$true

After the change of the throttling rule we get all 2517 items from our table. The change may not take effect immediately; you may need to execute Get-SPBusinessDataCatalogThrottleConfig against the rule you changed.


Sunday 15 June 2014

User Profile Synchronization reporting script

The User Profile Synchronization Service in SharePoint has always been a sensitive topic. For full-featured SharePoint profile synchronization we need this service instance running. In SharePoint 2013 there is a lightweight "mode" called SharePoint Active Directory Import that leverages the System.DirectoryServices .NET classes; when we use this mode, we do not need the User Profile Synchronization Service running.
Under the hood SharePoint relies on a version of Forefront Identity Manager (FIM) to do the profile synchronization. Unfortunately, sometimes the headache does not stop once you manage to properly start or restart the service instance, and profile synchronization issues can be very visible to users.
So, we have a customer that uses classic SharePoint profile synchronization with SharePoint 2013 to sync from AD.
The customer has a large number of domains hosted at remote sites around the globe. There is nothing like read-only domain controllers close to SharePoint that could replicate from the remote domains, so SharePoint has to contact the remote DCs to do the profile synchronization.
Because of the distance between SharePoint and the domains it syncs from, we often end up with unsuccessful profile synchronization operations for some of the domains.
In this case we need a way to be notified of such issues in time and take action. It would be nice if SharePoint provided something like the crawl log, where you can see the status of each operation.
We can see the status if we go to the server where the User Profile Synchronization Service is running and open "C:\Program Files\Microsoft Office Servers\15.0\Synchronization Service\UIShell\miisclient.exe".



Some of the events can also be logged in the Event Viewer of the server.


I did some research and found that there is a WMI namespace for Microsoft Identity Integration Server (MIIS) with some promising classes like ManagementAgent and RunHistory, and the WMI queries can be executed remotely. For each synchronization connection in the User Profile Service Application there is a FIM Management Agent (MA) that links the specific connected data source to FIM; the Management Agent is responsible for moving the data between the data source and FIM. For the usual setup, where SharePoint synchronizes profile information from AD DS, the agent is named MOSSAD-*CONNECTION_NAME*. For example, if we have a synchronization connection named CONTOSO.COM, the name of the MA will be MOSSAD-CONTOSO.COM.
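As a quick check before building anything bigger, you can list the management agents remotely with the MIIS_ManagementAgent class from the same namespace. The server name below is a placeholder, and the account running the query needs WMI rights on the synchronization server:

Get-WmiObject -ComputerName 'spfimsync.contoso.net' `
    -Namespace root/MicrosoftIdentityIntegrationServer `
    -Class MIIS_ManagementAgent |
    Select-Object Name, Type

In a SharePoint profile sync setup you should see the MOSSAD-* agents described above, plus the MOSS-* agent for the SharePoint side.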
So I wrote a script that reports on the synchronization operations for a given period. The report is a CSV file, and it can also be mailed as an HTML table. Let's say you have scheduled an incremental synchronization at 1:00 AM, the entire process usually finishes within an hour, and you want the status of the operation when it finishes. You can schedule the script to run at 2:30 AM with these parameters:

.\Fim-SyncReport.ps1 -ServerName 'spfimsync.contoso.net' -Hours 3 `
-FromMailAdress 'spupsreporter@contoso.com' `
-MailAdress 'aaronp@contoso.com','sp-admin-team@contoso.com' `
-SMTPServer 'mailer.contoso.com' `
-ReportLoction '\\fileserver\fimnightreport'

This will run the WMI query on the server where the User Profile Synchronization Service instance is running, save the report for the last 3 hours to a file share, and send the report via mail. The report contains the MA name, status, synchronization profile, when the operation started, and the number of sync errors, discovery errors and retry errors. The status field is very important: the script takes all operations whose status is not "success", and the status can be very helpful when you troubleshoot synchronization issues. The complete list of RunStatus return strings can be found in this MSDN article. If all operations succeeded, no report is generated and no mail is sent.
The report looks like this when it is sent via mail:



The key function from the script is:

function Get-FimMARuns
{
 [CmdletBinding()]
Param(
    [parameter(Mandatory=$true)]
    [string]$MaName,
    [parameter(Mandatory=$true)]
    [string]$Hours,
    [parameter(Mandatory=$true)]
    [string]$ComputerName
)
Process
{
    $timeSpan = New-TimeSpan -Hours $Hours
    $nowUTC = (Get-Date).ToUniversalTime()
    $timeToStart = $nowUTC.Add(-$timeSpan)
    $filter = ("MaName = '{0}'" -F $MaName)
    $allHistory = Get-WmiObject -ComputerName $ComputerName `
                  -Class MIIS_RunHistory `
                  -Namespace root/MicrosoftIdentityIntegrationServer `
                  -Filter $filter
    ForEach ($history in $allHistory)
    {
        $startTimeinDateTime = $history.RunStartTime | Get-Date
        if ($startTimeinDateTime -gt $timeToStart)
        {
            Write-Output $history
        }
    }
}
}

In this function we use the MIIS_RunHistory class to get all the runs for a given MA name (we get all MAs starting with 'MOSS' in a different function). Then we filter by start time, so we keep only the runs that fit the timeframe of our report.
When we have all the operations we need, we can read the information from them. The class returns an object for every run (a run history entry). We can see the full details with this method:

[xml]$asXML = $faultyOp.RunDetails().ReturnValue

Each history entry has a RunDetails() method whose result exposes a ReturnValue property. When we read this property we receive an XML document with all the details of the run, from which we can extract whatever value we want. You can get the script from the link below and edit it (specific filtering by MA, sync profile, status, etc.) as you find useful in your case. Just remove the underscore from the file extension; it is a zip file containing the script and the mail body template.
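As a sketch of what you can do with that XML, the snippet below pulls a few values out of it. The element names are assumptions based on the run details schema, so inspect the raw output on your server first and adjust accordingly:

[xml]$asXML = $faultyOp.RunDetails().ReturnValue

# Inspect the raw XML first to confirm the actual structure
$asXML.OuterXml

# Element names below are assumptions; adjust them to what you see in the output
$runDetails = $asXML.SelectSingleNode('//run-details')
$runDetails.'ma-name'
$runDetails.'step-details' | Select-Object 'step-number', 'start-date', 'end-date'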

Download From: TechNet Gallery

Tuesday 10 June 2014

ULS viewer is gone...

If you have lately tried to download our favorite ULS log viewer (yes, the one in the picture below),
you have noticed that it is not available anymore!

As Todd Klindt told us in his netcast, with the retirement of the MSDN Archive Gallery we lost the ULS Viewer tool. Every SharePoint professional knows how precious this tool is for troubleshooting SharePoint.
There is a site called bringbackulsviewer.com where you can fill in a survey with only two questions:
whether you want Microsoft to re-publish the ULS Viewer, and whether you would like Microsoft to publish its source code.

Meanwhile, if you do not have the ULS Viewer, you can download it from Todd Klindt's mirror here (just remove the _ after you download it), or you can download it from my link.




Monday 9 June 2014

SharePoint 2013 Quota Template creation script

As a SharePoint administrator I frequently get requests to set up new site collection quotas and associate site collections with them.
For me, this task is very annoying when done via the UI in Central Administration.
The problem is that there are no out-of-the-box PowerShell cmdlets to do this. So I created a script that can create a new quota template and associate it with one or more site collections.
You just need to give a name for the new quota template (the script first checks whether a quota with this name already exists) and a maximum storage level. Optionally you can give a warning level; if it is not specified, the script sets it to 80% of the maximum value.

The script creates the quota template via the object model, and the quota assignment is done with the Set-SPSite cmdlet.


function CreateQuotaTemplate ($qName, $qMaxLevelMB, $qWarnLevelMB)
{
    $quotaTemplate = New-Object Microsoft.SharePoint.Administration.SPQuotaTemplate
    $quotaTemplate.Name = $qName
    # The storage levels are specified in bytes, so convert from megabytes
    $quotaTemplate.StorageMaximumLevel = ($qMaxLevelMB * 1024) * 1024
    $quotaTemplate.StorageWarningLevel = ($qWarnLevelMB * 1024) * 1024
    # Quota templates are stored in the content web service's template collection
    $contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    $contentService.QuotaTemplates.Add($quotaTemplate)
    $contentService.Update()
}


Below you can see how the new quota is created. It may not appear immediately on the Central Administration page, but you can see it by listing all quota templates with the following line:


[Microsoft.SharePoint.Administration.SPWebService]::ContentService.QuotaTemplates



As I said above, you can create a quota template and directly apply it to one or more site collections.
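As a sketch, the create-and-apply flow might look like this; the template name and site URL are placeholders:

# Create the template with the function shown above (5 GB max, 4 GB warning)
CreateQuotaTemplate -qName 'TeamSites-5GB' -qMaxLevelMB 5120 -qWarnLevelMB 4096

# Set-SPSite accepts the quota template by name and applies it to the site collection
Set-SPSite -Identity 'http://portal.contoso.com/sites/teamA' -QuotaTemplate 'TeamSites-5GB'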



Feel free to use and modify the script to best suit your needs. I am using it on SharePoint 2013 with PowerShell 4.0.


The "Sign in as Different User" feature missing from SharePoint 2013

Most probably you have noticed that the "Sign in as Different User" option is missing from SharePoint 2013.
There are certain reasons for this.
There are many posts on the subject, for example the one by Stefan Goßner, where he describes some of the issues with using this option on SharePoint 2010. When you use it, SharePoint issues an artificial 401 response to prompt the user for new credentials, but the artifacts (cookies, session variables, etc.) of the previous user are not cleaned up. If you are interested in the topic, check KB2435214.
But as administrators we sometimes need a quick and handy way to sign in as a different user in a non-production environment for testing purposes. In this post I am going to describe three ways to achieve this.

Approach 1:
   Just call this URL:

http://siteurl/_layouts/closeConnection.aspx?loginasanotheruser=true

As described in KB2752600

Approach 2 :
  This method is not recommended, as it requires editing a file under the 15 hive folder, and the change can be overwritten by a farm update.

  1. Locate the following file under your 15 hive folder: \TEMPLATE\CONTROLTEMPLATES\Welcome.ascx

  2. Open it in your editor of choice and add the element below just before the existing "ID_RequestAccess" item:


<SharePoint:MenuItemTemplate runat="server" ID="ID_LoginAsDifferentUser" Text="<%$Resources:wss,personalactions_loginasdifferentuser%>" Description="<%$Resources:wss,personalactions_loginasdifferentuserdescription%>" MenuGroupId="100" Sequence="100" UseShortId="true" />
  3. Save the file

Approach 3 :

  There is a free solution called "Alt Login" developed by Kaboodle.
  You can find more information and download the WSP package here. Just deploy the solution, activate the feature, and you will have "Sign in as Different User" in SharePoint 2013.






These are my three options when I need to sign in as a different user in a non-production test environment.