
Anders Rask on SharePoint


Random ramblings from a SharePoint Solution Architect and Developer
April 18
Slides and code from my Azure Global Bootcamp 2016 session

The slides from this weekend's session at Azure Global Bootcamp 2016 Aarhus are now available for download here.

It was nice to see so many people showing interest in Application Insights and all the other Azure sessions. :)

Thanks a lot to our sponsors ProActive, Klestrup Partners, Microsoft, and ITC for cake and service :)

Also thanks to Peter Selch for arranging this excellent event :)

http://aarhus-azurebootcamp.azurewebsites.net/

http://global.azurebootcamp.net/

March 14
Slides and code from my Office 365 Saturday Copenhagen 2016 session

This Saturday I did a session at Office 365 Saturday: Monitoring SharePoint Usage and Performance using Application Insights.

The session went OK, but the demos weren't that exciting, since Azure Application Insights (which is still in preview) was down during my session. I knew they were doing updates on the service, so I had taken some screenshots beforehand.

Please find my presentation and the PowerShell/JS used in the demos in this zip.

I also made the presentation available on SlideShare.

March 31
Cisco AnyConnect and Windows 8.1

Since the company I work for works with a lot of different clients, most of my work happens in remote desktops on different client networks.

Hence I have created a dedicated VPN client machine on Hyper-V that I use for connecting to these networks with different VPN clients.

Since most customers disable running a VPN from within a remote desktop session, you need to connect directly from a host session in Hyper-V rather than connecting to the machine remotely.

I recently upgraded that machine to Windows 8.1 Enterprise x86, and noticed that even after upgrading my Cisco AnyConnect to the latest release (which works with Windows 8.1), my session was detected as a remote desktop and the connection was rejected with the error "VPN Establishment capacity from a remote desktop is disabled. A VPN connection could not be established".

One of the new features in Windows 8.1 is the ability to copy/paste directly into the Hyper-V host session, which was previously something you could only do in Remote Desktop. This turned out to be the culprit: once I disabled Enhanced Session from View > Enhanced Session and logged in again, the VPN connection worked fine!

June 05
Getting an overview over used accounts on a SharePoint 2010 farm

In SharePoint 2010, accounts come in two flavours: managed accounts and service accounts. This means we have to dig around a bit to get an overview of which accounts are actually in use. Since some are managed accounts and some are not, the PowerShell commands below will in some cases return the same accounts. Think of the commands as a quick shortcut to see where certain accounts are used in your farm.

First off, you can get an overview of the existing managed accounts simply by typing

Get-SPManagedAccount

This, however, does not tell you where an account is used, so let's dig a bit deeper.

First, let's see where we should expect accounts to surface. The list below is probably not complete, so drop me a comment and I will add any accounts I have missed:

  • Service Application Pools (managed accounts)
  • Service Applications  (mostly managed accounts)
  • Web Application Pools (managed accounts)
  • Service instances - (mostly managed accounts)
  • Services (like SPTimerV4)
  • Object cache accounts (reader and user)
  • Search crawler account (managed account)

<Update>

Get Farm administrators

Find the farm administrators using the following pipeline:

Get-SPWebApplication -IncludeCentralAdministration |
    ? IsAdministrationWebApplication |
    Select -Expand Sites |
    ? ServerRelativeUrl -eq "/" |
    Get-SPWeb |
    Select -Expand SiteGroups |
    ? Name -eq "Farm Administrators" |
    Select -Expand Users

 </Update>

Service Application Pool accounts

Using the cmdlet

Get-SPServiceApplicationPool

gives you both the service application pool name and the process account name.
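For a compact listing you can project just those two properties; a minimal sketch, assuming the pools returned expose a ProcessAccountName property (as SPIisWebServiceApplicationPool does):

Get-SPServiceApplicationPool | Select Name, ProcessAccountName
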

Service Application accounts

To find out what service application pools are used for a given service application use this command:

Get-SPServiceApplication | select -expand applicationpool -EA 0

Note that -EA 0 (short for -ErrorAction SilentlyContinue) will swallow the exceptions caused by the fact that not all service applications are web based (i.e. inherit from SPIisWebServiceApplication) and therefore do not all have an application pool.
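If you would rather filter than suppress errors, a sketch that only touches the web-based service applications could look like this (the PoolAccount column name is my own; the type check and ApplicationPool/ProcessAccountName properties are assumptions based on the object model):

# Only IIS-hosted service applications expose an application pool
Get-SPServiceApplication |
    ? { $_ -is [Microsoft.SharePoint.Administration.SPIisWebServiceApplication] } |
    Select DisplayName, @{ Name = "PoolAccount"; Expression = { $_.ApplicationPool.ProcessAccountName } }
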

A special case to be aware of is the User Profile Synchronization Service connection. This account is not managed, and can be a bit tricky to find using PowerShell.

First get hold of the UserProfileConfigManager, then select the connection manager and get the account name:

$configManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileConfigManager( $(Get-SPServiceContext http://yourSite))
$configManager | select -expand connectionmanager | select AccountUserName

Web Application Pool accounts

Getting to the web application pools is not as straightforward, as they do not have dedicated cmdlets like the service application pools. To access the existing web application pools we use the Content Service:

[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplicationPools | Select Name, Username

If you want to find out which application pools, and hence accounts, are used by existing web applications, that is pretty straightforward:

Get-SPWebApplication | select -expand applicationpool | Select name , username

 

Service Instance accounts

The command to get these is a bit long-winded, to account for the fact that some of the accounts are managed and some are not:

Get-SPServiceInstance | Select -Expand Service | % {
    if ($_.ProcessIdentity -is [string]) { $_.ProcessIdentity }
    elseif ($_.ProcessIdentity) { $_.ProcessIdentity.UserName }
}

 

Services

Get-Process does not include information about the accounts the services are running under, so getting this information requires us to dig a bit deeper.

Fire up PowerShell and type in the following:

Get-WmiObject -Query "select * from win32_service where name LIKE 'SP%v4'" | select name, startname

This should give you output like this:

name               startname
----               ---------
SPAdminV4          LocalSystem
SPTimerV4          CONTOSO\svcSPFarm
SPTraceV4          NT AUTHORITY\LocalService
SPUserCodeV4       CONTOSO\svcSPUserCode
SPWriterV4         CONTOSO\svcSPFarm

The names of the other SharePoint services end with "14":

Get-WmiObject -Query "select * from win32_service where name LIKE '%14'" | select name, startname 

Object cache accounts

These accounts are used for accessing cached data. Not setting them causes a performance overhead, as explained here.

The values are stored in the Web Application properties and can be fetched like this:

Get-SPWebApplication| % {$_.Properties["portalsuperuseraccount"]} 

Get-SPWebApplication| % {$_.Properties["portalsuperreaderaccount"]}
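Setting the two accounts is done through the same property bag; a minimal sketch (the URL and account names are placeholders, and the claims-encoded format shown only applies to claims-authenticated web applications):

$wa = Get-SPWebApplication http://yourWebApp
$wa.Properties["portalsuperuseraccount"]   = "CONTOSO\svcSuperUser"
$wa.Properties["portalsuperreaderaccount"] = "CONTOSO\svcSuperReader"
$wa.Update()

Note that both accounts also need a user policy on the web application (Full Control for the super user, Full Read for the super reader) for the cache to work.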

 

Search crawler account 

Setting this account can be done using Set-SPEnterpriseSearchServiceApplication -DefaultContentAccessAccountName, but querying it is a bit tricky:

New-Object Microsoft.Office.Server.Search.Administration.Content $(Get-SPEnterpriseSearchServiceApplication) | Select DefaultGatheringAccount

 

Conclusion 

The commands above should give you an overview of where your accounts are used. There are more accounts not listed here, for example the accounts used for the Secure Store and the unattended service accounts for services like Visio, but most are covered above.

If I have forgotten some important accounts, or if you see something blatantly wrong in the above, feel free to comment :-)

March 20
Importing and Exporting Search Configuration Settings in SharePoint 2013

It is often useful when you do SharePoint development to be able to deploy indexed and managed properties and property mappings in a repeatable way across environments.

In SharePoint 2010 we did this as part of our PowerShell provisioning, but things have changed a bit in SharePoint 2013. Not only have the managed properties changed quite a bit with the introduction of Managed Navigation, but as a developer and architect you also really need to think cloud first if you want your deployment frameworks to work both on premises and in the cloud.

In the Client Side Object Model (CSOM) there is a framework for managing search settings, namely the SearchConfigurationPortability class.

This class makes it possible to import and export search configuration settings as XML. This includes Crawled Properties, Managed Properties and Mappings, but also Query Rules, result types etc.

Here is some sample PowerShell code to export the search configuration settings to XML: 

[reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint.Client") | Out-Null
[reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.search") | Out-Null
$context = New-Object Microsoft.SharePoint.Client.ClientContext("http://intranet")
$searchConfigurationPortability = New-Object Microsoft.SharePoint.Client.Search.Portability.searchconfigurationportability($context)
$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.searchobjectowner($context,"SSA")
$value = $searchConfigurationPortability.ExportSearchConfiguration($owner)
$context.ExecuteQuery()
[xml]$schema = $value.Value
$schema.OuterXml | Out-File schema.xml -Encoding UTF8

Note that I export the configuration at the SSA level. You can use the SearchObjectLevel enum to decide from what level you want to grab the settings: SSA, SPSiteSubscription, SPSite or SPWeb.

So to import the settings again on another environment we can do something like this:

[reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint.Client") | Out-Null
[reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.search") | Out-Null
$context = New-Object Microsoft.SharePoint.Client.ClientContext("http://intranet")

$searchConfigurationPortability = New-Object Microsoft.SharePoint.Client.Search.Portability.searchconfigurationportability($context)
#$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.searchobjectowner($context,"SSA")
$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.searchobjectowner($context,"SPSite")
[xml]$schema = gc .\schema.xml
$searchConfigurationPortability.ImportSearchConfiguration($owner,$schema.OuterXml)
$context.ExecuteQuery()

 

So the idea here is that you configure your SSA as you want it, with crawled properties, managed properties, mappings, query rules, etc., and then import the settings into your "next" environment, be that TEST, PREPROD or PROD. This is especially helpful when you use TFS and Lab Management to spin up environments and test your code and deployment.

The XML looks something like this (edited, since it is huge):

 <SearchConfigurationSettings xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Portability">
  <SearchQueryConfigurationSettings>
    <SearchQueryConfigurationSettings>
      <BestBets xmlns:d4p1="http://www.microsoft.com/sharepoint/search/KnownTypes/2008/08" />
      <DefaultSourceId>00000000-0000-0000-0000-000000000000</DefaultSourceId>
      <DefaultSourceIdSet>true</DefaultSourceIdSet>
      <DeployToParent>false</DeployToParent>
      <DisableInheritanceOnImport>false</DisableInheritanceOnImport>
      <QueryRuleGroups xmlns:d4p1="http://www.microsoft.com/sharepoint/search/KnownTypes/2008/08" />
      <QueryRules xmlns:d4p1="http://www.microsoft.com/sharepoint/search/KnownTypes/2008/08" />
      <ResultTypes xmlns:d4p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration" />
      <Sources xmlns:d4p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration.Query" />
      <UserSegments xmlns:d4p1="http://www.microsoft.com/sharepoint/search/KnownTypes/2008/08" />
    </SearchQueryConfigurationSettings>
  </SearchQueryConfigurationSettings>
  <SearchRankingModelConfigurationSettings>
    <RankingModels xmlns:d3p1="http://schemas.microsoft.com/2003/10/Serialization/Arrays" />
  </SearchRankingModelConfigurationSettings>
  <SearchSchemaConfigurationSettings>
    <Aliases xmlns:d3p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration">
       ...
    </Aliases>
    <CategoriesAndCrawledProperties xmlns:d3p1="http://schemas.microsoft.com/2003/10/Serialization/Arrays">
      ...
    </CategoriesAndCrawledProperties>
    <CrawledProperties xmlns:d3p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration" i:nil="true" />
    <ManagedProperties xmlns:d3p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration">
      ...
    </ManagedProperties>
    <Mappings xmlns:d3p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration">
      ...
    </Mappings>
    <Overrides xmlns:d3p1="http://schemas.datacontract.org/2004/07/Microsoft.Office.Server.Search.Administration">
      <d3p1:LastItemName i:nil="true" />
      <d3p1:dictionary xmlns:d4p1="http://schemas.microsoft.com/2003/10/Serialization/Arrays" />
    </Overrides>
  </SearchSchemaConfigurationSettings>
</SearchConfigurationSettings>

The crawled properties will show up in the CategoriesAndCrawledProperties section. 

NOTE: I only got the Mappings part of the search configuration settings to work for SearchObjectLevel SPSite, not SSA. I will look further into this, but if anyone has an idea as to why this does not work, throw me a comment!

The error I get if I use SSA when importing Mappings is:

Exception calling "ExecuteQuery" with "0" argument(s): "Search schema could not be successfully imported:
*  Search schema could not be successfully imported:
*  The master schema can't be imported.
"
At C:\Users\spinstall\Downloads\csomsearch\importSearchConfiguration.ps1:28 char:1
+ $context.ExecuteQuery()
+ ~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : ServerException

 

Thanks to my brainy colleague Allan Hvam for the boilerplate code.

January 15
Throttling performance of your SharePoint 2013 Enterprise Search Service on your dev box

If you have set up a SharePoint 2013 farm, for example as your development box, chances are you have wondered where all that good RAM went, and why performance is so slow once you start adding service applications to your farm.

A check in Process Monitor reveals that a fair chunk of RAM is used by a number of processes called noderunner.exe (Microsoft® SharePoint® Search Component). These are the components you configured when creating your search service topology. If you did this in PowerShell, you can decide which components run on which server (e.g. the Analytics Processing or Admin component).

In the beta these processes had a memory leak, and the "cure" was to limit the memory used by changing noderunner.exe.config. This, however, is *not* supported and not recommended, so don't do it in a production environment! If your components run out of memory they will start acting very weird and/or crash. I have even managed to kill my farm beyond repair by setting the memory limit too low, so consider yourself warned!

Another option for your dev farm is to change your performance level of your search service using

Set-SPEnterpriseSearchService -PerformanceLevel Reduced

This will reduce the maximum number of threads to the number of available processors. Note that this only has an effect on the server instance running the crawl component.
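To verify the change, you can read the current level back; a minimal sketch, assuming the SharePoint snap-in is loaded:

# PerformanceLevel is Reduced, PartlyReduced or Maximum
Get-SPEnterpriseSearchService | Select PerformanceLevel
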

For more information on this topic read "Scale search for performance and availability in SharePoint Server 2013" on TechNet and the guide "Enterprise search architectures for SharePoint Server 2013".

October 28
Configuring LDAP filters in AD Import

So if you have played around a bit with SharePoint 2013, you might have noticed one of the new features: AD Import for User Profile Service Application.

This baby is lightning fast! I haven't tried it on RTM yet against a serious AD, but it is safe to say it is way, way faster than its FIM counterpart, the User Profile Synchronization Service instance!

Instead I will show how to set up LDAP filters, how they affect which profiles are imported, and how they affect already imported profiles (or don't).

I will also talk a bit about a couple of problems that you may bump into when working with AD Import combined with LDAP filters.

First off, a quick guide on how to set up AD Import on an already configured UPA (if you are interested in further details on how AD Import works, check out Spence Harbar's post on the subject here).

After setting up your User Profile Service Application, click Configure Synchronization Settings

configureADimport.JPG

and select "Use SharePoint Active Directory Import".

configureADimport2.JPG


Fair warning: If you already have created a User Profile Synchronization connection that uses UPS, this and all its mappings will be deleted at this point!

It is true that articles mention that you can "swap between" UPS and AD Import, but it comes at a price: all your mappings will be gone! If you provision and export those using PowerShell and XML (as I do), and those XML files are up to date with whatever configuration your users have been making, this is fine! But if not, you are out of luck! Anyone who has tried setting up mappings manually, feeling like an idiot clicking those small "up" and "down" arrows to move properties on the public profile page, will know what I am talking about :-)

Now all we need to do is create a new Synchronization Connection:

configureADimport3.JPG

You can only select Active Directory Import. Authentication provider can be Windows, Forms or Trusted Claims Provider.

At the very bottom you can enter LDAP filters in a tiny text box (almost comically small, considering how large those LDAP filters can grow, really!).

adimport.JPG



<EDIT>

As Spence pointed out after I published this post, there is a nice little check box for filtering out disabled users.

The quirks are the same when you use the checkbox, though, and for other filters as well, so the conclusion (if there is one) stands :)

</EDIT>

First, try to set up your connection without filters: add domain name, user name and password (remember that the account must have Replicating Directory Changes permissions, just like with UPS; nothing changed there!), click Populate Containers and select the OUs you want to include in your synchronization.

Click OK and go back to the User Profile management page. Here you select Start Profile Synchronization and select a Full Synchronization for the initial sync.

configureADimport4.JPG

So while that one finishes... uhh, wait!?! It is already finished! Wow, that was fast, huh? ;-) Nothing slow about this baby compared to the FIM sync!

So next step is LDAP filters! What is the syntax?

It is probably no surprise to the admin guys out there that LDAP filters are no new invention; there is a ton of info out there on the syntax of these bad boys.

This is the kind of stuff you would set up back in MOSS on your SSP import connection, and the good old KB article from back then still applies on how to filter out disabled accounts: How to import user profile information of enabled user accounts from Active Directory to SharePoint.

A thing to point out here, which may or may not seem illogical to you, is that in LDAP filters you don't specify which items to exclude, but which to include (in contrast to UPS exclusion filters). So in order to tell the synchronization connection to filter out disabled users, we need to tell it to only include enabled users:

(&(objectClass=User)(!userAccountControl:1.2.840.113556.1.4.803:=2))

The thing to note here is the "!" (or "not", if you happen to be a developer). In plain wording the filter says: "give me all users who are not disabled". Hint: you can use ADSIEdit.exe to check the userAccountControl attribute on your AD objects.
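Before pasting a filter into the connection, it can be handy to verify it directly against AD; a sketch using the built-in dsquery tool on a domain-joined machine (attribute list and -limit 0, meaning "no result cap", are just illustrative choices):

dsquery * -filter "(&(objectClass=user)(!userAccountControl:1.2.840.113556.1.4.803:=2))" -attr sAMAccountName userAccountControl -limit 0
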

So why use LDAP filters at all when we have UPS exclusion filters? Well, as the name kind of gives away, those babies only work on UPS, and we are not using UPS, we are using AD Import, right?

OK, so let's try to set this up on our synchronization connection: just paste the above (or any other filter you may need) into the "Filter in LDAP syntax for Active Directory Import" field and populate the containers. The first thing you notice is that the filter doesn't seem to be applied to the tree view that is opened. This is probably by design, I would venture...

After setting it up, go to your AD and list (for example with an LDAP filter ;-) the disabled users in the selected OUs. Also try to add some new users, both disabled and enabled.

Now do a sync (an incremental seems to be enough to make the filter take effect).

What happens now, in my experience (on beta 2 anyway), is that any disabled users already in your profile store stay in the profile store. This may or may not surprise you, but apparently that is how this importer works: since the filter no longer includes these disabled users, they are simply left alone! Trying to update these users confirms this: a value added to a mapped property is not propagated to the user profile either.

So what happens if you disable a user that is already in your profile store while the above LDAP filter is applied? Well, still nothing! Same as before: since the user is no longer part of the import, its user profile is no longer updated. E.g. if you set an email address on the disabled account and synchronize, the change is not moved to the user profile.

In both cases this kind of makes sense, since the user is no longer included, but the fact that the profile is not considered "deleted" or "removed" might surprise you administrators out there: you will have to delete the already imported user profiles for disabled users yourself!

So what happens when you delete a user in your AD and you are using AD Import?

Just like when running FIM, users are marked for deletion in the profile database when an incremental sync detects that they are gone. After a while (every hour in SP2010, but at least in the SP2013 beta 2 this was changed to daily), the My Site Cleanup Job finishes the work by removing the user profiles marked for deletion (for more info, read another blog post by Spence on account deletion here).

So this works like with UPS. Well almost:

An unexpected gotcha: the users with disabled accounts that were initially included in the profile store are not deleted when the LDAP filter that only includes enabled users is active on the AD Import synchronization connection! The logic is the same as before: since the user is no longer part of the import, its deletion is ignored!

So what do we conclude? Old-skool LDAP filters do work, kind of, but be aware of the quirks mentioned above!

If you use AD Import, LDAP filters are the only option! And I never thought I would have to say this, but the nerdy LDAP filter syntax is actually easier to work with than UPS exclusion filters, though that is mostly because of the really non-intuitive and quirky UI the UPS exclusion filters have...

September 21
Microsoft Message Analyzer messes up PowerShell on SharePoint 2010

Microsoft Network Monitor is a very useful tool for capturing network traffic. A few days ago a new version was announced on Connect as a successor of NetMon:

Microsoft Message Analyzer beta 1 with a lot of new features.

But you might want to think twice before installing this cool tool on your SharePoint developer box: shortly after installing it, I opened a SharePoint PowerShell console and got this message:

The local farm is not accessible. Cmdlets with FeatureDependencyId are not registered.

Uhmmm... what? The farm was running fine in browser, so I checked the farm status in PowerShell:

PS>Get-SPFarm
Get-SPFarm : Microsoft SharePoint is not supported with version 4.0.30319.269 of the Microsoft .Net Runtime.
At line:1 char:11
+ Get-SPFarm <<<<
    + CategoryInfo          : InvalidData: (Microsoft.Share...SpCmdletGetFarm:SpCmdletGetFarm) [Get-SPFarm], PlatformNotSupportedException
    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SpCmdletGetFarm

Oh dear... Obviously PowerShell is now running on .NET 4! How did that happen?

So I checked whether PowerShell had somehow been updated to v3:

PS>$host.Version 

Major  Minor  Build  Revision
-----  -----  -----  --------
2      0      -1     -1

 

OK, so we are good there. What about the .NET version?

PS>[Environment]::Version
Major  Minor  Build  Revision
-----  -----  -----  --------
4      0      30319  269

Oops! So apparently Microsoft Message Analyzer has changed the default .NET version for PowerShell.

Uninstalling Microsoft Message Analyzer put everything back to normal ;-)

So if you want to try out this tool, don't do it on your SharePoint 2010 box, try it on your SharePoint 2013 beta box instead :-)

EDIT:

Another option, if you have PowerShell 3.0 installed on a SharePoint 2010 box, is to start PowerShell with the -Version 2 parameter, as specified here.
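A sketch of what that looks like (the snap-in line is only needed in a plain console, not in the SharePoint 2010 Management Shell):

powershell.exe -Version 2
Add-PSSnapin Microsoft.SharePoint.PowerShell
Get-SPFarm
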

September 11
Redirect to User Info instead of My Site Public Profile

We had a request from a customer that external users should not be able to view My Site public profile pages. This proved a challenge, because SharePoint links to _LAYOUTS/UserDisp.aspx?id=123 all over the site (for example in Modified By columns on list views). Out of the box, SharePoint has delegate controls that check whether a My Site Host is running, and whether the user id requested belongs to a group or a user. Clicking these links would simply give users without access to the My Site Host a prompt for a site they wouldn't have access to, so not optimal!

Instead I wanted to link them to the view you get if you *don't* have a My Site Host: the item view in the user info list. As a bonus, that makes it possible for me to decide what info these limited-access users should see, using the "Replicable" checkbox in Edit User Profile Properties:

replicable.GIF

There is a lot of info out there on SharePoint's hidden user list; for example, check my buddy Tobias Zimmergren's blog post on the topic.

So to short-wire this logic, I created a new delegate control with the same ControlId, "ProfileRedirection", just with a lower sequence (40) (shout-out to Keith Dahlby for putting me on the right track with this post).

The check could have been done in many ways, but since I knew that the users without access to the My Site Host were in a specific security group in AD, I chose to use the "internal" and "external" audiences I was already using for other purposes.

I use the IFormDelegateControlSource interface in my delegate control, as this gives me the proper hook and object (in this case the relevant user's SPListItem from the user information list).

The code also includes the original logic that I override (from Microsoft.SharePoint.ApplicationPages.UserDisplayPage) that checks whether a user is a person or a group. To make sure the user info list item view is used instead of the profile, I add a Force=true parameter to the query string. This is the OOB way of enforcing that this view is used, whether or not a User Profile Service is running on the farm.

 

    /// <summary>
    /// This class is called by a delegate control in AdditionalPageHead before the UserDisplayPage delegate control
    /// that takes care of redirection to My Site public profile.
    /// We want to short wire this redirection for users with no access to My Site Host
    /// </summary>
    public class RedirectToUserList : UserControl, IFormDelegateControlSource
    {
        public void OnFormInit(object objOfInterest)
        {
            CancelRedirect(objOfInterest as SPListItem);
        }
        public void OnFormSave(object objOfInterest)
        {}
        protected void CancelRedirect(SPListItem item)
        {
            if (item == null)
                return;
            try
            {
                // check if user is really a group
                var contentTypeId = (SPContentTypeId)item["ContentTypeId"];
                if (SPBuiltInContentTypeId.SharePointGroup.IsParentOf(contentTypeId))
                {
                    string url = "people.aspx?MembershipGroupId=" + item.ID.ToString(CultureInfo.InvariantCulture);
                    string keyOrValueToEncode = base.Request.QueryString["Source"];
                    if (keyOrValueToEncode != null)
                    {
                        url = url + "&Source=" + SPHttpUtility.UrlKeyValueEncode(keyOrValueToEncode);
                    }
                    SPUtility.Redirect(url, SPRedirectFlags.Static | SPRedirectFlags.RelativeToLayoutsPage, HttpContext.Current);
                }
                int id;
                // if user is member of "internal" audience redirect to MySite
                if (IsMember("internal"))
                {
                    var context = SPServiceContext.Current;
                    var userProfileManager = new UserProfileManager(context);
                    if (!string.IsNullOrEmpty(userProfileManager.MySiteHostUrl) && userProfileManager.UserExists(item.Name))
                    {
                        if (int.TryParse(Request.QueryString["id"], out id))
                        {
                            SPUtility.Redirect(string.Format("{0}/Person.aspx?accountname={1}", userProfileManager.MySiteHostUrl, item.Name), SPRedirectFlags.Trusted, HttpContext.Current);
                        }
                    }                   
                }
                // else force redirect to user list
                else
                {
                    bool force = false;
                    if (!bool.TryParse(Request.QueryString["force"], out force) || !force)
                    {
                        if (int.TryParse(Request.QueryString["id"], out id))
                        {
                            // force SharePoint to redirect to "hidden user list" instead of redirecting to My Site profile
                            SPUtility.Redirect(string.Format("userdisp.aspx?Force=True&ID={0}", id), SPRedirectFlags.Static | SPRedirectFlags.RelativeToLayoutsPage, HttpContext.Current);
                        }
                    }
                }
            }
            catch (SPException)
            {
                // TODO: log
            }
        }
       
        private static bool IsMember(string audience)
        {
            AudienceManager audienceManager = new AudienceManager(SPServiceContext.Current);
            try
            {
                // we don't use
                // audienceManager.GetAudience(audience);
                // since this requires elevation (calling the indexer directly doesn't!)
                Audience internalAudience = audienceManager.Audiences[audience];
                return internalAudience.IsMember(SPContext.Current.Web.CurrentUser.LoginName);
            }
            catch (AudienceArgumentException)
            {
                // audience does not exist
                return false;
            }
        }

 

        
    }

 

After creating this class, we need to include it in a feature using a delegate control (I use the Delegate Control Visual Studio item template from CKSDev, but you can also use an Empty Element template):

<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Control
    Id="ProfileRedirection"
    Sequence="40"
    ControlAssembly="Demo.SharePoint.Intranet.Controls, Version=1.0.0.0, Culture=neutral, PublicKeyToken=ca4699af43726411"
    ControlClass="Demo.SharePoint.Intranet.Controls.CustomControls.RedirectToUserList" />
</Elements>

Also remember to add a safe control entry for the delegate control, so that the class is registered as safe. You do this in the Safe Control Entries property in the Properties panel of the delegate control:

[Screenshot: delegatesafecontrols.GIF — the Safe Control Entries property of the delegate control]
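For reference, the safe control entry that ends up in web.config will look something like this (assembly and namespace details taken from the Control element above; the exact TypeName depends on what you enter in the panel):

```xml
<SafeControl
  Assembly="Demo.SharePoint.Intranet.Controls, Version=1.0.0.0, Culture=neutral, PublicKeyToken=ca4699af43726411"
  Namespace="Demo.SharePoint.Intranet.Controls.CustomControls"
  TypeName="*"
  Safe="True" />
```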

Now we only need to add the SharePoint item to a feature (I made mine farm scoped) and deploy it to the farm!

Happy coding :)

July 11
PowerShell function to export mappings and crawled / managed properties

I was recently tasked with moving a handful of crawled properties, managed properties, and mappings from our own test environment to the customer's integration test environment.

To avoid typos (and manual labour in general :-) I decided to throw together a few lines of PowerShell to do the tedious work for me.

You could also use this script to document the Managed Properties and mappings currently deployed in your farm.

Once you have the XML, it is quite straightforward to import the managed properties etc. into a new farm using the out-of-the-box cmdlets (hint: Get-Command New-SPEnterpriseSearchMetadata*).
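As a rough sketch of that import (not part of the attached script, and assuming the XML layout produced by Export-SPEnterpriseSearchMetadata below, with the managed properties not yet existing in the target farm), you could do something along these lines:

```powershell
# Sketch only: recreate managed properties and mappings from the exported XML.
# Run in the SharePoint Management Shell on the target farm.
$ssa = Get-SPEnterpriseSearchServiceApplication
[xml]$metadata = Get-Content "MetadataMappings.xml"

# Recreate the managed properties
foreach ($mp in $metadata.Metadata.ManagedProperties.ManagedProperty)
{
    New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa `
        -Name $mp.Name -Type $mp.Type | Out-Null
}

# Recreate the mappings (the crawled properties must already exist,
# i.e. the content has been crawled at least once)
foreach ($m in $metadata.Metadata.Mappings.Mapping)
{
    $managed = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Identity $m.ManagedProperty
    $crawled = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name $m.CrawledProperty |
        ? { $_.PropSet -eq $m.PropertySet }
    New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa `
        -ManagedProperty $managed -CrawledProperty $crawled | Out-Null
}
```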

<#
.SYNOPSIS
Exports Search Metadata Mappings to an XmlDocument
.DESCRIPTION
This method can be used to retrieve Search Metadata Mappings from an existing farm.
The syntax can also be used as documentation of a farm
.EXAMPLE
$managedProperties | % {
    New-Object PSObject -Property @{
        Mapping = Get-SPEnterpriseSearchMetadataMapping -SearchApplication (Get-SPEnterpriseSearchServiceApplication) -ManagedProperty $_
        ManagedPropertyName = $_.Name
    }
} | Export-SPEnterpriseSearchMetadataMapping | % {
    $node = $xmlMappings.ImportNode($_.Mapping,$false)
    $xmlMappings.DocumentElement.AppendChild($node)
} | Out-Null
.LINK
Export-SPEnterpriseSearchMetadataMapping
#>
function Export-SPEnterpriseSearchMetadataMapping
{
    [CmdletBinding()]
    param(
    [Parameter(Mandatory=$true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
    [Microsoft.Office.Server.Search.Administration.Mapping] $Mapping,
    [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName = $true)]
    [string] $ManagedPropertyName
)
 
    process
    {
        [xml]$xml = "<Mapping />"
        $xml.DocumentElement.SetAttribute('ManagedProperty', $ManagedPropertyName)
        $xml.DocumentElement.SetAttribute('CrawledProperty', $Mapping.CrawledPropertyName)
        $xml.DocumentElement.SetAttribute('PropertySet', $Mapping.CrawledPropset)
        $xml.DocumentElement.SetAttribute('VariantType', $Mapping.CrawledPropertyVariantType)
        $xml
    }
}
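For a single mapping, the function emits a node along these lines (the values here are purely illustrative; VariantType 31 is a string):

```xml
<Mapping ManagedProperty="MyManagedProperty"
         CrawledProperty="ows_MyColumn"
         PropertySet="00130329-0000-0130-c000-000000131346"
         VariantType="31" />
```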
 
<#
.SYNOPSIS
Exports Search Metadata Managed Properties to an XmlDocument
.DESCRIPTION
This method can be used to retrieve Search Metadata Managed Properties from an existing farm.
The syntax can also be used as documentation of a farm
.EXAMPLE
$managedProperties | Export-SPEnterpriseSearchMetadataManagedProperty | % {
    $node = $xmlManagedProperties.ImportNode($_.ManagedProperty,$false)
    $xmlManagedProperties.DocumentElement.AppendChild($node)
} | Out-Null
.LINK
Export-SPEnterpriseSearchMetadataManagedProperty
#>
function Export-SPEnterpriseSearchMetadataManagedProperty
{
    [CmdletBinding()]
    param(
    [Parameter(Mandatory=$true, ValueFromPipeline = $true)]
    [Microsoft.Office.Server.Search.Cmdlet.ManagedPropertyPipeBind] $ManagedProperty
)
    process
    {
        [xml]$xml = "<ManagedProperty />"
        $xml.DocumentElement.SetAttribute('Name', $_.Name)
        $xml.DocumentElement.SetAttribute('Description', $_.Description)
        $xml.DocumentElement.SetAttribute('Type', [int]$_.ManagedType)
        $xml.DocumentElement.SetAttribute('MaxCharactersInPropertyStoreIndex', $_.MaxCharactersInPropertyStoreIndex)
        $xml.DocumentElement.SetAttribute('FullTextQueriable', $_.FullTextQueriable)
        $xml.DocumentElement.SetAttribute('HasMultipleValues', $_.HasMultipleValues)
        $xml.DocumentElement.SetAttribute('Retrievable', $_.Retrievable)
        $xml.DocumentElement.SetAttribute('EnabledForScoping', $_.EnabledForScoping)
        $xml.DocumentElement.SetAttribute('RemoveDuplicates', $_.RemoveDuplicates)
        $xml.DocumentElement.SetAttribute('PutInPropertyBlob', $_.PutInPropertyBlob)
        $xml.DocumentElement.SetAttribute('QueryPropertyBlob', $_.QueryPropertyBlob)
        $xml.DocumentElement.SetAttribute('RespectPriority', $_.RespectPriority)
        $xml
    }
}
 
<#
.SYNOPSIS
Exports Search Metadata Crawled Properties to an XmlDocument
.DESCRIPTION
This method can be used to retrieve Search Metadata Crawled Properties from an existing farm.
The syntax can also be used as documentation of a farm
.EXAMPLE
$CrawledProperty | Export-SPEnterpriseSearchMetadataCrawledProperty | % {
    $node = $xmlCrawledProperties.ImportNode($_.CrawledProperty,$false)
    $xmlCrawledProperties.DocumentElement.AppendChild($node)
} | Out-Null
.LINK
Export-SPEnterpriseSearchMetadataCrawledProperty
#>
function Export-SPEnterpriseSearchMetadataCrawledProperty
{
    [CmdletBinding()]
    param(
    [Parameter(Mandatory=$true, ValueFromPipeline = $true)]
    [Microsoft.Office.Server.Search.Cmdlet.CrawledPropertyPipeBind] $CrawledProperty
)
    process
    {
        [xml]$xml = "<CrawledProperty />"
        $xml.DocumentElement.SetAttribute('Name', $_.Name)
        $xml.DocumentElement.SetAttribute('Category', $_.CategoryName)
        $xml.DocumentElement.SetAttribute('PropertySet', $_.PropSet)
        $xml.DocumentElement.SetAttribute('VariantType', $_.VariantType)
        $xml.DocumentElement.SetAttribute('IsMappedToContents', $_.IsMappedToContents)
        $xml
    }
}
 
<#
.SYNOPSIS
Exports Search Metadata to an XmlDocument
.DESCRIPTION
This method can be used to retrieve Search Metadata from an existing farm.
The syntax can also be used as documentation of a farm
.EXAMPLE
[xml]$SearchMetadata = Export-SPEnterpriseSearchMetadata -CrawledProperty $crawledProperties -ManagedProperty $managedProperties
$SearchMetadata.OuterXml | Out-File -FilePath "MetadataMappings.xml" -Encoding UTF8
.LINK
Export-SPEnterpriseSearchMetadata
#>
function Export-SPEnterpriseSearchMetadata
{
    [CmdletBinding()]
    param(
    [Parameter(Mandatory=$true)]
    [Microsoft.Office.Server.Search.Administration.CrawledProperty[]] $CrawledProperty,
    [Parameter(Mandatory=$true)]
    [Microsoft.Office.Server.Search.Administration.ManagedProperty[]] $ManagedProperty
)
process
{
    [xml]$xml = "<Metadata />"
    [xml]$xmlCrawledProperties = "<CrawledProperties />"
 
    # append CrawledProperty to CrawledProperties node
    $CrawledProperty | Export-SPEnterpriseSearchMetadataCrawledProperty | % {
        $node = $xmlCrawledProperties.ImportNode($_.CrawledProperty,$false)
        $xmlCrawledProperties.DocumentElement.AppendChild($node)
    } | Out-Null
    # append to MetaData
    $xml.DocumentElement.AppendChild($xml.ImportNode($xmlCrawledProperties.CrawledProperties, $true)) | Out-Null
 
    [xml]$xmlManagedProperties = "<ManagedProperties />"
    # append ManagedProperty to ManagedProperties node
    $ManagedProperty | Export-SPEnterpriseSearchMetadataManagedProperty | % {
        $node = $xmlManagedProperties.ImportNode($_.ManagedProperty,$false)
        $xmlManagedProperties.DocumentElement.AppendChild($node)
    } | Out-Null
    # append to Metadata
    $xml.DocumentElement.AppendChild($xml.ImportNode($xmlManagedProperties.ManagedProperties, $true)) | Out-Null
 
    [xml]$xmlMappings = "<Mappings />"
 
    foreach ( $managedProperty in $ManagedProperty )
    {
        $mappings = Get-SPEnterpriseSearchMetadataMapping -SearchApplication (Get-SPEnterpriseSearchServiceApplication) -ManagedProperty $managedProperty
        # A managed property can have multiple mappings
        if ( $mappings -is [Array] )
        {
            foreach ($mapping in $mappings)
            {
                $xmlMapping = $mapping | Export-SPEnterpriseSearchMetadataMapping -ManagedPropertyName $managedProperty.Name
                $node = $xmlMappings.ImportNode($xmlMapping.Mapping,$false)
                $xmlMappings.DocumentElement.AppendChild($node) | Out-Null
            }
        }
        else
        {
            $xmlMapping = Export-SPEnterpriseSearchMetadataMapping -ManagedPropertyName $managedProperty.Name -Mapping $mappings
            $node = $xmlMappings.ImportNode($xmlMapping.Mapping,$false)
            $xmlMappings.DocumentElement.AppendChild($node) | Out-Null
        }
    }
 
    # append to Metadata
    $xml.DocumentElement.AppendChild($xml.ImportNode($xmlMappings.Mappings, $true)) | Out-Null
    $xml
}
}
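Putting it together, the exported document has this overall shape (attribute values here are illustrative, not from a real farm):

```xml
<Metadata>
  <CrawledProperties>
    <CrawledProperty Name="ows_MyColumn" Category="SharePoint"
                     PropertySet="00130329-0000-0130-c000-000000131346"
                     VariantType="31" IsMappedToContents="True" />
  </CrawledProperties>
  <ManagedProperties>
    <ManagedProperty Name="MyManagedProperty" Type="1" FullTextQueriable="True"
                     Retrievable="True" HasMultipleValues="False" />
  </ManagedProperties>
  <Mappings>
    <Mapping ManagedProperty="MyManagedProperty" CrawledProperty="ows_MyColumn"
             PropertySet="00130329-0000-0130-c000-000000131346" VariantType="31" />
  </Mappings>
</Metadata>
```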

 

 
Below is example syntax for calling the advanced functions:
 
# get crawled properties
$crawledProperties = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication (Get-SPEnterpriseSearchServiceApplication) | ?{$_.Name -like "ows_yourCrawledPropertyPrefix*"}
 
# get managed properties
$managedProperties = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication (Get-SPEnterpriseSearchServiceApplication) | ?{$_.Name -like "SomePrefix*" -or $_.Name -Like "SomeOtherManagedProperty" }
[xml]$SearchMetadata = Export-SPEnterpriseSearchMetadata -CrawledProperty $crawledProperties -ManagedProperty $managedProperties
 
$SearchMetadata.OuterXml | Out-File -FilePath "SearchProperties2.xml" -Encoding UTF8
I have attached the complete script here (remember to remove the .txt extension!):

ExportCrawledProperties.ps1.txt

   