Update Theme on all Webs of Office365

So I have been doing a small migration project for a customer moving to Office 365. Finally I have a reason to bother working on Office 365. I have to say upfront: this is not a finished platform (yet!). Great potential, though.

As usual, the initial idea turned out to be something completely different from what I have spent most of my time on so far.

Care to venture a guess?…of course: design. As usual theming is a b**** because you cannot deploy a solution on Office 365.

First I tried all the usual approaches: the UI, but there are about 30-40 sites (subsites).
CSOM/ web services (CSOM is using the API web services, if I am not mistaken…).
Too bad the SPWeb.ApplyTheme method doesn't work as intended. Funny how Microsoft is: the method has four parameters. If you give it 3 it will tell you: you gave me only 3, I need 4, even though TechNet tells you it's fine to pass $null values. But that's no biggie, because you could use dummy data, right? So if you do that and pass 4 values you get: "Too many resources requested" or a similar message translated from German (the customer wants it in 1031 – good that Office 365 has all the language packs).
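For reference, this is roughly what the plain CSOM call I attempted looks like. It is only a minimal sketch: it assumes the Microsoft.SharePoint.Client assemblies are loaded, and the tenant URL, credentials and catalog paths are placeholders.

# Minimal sketch of the plain CSOM ApplyTheme call that did not behave for me (see above).
# Assumption: CSOM assemblies are loaded; URLs and credentials are placeholders.
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://mytenant.sharepoint.com")
$securePassword = ConvertTo-SecureString "mypassword" -AsPlainText -Force
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("myusername@mytenant", $securePassword)
$web = $ctx.Web
# ApplyTheme(colorPaletteUrl, fontSchemeUrl, backgroundImageUrl, shareGenerated)
$web.ApplyTheme("/_catalogs/theme/15/my-palette.spcolor",
                "/_catalogs/theme/15/SharePointPersonality.spfont",
                $null,
                $true)
$ctx.ExecuteQuery()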

So the result I created is a PowerShell script combining the chocolatey goodness of the SPO Management Shell (to get all the SPO site objects of your tenant) with the caramel filling of CSOM (the management shell cannot get you any web objects, so we use CSOM for that)… and to top it off we use sweet old Internet Explorer as a COM application to fill out the theming form for each of the webs while iterating.

I would have liked to do it differently. With the traditional on-premises shell I could have done this with a one-line script. I could have guessed from the beginning that it would be a bit more difficult, but needing three different paradigms to get a simple thing like theming automation done is a bit hilarious. That doesn't compare to anything – at all!

I would have been fine with combining the Office 365 Management Shell with CSOM. Okay – I mean the Management Shell includes only about 30 cmdlets at this point. It's pretty much only good for creating site collections, emptying the recycle bin (which, as I write this post, cannot be done via the UI – meh) and adding users to SharePoint groups.

So we all know how bad this solution is – I am not even going to try to sell this to you as a good idea. It doesn’t work reliably. But it does reduce your workload. So that’s definitely sensible. I am imagining my customer telling me: “That theme is not good enough. Let’s use a different one.” This is probably what will happen. It usually does.

It took me about the same time to complete the script as it would have taken to go through all of these webs and make all of the changes manually. So I have already won. Maybe I can help somebody else this way as well, which is why I am sharing the script here.

Keep in mind that you need to use the admin URL (https://&lt;tenant&gt;-admin.sharepoint.com) to connect to the SPO service. Check this article on how to set up your environment accordingly: Set up the SharePoint Online Management Shell Windows PowerShell environment.
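For example (the tenant name is a placeholder):

Connect-SPOService -Url "https://mytenant-admin.sharepoint.com" -Credential (Get-Credential)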

You should be aware that if you want to do this on a SharePoint 2010 machine, you will have to open your PowerShell console or ISE with the suffix "-v2" (appended to "%windir%\system32\WindowsPowerShell\v1.0\PowerShell_ISE.exe"), because SharePoint 2010 will not work with .NET 4.0 when you install the management shell.

So the script needs the following:
– you need a color palette (.spcolor file); you can create one by taking an existing palette and editing it, for example with the SharePoint Color Palette Tool
– you need the management shell for Office 365 installed
– you need an Office 365 tenant and the credentials to access it (duuuh!)
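One more note from my side: depending on your machine, the CSOM assemblies and System.Web may not be loaded automatically in your PowerShell session. A minimal sketch of what to load up front; the paths are the default client component install locations and may differ on your machine:

# Load the CSOM assemblies (default install paths, adjust if needed)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
# Needed for [System.Web.HttpUtility]::UrlEncode in the script below
Add-Type -AssemblyName System.Web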

The script does the following:
– iterate over all site collections that are not the search center or My Sites (they use oslo.master) or the public site
– get the root web from the site URL: this is where the palette (theme) gets uploaded
– get all the webs and iterate over them
– for the next step you will need to have an IE window open and authenticated against your tenant
– apply the theme via the two forms (you may think that because the second form has its own URL you can skip the first one, but the first one actually creates a cached version of the theme, so you have to fill out the first form to be able to fill out the second one successfully)
– you may also ask: why does he need the sleep commands? Any good script will work without them, but this is not one of those. We actually have to wait for the requests to come back. It may even take a couple of seconds more than what I have in my script; the anchors I use for clicking will not be found by the getElementBy* methods if there isn't enough time in between.

So here is the “masterpiece”. Drop me a comment if it does in fact help you. Apart from that it may be a stepping stone for a much nicer script in the future.

param (
    [string] $LocalPalettePath = "C:\Backup\my-palette.spcolor",
    [string] $username = "myusername@mytenant",
    [string] $password = "mypassword",
    [string] $url = "mytenant-admin.url"
)

function Process-File($Context, $File, $RemoteFolder) {
    Write-Output ("Uploading " + $File.FullName);
    $FileStream = New-Object IO.FileStream($File.FullName, [System.IO.FileMode]::Open)
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.ContentStream = $FileStream
    $FileCreationInfo.URL = $File.Name
    $Upload = $RemoteFolder.Files.Add($FileCreationInfo)
    $Context.Load($Upload)
    $Context.ExecuteQuery()
}

function Upload-Palette($ctx, $web, [string] $localPalettePath) {
    $ctx.Load($web);
    $ctx.ExecuteQuery();

    #Write-Host "Web URL is" $web.Url
    Write-Host($localPalettePath)
    if($web.ServerRelativeUrl -ne "/") {
        $remoteFolderRelativeUrl = $web.ServerRelativeUrl + "/_catalogs/theme/15/";
    } else {
        $remoteFolderRelativeUrl = "/_catalogs/theme/15/";
    }
    $remoteFolder = $web.GetFolderByServerRelativeUrl($remoteFolderRelativeUrl);
    $ctx.Load($remoteFolder)
    $ctx.ExecuteQuery();

    $file = Get-Item $localPalettePath;
    Process-File $ctx $file $remoteFolder;
}

function CreateTheme($ie, $ctx, $web, [string] $localPalettePath, [string] $rootWebUrl, [string] $rootWebRelativeUrl) {
    $ctx.Load($web);
    $ctx.ExecuteQuery();

    $baseUrl = $web.Url.Replace($web.ServerRelativeUrl, "");
    $relativeWebUrl = $web.ServerRelativeUrl;

    $localPaletteItem = Get-Item $localPalettePath
    $localPaletteName = $localPaletteItem.Name;

    $url = $web.Url;

    if($relativeWebUrl -eq "/") {
        $relativeBase = "";
    } else {
        $relativeBase = $relativeWebUrl;
    }
    if($rootWebRelativeUrl -eq "/") {
        $relativeBaseRoot = "";
    } else {
        $relativeBaseRoot = $rootWebRelativeUrl;
    }

    $gallery = "/_layouts/15/start.aspx#/_layouts/15/designbuilder.aspx?masterUrl={0}&themeUrl={1}&imageUrl={2}&fontSchemeUrl={3}";

    $masterUrl = $relativeBase + "/_catalogs/masterpage/seattle.master";
    $themeUrl = $relativeBaseRoot + "/_catalogs/theme/15/$localPaletteName";
    $fontUrl = $relativeBaseRoot + "/_catalogs/theme/15/SharePointPersonality.spfont";
    $imageUrl = "";

    $masterUrlEncoded = [System.Web.HttpUtility]::UrlEncode($masterUrl);
    $themeUrlEncoded = [System.Web.HttpUtility]::UrlEncode($themeUrl);
    $fontUrlEncoded = [System.Web.HttpUtility]::UrlEncode($fontUrl);
    $imageUrlEncoded = [System.Web.HttpUtility]::UrlEncode($imageUrl);

    $formUrl = $url + [string]::Format($gallery, $masterUrlEncoded, $themeUrlEncoded, $imageUrlEncoded, $fontUrlEncoded);
    $ie.Navigate($formUrl);
    $ie.Visible = $true;
    sleep 5;

    "First Form:"
    $formUrl

    $ieDoc = $ie.Document;

    $div = $ieDoc.getElementById("ms-designbuilder-main")

    $anchor = $div.GetElementsByTagName("a") | ? { $_.id -and $_.id.EndsWith("btnLivePreview") }

    $anchor.Click();

    sleep 2;

    "Second Form:"
    $ie.LocationURL

    $ieDoc = $ie.Document;

    sleep 15;
    $anchor = $ieDoc.GetElementById("btnOk");

    $anchor.Click();

    sleep 2;

    "PostLocation:"
    $ie.LocationURL;

    sleep 1;
}

# Control #

$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, $(ConvertTo-SecureString $password -AsPlainText -Force)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $(ConvertTo-SecureString $password -AsPlainText -Force));

Connect-SPOService -Url $url -Credential $cred

$sposites = Get-SPOSite | ? { -not $_.Url.EndsWith("search") -and -not $_.Url.Contains("-public.") -and -not $_.Url.Contains("-my.") }

clear;

foreach($sposite in $sposites) {
    if($sposite) {
        $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($sposite.Url);
        $ctx.Credentials = $credentials;

        $rootWeb = $ctx.Web
        $childWebs = $rootWeb.Webs
        $ctx.Load($rootWeb);
        $ctx.ExecuteQuery();

        Write-Host $rootWeb.Url;

        Upload-Palette $ctx $rootWeb $LocalPalettePath

        $app = New-Object -com shell.application
        $ie = $app.Windows() | ? { $_.Name -eq "Internet Explorer" } | select -First 1;

        CreateTheme $ie $ctx $rootWeb $LocalPalettePath $rootWeb.Url $rootWeb.ServerRelativeUrl;

        $ctx.Load($childWebs)
        $ctx.ExecuteQuery()

        foreach ($childWeb in $childWebs)
        {
            CreateTheme $ie $ctx $childWeb $LocalPalettePath $rootWeb.Url $rootWeb.ServerRelativeUrl;
        }
    }
}


Copy List Fields, Views and Items From List to List

Today I had to recreate a SharePoint 2013 List because the old one had an error (Content Approval errored out with “Sorry something went wrong” – Null-Pointer Exception).

My first guess was to create a new list manually, and so I did. Of course with a dummy name, so I had to recreate it yet again. I didn't want to get stuck having to do it a third time, so I created the little script below.

The script copies the list fields and adds them to the new list, then does the same with all the views, and finally copies all the items (which was the initial idea) to the new list.

The input is fairly simple. You need to specify a URL to identify the web you want to perform this operation on. You could amend the script to also accept a target URL, so that you can copy the fields, views and items across site and site collection boundaries. However, you might then run into issues with site fields used in your list that do not exist on the target site collection (Publishing Infrastructure, custom fields); you would need to do a bit more than just add a parameter and initialize another web object. Also, this works well for lists, but not for document libraries. Another limitation is content types: I did not include those either.

So you see this is more of a starting point than anything else. But it does the job and it was pretty quick to write, so I thought I would share it with you.

param (
    [Parameter(Mandatory=$True)]
    [string] $Url,
    [Parameter(Mandatory=$True)]
    [string] $SourceList,
    [Parameter(Mandatory=$True)]
    [string] $TargetList
)

add-pssnapin microsoft.sharepoint.powershell -ea 0;

$spWeb = get-spweb $url;

$spListCollection = $spweb.Lists;

$spSourceList = $spListCollection.TryGetList($SourceList);
$spTargetList = $spListCollection.TryGetList($TargetList);

if($spSourceList) {
    if($spTargetList) {
        $spTargetList.EnableModeration = $true;

        $spSourceFields = $spSourceList.Fields;
        $spTargetFields = $spTargetList.Fields;

        $spFields = new-object System.Collections.ArrayList;
        foreach($field in $spSourceFields) {
            if(-not ($spTargetFields.Contains($field.ID))) {
                $spFields.Add($field) | Out-Null;
            }
        }

        foreach($field in $spFields) {
            if($field) {
                Write-Host -ForegroundColor Yellow ("Adding field " + $field.Title + " (" + $field.InternalName + ")");
                $spTargetFields.Add($field);
            }
        }

        $spViews = new-object System.Collections.ArrayList;

        $spSourceViews = $spSourceList.Views;
        $spTargetViews = $spTargetList.Views;
        foreach($view in $spSourceViews) {
            $contains = $spTargetViews | ? { $_.Title -eq $view.Title }
            if(-not ($contains)) {
                $spTargetViews.Add($view.Title, $view.ViewFields.ToStringCollection(), $view.Query, $view.RowLimit, $view.Paged, $view.DefaultView);
            }
        }

        $spTargetList.Update();

        $spSourceItems = $spSourceList.Items;

        foreach($item in $spSourceItems) {
            if($item) {
                $newItem = $spTargetList.Items.Add();
                foreach($spField in $spSourceFields) {
                    try {
                        if($spField -and $spField.Hidden -ne $true -and $spField.ReadOnlyField -ne $true -and $spField.InternalName -ne "ID") {
                            $newItem[$spField.InternalName] = $item[$spField.InternalName];
                        }
                    } catch [Exception] { Write-Host -f Red ("Could not copy content " + $item[$spField.InternalName] + " from field " + $spField.InternalName) }
                }
                $newItem.Update();
                #Write-Host -f Green "Item copied";
            }
        }
    } else {
        Write-Host -f Red "List $TargetList does not exist";
    }
} else {
    Write-Host -f Red "List $SourceList does not exist";
}
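A quick usage example (the script file name is hypothetical; the web URL and list titles are placeholders):

.\Copy-ListStructure.ps1 -Url "http://intranet.contoso.local/sites/team" -SourceList "Old List" -TargetList "New List"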

Ensuring an LDAP Claim and what that means for your SPUser Object

So I have a customer using LDAP as an authentication Provider on SharePoint 2010.

I wrote a script a couple of weeks ago that migrates the permissions of a user from one account to another on either farm, web application, site or web level (taking into consideration site collection admin permissions, group memberships and any ISecurableObject [Web, List, Item, Folder, Document] role assignments, excluding 'Limited Access').

Move-SPUser only does the trick in situations where you have an existing user object, you create a new user object and then migrate. If the user is actually using both accounts simultaneously, Move-SPUser is not your friend.

This is the reason:

Detailed Description

The Move-SPUser cmdlet migrates user access from one domain user account to another. If an entry for the new login name already exists, the entry is marked for deletion to make way for the Migration.

source: http://technet.microsoft.com/en-us/library/ff607729(v=office.15).aspx

So now I have my script, but the difference between ensuring an LDAP account and an AD claim is that with the LDAP account you need to explicitly provide the claim string. With the AD account that is not the case.

LDAP ClaimString:

i:0#.f|ldapmember|firstname.lastname@mydomain.tld

AD ClaimString:

i:0#.w|domain\SAMAccountName

With both, the best idea is to ensure the user in the following way:

$claim = New-SPClaimsPrincipal -identity $line.Name -IdentityType "WindowsSamAccountName";

$user = $spweb.EnsureUser($claim.ToEncodedString());

Additionally, with the LDAP claim the email property is not set. Interestingly enough, the email is the claim identifier, so the Name property of the SPUser object is in this case the email address. So you will want to add the following two lines:

$user.Email = $user.Name;

$user.Update();

Now you have really ensured that the user object is on the site collection in the same way!
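Putting the LDAP case together, a minimal sketch could look like this (the web URL, the membership provider name "ldapmember" and the account are the placeholders from the example above):

$spWeb = Get-SPWeb "http://portal.mydomain.tld"
# For LDAP the claim string has to be passed explicitly (see above)
$user = $spWeb.EnsureUser("i:0#.f|ldapmember|firstname.lastname@mydomain.tld")
# The email property is not populated for LDAP claims, but Name holds the email address
$user.Email = $user.Name
$user.Update()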

 

 

 

Static IP? No thanks, I've got FTP!

So yes, there is a bit of a logical issue in the title. If I have FTP, the FTP server of course already has a static IP, which is tied to the server name. But maybe I don't want that static IP; I want one for a different purpose, and it costs me 15 EUR/month to get it via my Internet provider. I could start by using a service that tunnels my requests via a static IP to my dynamic one, but then I would have to register with somebody.

I thought: why can I not do the following? Trigger a timer job on my home machine, get the IP address and store it in a file. This file I could either push via a service like Dropbox (but I don't want Dropbox on my server) or I can use FTP.

I took the code from this site.

Here it is:


function UploadFTP {
    param(
        [string] $user,
        [string] $url,
        [string] $port,
        [string] $pass,
        [string] $localPath,
        [string] $remotePath
    )

    # create the FtpWebRequest and configure it
    $ftp = [System.Net.FtpWebRequest]::Create("ftp://" + $url + ":" + $port + "/" + $remotePath);
    $ftp = [System.Net.FtpWebRequest]$ftp
    $ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $ftp.Credentials = New-Object System.Net.NetworkCredential($user, $pass);
    $ftp.UseBinary = $true
    $ftp.UsePassive = $true
    # read in the file to upload as a byte array
    $content = [System.IO.File]::ReadAllBytes($localPath);
    $ftp.ContentLength = $content.Length
    # get the request stream, and write the bytes into it
    $rs = $ftp.GetRequestStream()
    $rs.Write($content, 0, $content.Length)
    # be sure to clean up after ourselves
    $rs.Close()
    $rs.Dispose()
}

function DownloadFTP {
    param(
        [string] $user,
        [string] $url,
        [string] $port,
        [string] $pass,
        [string] $downloadPath,
        [string] $remotePath
    )
    # Create a FTPWebRequest
    $FTPRequest = [System.Net.FtpWebRequest]::Create("ftp://" + $url + ":" + $port + "/" + $remotePath);
    $FTPRequest.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
    $FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $FTPRequest.UseBinary = $true
    $FTPRequest.KeepAlive = $false

    # Send the ftp request
    $FTPResponse = $FTPRequest.GetResponse()
    # Get a download stream from the server response
    $ResponseStream = $FTPResponse.GetResponseStream()
    # Create the target file on the local system and the download buffer
    $LocalFile = New-Object IO.FileStream($downloadPath, [IO.FileMode]::Create)
    [byte[]]$ReadBuffer = New-Object byte[] 1024
    # Loop through the download
    do {
        $ReadLength = $ResponseStream.Read($ReadBuffer, 0, 1024)
        $LocalFile.Write($ReadBuffer, 0, $ReadLength)
    }
    while ($ReadLength -ne 0)

    $LocalFile.Close();
    $LocalFile.Dispose();
}

$user = "someusername"
$url = "some.ftp.server"
$port = "21";
$pass = "somepassword";
$localPath = "C:\tmp\myfile.txt";
$downloadPath = "C:\tmp\myfiledown.txt";
$remotePath = "myuploadedfile.txt";

$ip = Get-NetIPAddress | ? { $_.AddressFamily -eq "IPv4" -and $_.InterfaceAlias -eq "Ethernet"}
$ip.IPv4Address > $localPath;

UploadFTP $user $url $port $pass $localPath $remotePath
DownloadFTP $user $url $port $pass $downloadPath $remotePath

So what I am doing is defining my variables, writing my IP to the local path, uploading that file and then downloading it again. My PoC was done with a single machine. The expectation is that the downloaded file and the original file are the same, which is true.

The eventual setup will look a bit different, because I will have to get at the public IP and set up the job that uploads the file. On the other side I will only need the part of the script that downloads the file.

So my use case is: I want to connect to a server that is connected to the internet, but I don't know its IP, because it is dynamic (DHCP).
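For the upload side, here is a minimal sketch of getting the public IP instead of the local interface address; the lookup service and the schedule are my own assumptions, use whatever you trust (Invoke-WebRequest needs PowerShell 3.0 or later):

# Ask an external service for the public IP (assumption: api.ipify.org returns it as plain text)
$publicIp = (Invoke-WebRequest -Uri "https://api.ipify.org" -UseBasicParsing).Content
$publicIp > $localPath
UploadFTP $user $url $port $pass $localPath $remotePath
# Schedule it, e.g. hourly, via the Task Scheduler (the script path is a placeholder):
# schtasks /Create /SC HOURLY /TN "PublishPublicIp" /TR "powershell.exe -File C:\Scripts\Upload-PublicIp.ps1"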

IIS WAMREG admin Service – Windows Event 10016 – Wictor Wilén Addendum

The reason for sharing this post today is that I had the issue described in Wictor Wilen’s Post and the solution posted there did not work for me at first. So I wanted to elaborate a little bit. 

I was pretty sure that Wictor knows his stuff, because he is an MVP and a notorious one at that, so I thought I was doing something wrong. So true.

The first wrong turn: when I tried fixing this the first time, I didn't understand the take-ownership solution he proposed. Once I had figured that out and it still didn't work, I tried finding other sources, but didn't find any. Enter "the benefits of Social/Web 2.0".

When checking the comments on Wictor's blog I saw that a lot of others faced the same issue I did, and then I saw Blair's post with the solution… I should have known this: an x86/x64 issue yet again.

Find the full description of how I solved this below and again a special shout-out to Wictor for the solution:

This is the error you will see in Windows Event Log:


Windows Event Log

You can find the details of this error here:

To resolve this issue you need to give permissions to the accounts SharePoint is executing under; you can either pick them up individually from the Services snap-in or add the permissions via the local SharePoint/IIS groups.

As the security settings of the IIS WAMREG admin service are greyed out, you need to give full permissions to the local administrators group. To do this you need to take ownership of the following two registry keys:

HKEY_CLASSES_ROOT\AppID\{61738644-F196-11D0-9953-00C04FD919C1}


Registry Entry 1

and

HKCR\Wow6432Node\AppID\{61738644-F196-11D0-9953-00C04FD919C1}


Registry Entry 2

After that you will be able to edit the permissions on the Security tab of the service in Component Services.


Component Services
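If you want to verify that taking ownership actually worked on both keys (this check is my own addition, not part of Wictor's post), a quick way from PowerShell:

# Show the current owner of both AppID keys (the 64-bit one and the WOW6432Node one)
Get-Acl "Registry::HKEY_CLASSES_ROOT\AppID\{61738644-F196-11D0-9953-00C04FD919C1}" | Select-Object Owner
Get-Acl "Registry::HKEY_CLASSES_ROOT\Wow6432Node\AppID\{61738644-F196-11D0-9953-00C04FD919C1}" | Select-Object Owner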

AppManagement and SubscriptionSettings Services, Multiple Web Applications and SSL

So currently I am setting up four environments, of which one is production, two are staging and another was a playground installation.

My staging environments (TEST, QA, PROD) are multi-server, multi-farm systems (multi-farm because the 2013 farm publishes Search and UPA to existing 2010 farms).
They are running SPS2013 Standard with March PU 2013 + June CU 2013. They will be using app pool isolation, and the App Management and SubscriptionSettings services have their own account (svc_sp{t, q, p}_app, i.e. svc_spt_app, svc_spq_app and svc_spp_app).

I have three web applications, all of which are secured by SSL, making a wildcard certificate necessary for the app domain. Each has its own account (svc_sp{t, q, p}_{col, tws, upa}). The reason for this is that I will be using Kerberos authentication, and for the SPNs I need dedicated accounts for each application URL.

My playground was once a four-server farm, but three servers have since been removed. It runs neither the March PU 2013 nor the June CU 2013. There, app pool isolation without SSL is used.

On the playground, app management worked well. I actually encountered my problem on TEST first and tried to replicate it on the playground, but couldn't. But I am getting ahead of myself. The system was set up via AutoSPInstaller, and the necessary certificates and IPs were requested and implemented. The AD team did the domain setup for me. I didn't set up my environment following this article, but it is a good one to read. I also got the idea from it of creating a separate dummy web application for attaching my IIS bindings and certificate, which makes a lot of sense because of security considerations and Kerberos.

The first article to read to get an overview of what is necessary and what we are trying to achieve can be found here.

So I set up everything, and still it wasn't working. What does that mean? I will explain. When I subscribe to an app, download it from the store and add it to a site collection of my choosing, I get to click on it once it is finished installing. The link then leads me to my app domain. With SSL I only got anywhere when I was using the same application pools; otherwise I saw what is shown below.

This is what I wanted to see:
Expected

This is what I saw on any of the web applications that used SSL and had a different app pool account than the one I was using for my dummy web application.
Actual

So this blank page is actually exactly what you see when you leave the request management service running on the frontends without doing any topology configuration.

So I tried to work with the user policy from the web application management page, in the hope of giving the users permissions on the content databases. As I found out later, that was not actually happening, but it was exactly what was needed: I had to manually add the app pool account of the app domain to the SPDataAccess role of the content databases. Then it also works with SSL. I actually set up three web applications WITHOUT SSL on the TEST staging environment with the same users as the SSL web applications, and that worked like a charm; but for any SSL web application I needed to explicitly give permissions on the content database. This is a nightmare to maintain. For my migration of 20 databases from 2010 to 2013 I will need to do this again and again, and for each new content database I create in the future. Just imagine you create a new content database and forget to do this: for any site collection in that content database the above issue will show up. Hard to debug later on.
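To make that at least repeatable, something along these lines should grant the role on every content database of a web application. This is my own sketch, not an official procedure: the account and URL are placeholders, Invoke-Sqlcmd must be available on the machine, and the account must already exist as a login on the SQL instance (sp_addrolemember is used so it also works on SQL Server 2008 R2):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$appPoolAccount = "DOMAIN\svc_spt_app"                      # app pool account of the app domain web application
$wa = Get-SPWebApplication "https://collab.contoso.local"   # web application to fix up
foreach ($db in $wa.ContentDatabases) {
    $sql = "IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N'$appPoolAccount')
                CREATE USER [$appPoolAccount] FOR LOGIN [$appPoolAccount];
            EXEC sp_addrolemember N'SPDataAccess', N'$appPoolAccount';"
    Invoke-Sqlcmd -ServerInstance $db.Server -Database $db.Name -Query $sql
}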

Not sure what Microsoft is thinking here, but I am happy that it only took me 4 days to figure this one out.

Testing Incoming Email

After all of the infrastructure blog articles over the last couple of days, here is now a short one for development.

When you have a dev machine and you want to test incoming email, you can actually use the pickup folder in your IIS mail root folder to simulate everything after what Exchange usually does for you (routing the email to your SharePoint server from outside your server, e.g. when you write an email with your Outlook client).

So you have the pickup folder and you have the drop folder; the drop folder is the one SharePoint picks its emails up from via the job-email-delivery job (which runs every minute by default).

You will see that if you place a file into the pickup folder it will move quickly (in a matter of seconds) from there to the drop folder and be transformed to an eml file with a filename based on an ID.

This is the content of a text file you can use to simulate this. Take a look at the To: property. That is the email address of my list that I want the email to eventually end up in.

From:blog@yourdomain.org
To:Mailbox@dev.meiringer.com 
Subject: MySubject
This is the body of the email

So when you are testing (and purging the email queue between runs), you will want to have a folder (I call it dump) where you keep the test files you want to use as incoming emails, and copy them from there to the pickup folder. Then you can either wait the one minute or, if you are not that patient, run the job yourself after 3 seconds of waiting.

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction silentlyContinue) -eq $null) { Add-PSSnapin "Microsoft.SharePoint.PowerShell" }

get-childitem -Path "C:\inetpub\mailroot\Dump" -Filter *.txt | % {
    $_.FullName
    copy-item -LiteralPath $_.FullName -Destination ("C:\inetpub\mailroot\Pickup\" + $_.Name) 
}

sleep 3

$job = Get-SPTimerJob job-email-delivery;
$job.RunNow();

That pretty much does it. You now have the possibility to concentrate on your test content and let the script handle the rest. The great thing here is of course that it’s re-runnable and thus you can generate as many emails in the target list as you please.