Task Runner Explorer is the Best Visual Studio Feature You Probably Aren’t Using

TL;DR – There is a handy feature in VS 2015 called Task Runner Explorer that you can use to run PowerShell or batch commands to do just about anything. You can also bind these to build events.

A task runner is a program that runs tasks. If you've been doing much web development these past couple of years, you are probably familiar with this concept and with popular task runners like Grunt and Gulp. In fact, one or both of these might be essential to your development workflow. And, since many web developers consider these to be essential tools, the Visual Studio team released the Task Runner Explorer extension for Visual Studio 2013 and later made Task Runner Explorer an out-of-the-box feature in Visual Studio 2015.

If you aren't aware that this feature exists, you aren't alone! I took a poll on Twitter.

<sarcasm>I was a bit surprised by this as the feature is prominently available by going to View | Other Windows | Task Runner Explorer.</sarcasm>

JavaScript Task Runner? No Thanks!

If your work isn't mostly JavaScript, using a JavaScript-based task runner probably sounds pretty unappealing. Happily, there is an extension called Command Task Runner that supports .exe, .cmd, .bat, .ps1, and .psm1 files.

We use this in the Azure Functions for SharePoint project to automate deployment at build time by binding the script to the build event.
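For context, Command Task Runner discovers commands from a commands.json file in the project. As a rough sketch (the command name and script file here are made up, and the exact schema is defined by the extension's documentation), registering a PowerShell deploy script looks something like this:

```json
{
  "commands": {
    "deploy": {
      "fileName": "powershell.exe",
      "workingDirectory": ".",
      "arguments": "-ExecutionPolicy Bypass -NoProfile -File deploy.ps1"
    }
  }
}
```

Once the command shows up in Task Runner Explorer, you can run it manually or bind it to a build event.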

The deploy script is complicated, but there are a couple of others that are pretty simple and are not bound to any events. We run them manually, and I think they best illustrate why this tool belongs in your everyday toolkit.

For example, each Azure Function for SharePoint relies on a config.json file. It would be an error-prone pain to create them by hand or by copying an existing configuration, so we have a script that creates a new config and puts it on the clipboard:

$scriptdir = $PSScriptRoot

#Load the assembly that defines ClientConfiguration (path assumed; adjust to wherever the dll lives)
Add-Type -Path "$scriptdir\AzureFunctionsForSharePoint.Core.dll"

$config = New-Object AzureFunctionsForSharePoint.Core.ClientConfiguration

#Pretty print output to the PowerShell host window
ConvertTo-Json -InputObject $config -Depth 4

#Send to clipboard
ConvertTo-Json -InputObject $config -Depth 4 -Compress | clip

When a new client config.json is needed, all one must do is run the command from Task Runner Explorer.

Pretty cool eh?

–Doug Ware

Introducing Azure Functions for SharePoint

I'm excited to announce the first public release of Azure Functions for SharePoint, a powerful but inexpensive-to-operate open source backbone for SharePoint add-ins. We've been using Azure Functions in production for a while now, and I love it!

I’ll be speaking about Azure Functions next Saturday, January 21, 2017 at Cloud Saturday Atlanta. You should come!

About Azure Functions for SharePoint

AzureFunctionsForSharePoint is a multi-tenant, multi-add-in back-end for SharePoint add-ins built on Azure Functions. The goal of this project is to provide the minimal set of functions necessary to support the common scenarios shared by most SharePoint provider hosted add-ins cheaply and reliably.

Features include:

  • Centralized Identity and ACS token management
  • Installation and provisioning of add-in components to SharePoint
  • Remote event dispatching to add-in-specific back-end services via message queues, including:
    • App installation
    • App launch
    • SharePoint Remote Events

Navigating the Documentation

These documents consist of articles that explain what the functions do, how to set up the hosting environment, and how to use the functions in your add-ins, as well as API documentation for .NET developers linked to the source code on GitHub.

A Note on Terminology

These documents use the term client to refer to a given SharePoint add-in. A client is identified by its client ID, the GUID that identifies the add-in's ACS registration in its AppManifest.xml.


There are three functions in this function app.

  1. AppLaunch
  2. EventDispatch
  3. GetAccessToken

Setup Guide

We're working on full automation with an ARM template, and the Visual Studio solution includes a PowerShell script you can use with Task Runner Explorer and Command Task Runner. Until then, create a function app and copy the contents of this zip file into the function app's wwwroot folder.

Configuring the Function App

Until the automation is fully baked, you can use this video to guide you through the relatively easy setup of the function app.

Configuring SharePoint Add-ins to use the Function App

Azure Functions for SharePoint is multi-tenant in two senses: it can service add-ins installed broadly across SharePoint Online, and the back-end processes that respond to client-specific events in SharePoint, or that rely on Azure Functions for SharePoint for security token management, can be located anywhere with a connection to the Internet.

See the Client Configuration Guide for more information.

Using the Function App to Support Custom Back-ends

It is possible to use Azure Functions for SharePoint to deliver pure client-side solutions, i.e. HTML/JS. However, many add-ins must support scenarios that are difficult or impossible to achieve through pure JavaScript. Azure Functions for SharePoint supports custom back-ends in two ways:

  1. Notification of add-in and SharePoint events through Azure Service Bus queues via the EventDispatch Function
  2. A REST service that provides security access tokens for registered clients via the GetAccessToken Function

In both cases the client back-end receives all the information it needs to connect to SharePoint as either the user or as an app-only identity with full control. The function app does the actual authorization flow and its client configuration is the only place where the client secret is stored.

Your custom back-ends can live anywhere from the same Function App where you deployed Azure Functions for SharePoint to completely different Azure tenancies or on-premises servers. All that is required is that the back-end can read Azure Service Bus Queues and access the REST services via the Internet. Aside from these requirements, the back-end can run on any platform and be written in any language.

That said, if you are using .NET, this project includes an assembly named AzureFunctionsForSharePoint.Common that you can use to make things even easier!

API Docs

For complete documentation of the Azure Functions for SharePoint API, see the API Guide.

Want to Contribute to this Project?

We’re still working on that too, but please reach out to me if you want to help!


Receiving BrokeredMessages Instead of Strings with Service Bus Queue Triggers

When you create a new Azure Function with a Service Bus queue trigger, the initial run.csx takes a string as input and looks like this:
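It is essentially the following (a close approximation of the generated template; the exact text may differ slightly):

```csharp
using System;

public static void Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}
```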

The benefit of this is that the function infrastructure hides all of the complexity of the service bus from you, making everything nice and simple. However, I like to get the real message object instead of just its payload, because the message's properties support a number of useful scenarios. Among these properties is the message's ContentType, which is useful when the bus delivers more than one type of message.

It isn't obvious from the documentation how to get the brokered message instead of a simple string, and there is a wrinkle involved if you like to do as I do and deliver the bulk of your functionality in compiled assemblies.

Scenario #1 – C# Script Only

As your function is being triggered by a service bus message, it makes sense that Microsoft.ServiceBus.dll, home of BrokeredMessage, is loaded by default. You don't need to do anything other than reference it and change the method signature, and you can leave your function.json file alone. In this case I changed the binding in function.json so that the name is receivedMessage instead of myQueueItem, but that's only because I felt like it! 😊

You can write your function as follows:
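Here is a minimal sketch; the logging line is illustrative, and receivedMessage is the binding name mentioned above:

```csharp
#r "Microsoft.ServiceBus"

using System;
using Microsoft.ServiceBus.Messaging;

public static void Run(BrokeredMessage receivedMessage, TraceWriter log)
{
    // The full BrokeredMessage exposes properties like ContentType
    log.Info($"Received message with ContentType: {receivedMessage.ContentType}");
}
```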

Notice that the assembly import line does not include an extension. If you include one, you will get a compile error!

Scenario #2 – Compiled Assemblies

If you are using compiled assemblies, the story is a little more nuanced and also potentially more dangerous because the potential for a version mismatch exists.

Assemblies deployed to the function go into the bin folder below the run.csx file. Generally, what this entails is copying all of the build output from your project to bin. When you do this, it becomes possible to reference the assembly using the dll file extension as follows:
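For example, with a hypothetical assembly named MyFunctions.dll copied into bin, the reference looks like this:

```csharp
#r "MyFunctions.dll"

using MyFunctions;
```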

This compiles because the assembly is in bin. You can still leave the file extension out. But, should you?

To test this out I made a little test assembly that references a very old version (1.8) of Microsoft.ServiceBus.dll. It returns a string that contains the assembly name along with the message's content type.

The calling function logs the assembly name and then the output of the compiled function.

Somewhat surprisingly, this works and the output of the function’s compilation and execution looks like this:

As expected, everything is using the version of the assembly that was loaded by Azure Functions (3.0) and not the version referenced by the compiled project (1.8). I actually expected to get a runtime error or at the very least a warning!

If I remove the file extension, it still works, but at least this time I get a warning!

The Moral of the Story

That this works at all has a lot to do with the fact that the parts of the BrokeredMessage class I am using are compatible between versions 1.8 and 3.0. Had I written the test differently, it would not have worked.

There is clearly some danger here: a function developer needs to know when they are using assemblies that will already be loaded and that may not match the build. The function compilation does not recompile the deployed assemblies and has no way to know about this potential runtime mismatch, but it can detect that what the C# Script file uses conflicts with what is in the bin folder, as long as you leave the file extension off of the reference.

–Doug Ware

An Azure Functions Design Pattern for Your Consideration

Without question, Azure Functions is my favorite new offering from Microsoft in 2016. One reason I like it so much is that it is extremely flexible, scalable, and inexpensive, making it useful in a wide range of scenarios. Another reason is that you can create functions using your choice of nine different languages (many of these are experimental as of this writing). Naturally, each of these languages has its own nuances.

Here at InstantQuick we have several functions in production based upon C#. Naturally, the documentation for C# is comparatively good, there is some best practices guidance, and there are tools for Visual Studio in preview. However, in each of these cases, the primary focus is C# Script, not C# compiled to assembly dll files. Fortunately, the documentation does describe how to load assemblies in a variety of ways.

Minimizing the Use of CSX

The file extension of a C# Script file is CSX. By default a C# based function has a single CSX file named run.csx. You can, however, reference other C# Script files and share C# Script files between functions. So, you can theoretically build a complex solution using nothing but C# Script, and for very small functions written to augment something like a Logic App, C# Script makes perfect sense. However, in our case, and I suspect in many others, we want to deliver the bulk of our functionality as built assemblies for a few important reasons:

  1. We are moving existing functionality previously developed in a traditional manner to functions
  2. Sometimes, functions aren’t an appropriate delivery vehicle and we want to host the functionality in traditional cloud services or on premises
  3. The tooling for CS files is, at the moment, much better than the tooling for CSX files

Pattern for run.csx

Most of our run.csx files look like this:

#r "AppLaunch.dll"
#r "FunctionsCore.dll"
using System.Net;
using System.Configuration;
using System.Net.Http.Formatting;
using AppLaunch;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
   Log(log, $"C# HTTP trigger function processed a request! RequestUri={req.RequestUri}");
   var func = new AppLaunchHandler(req);
   func.FunctionNotify += (sender, args) => Log(log, args.Message);
   var appLauncherFunctionArgs = new AppLauncherFunctionArgs()
   {
      StorageAccount = ConfigurationManager.AppSettings["ConfigurationStorageAccount"],
      StorageAccountKey = ConfigurationManager.AppSettings["ConfigurationStorageAccountKey"]
   };
   return func.Execute(appLauncherFunctionArgs);
}

public static void Log(TraceWriter log, string message)
{
   log.Info(message);
}

The code does only five things:

  1. Loads the assemblies that do the actual work
  2. Receives the input from the function's trigger via its bindings, including a TraceWriter for logging
  3. Gets the configuration from the Function App's configuration settings
  4. Invokes the code that does the real work, passing the function input and the configuration values and returning its output
  5. Receives logging notifications as events and logs them via the TraceWriter

Avoiding Dependencies on Azure Functions

The very first function I wrote included a dependency on the Web Jobs SDK for logging. Since one of our primary needs is to be able to host the functionality outside of Azure Functions, that wasn’t something I could keep doing. The workers should be delivered as plain old class libraries with minimal dependency on the runtime environment. To that end I wrote a simple little base class that uses an event to raise notifications. This allows the hosting environment to deal with that information in whatever way is needed.

The base class and event delegate look like this:

using System;

namespace FunctionsCore
{
   public delegate void FunctionNotificationEventHandler(object sender, FunctionNotificationEventArgs eventArgs);

   public class FunctionNotificationEventArgs : EventArgs
   {
      public string Message { get; set; }
   }

   public class FunctionBase
   {
      public event FunctionNotificationEventHandler FunctionNotify;

      public void Log(string message)
      {
         FunctionNotify?.Invoke(this, new FunctionNotificationEventArgs { Message = message });
      }
   }
}

A subclass can then simply notify the caller of anything interesting via the Log method as follows:

Log($"Error creating view {ex}");
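In context, a call like that might live in a worker class such as the following sketch (ViewCreator and its body are hypothetical; the FunctionBase plumbing is the point):

```csharp
using System;

namespace FunctionsCore
{
    public class ViewCreator : FunctionBase
    {
        public void CreateView()
        {
            try
            {
                // ...do the real work here...
            }
            catch (Exception ex)
            {
                // Raises FunctionNotify; the host decides what to do with the message
                Log($"Error creating view {ex}");
            }
        }
    }
}
```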

How Much Logging is Too Much Logging?

The SDK’s best practices guidance notes that excessive amounts of logging can slow your function down. I suppose there may be times when that is a concern, but most of our functions are not interactive and still execute in the range of sub-second to a few seconds. The cost savings we get from using a consumption plan instead of dedicated cores is so dramatic that whatever extra CPU time the logging causes is a non-issue. So, my opinion is that minimizing the logging is a premature optimization and not great advice.


The final piece of the puzzle is how the deployment works. The current preview tools for Visual Studio are not especially helpful when it comes to working with regular old C#, and that is being generous. Instead, we use plain old class library projects, Visual Studio's Task Runner Explorer with the Command Task Runner extension, and PowerShell scripts. The full script that can also create the Function App via Azure Resource Manager is a work in progress, but when it's done, I'll write another blog post and publish the project template to GitHub.

–Doug Ware

SharePoint Online App Infrastructure Outage – Not Yet Resolved

Update (Sept 15, 9:02 AM ET): The issue is still ongoing and intermittent, with lots of mentions on various social networks. The problem described on the Office 365 service status page still doesn't mention the general issues with provider-hosted apps but only focuses on workflows, which it claims are in recovery. In related news, Azure appears to be having widespread outages today as well.

Update (Sept 14, 5:02 PM ET): The issue appears to be resolved at this time. If you are still having trouble, please leave a comment.

Update (Sept 14, 12:36 PM ET): A few others have reached out to me on twitter reporting similar problems with their apps and tenants. In another environment we’ve seen JavaScript CSOM fail for a site collection administrator performing basic operations with access denied and similar operations fail in SharePoint Designer with 403 Permission Denied. Whatever is causing this seems to be a systemic issue in the SharePoint Online auth layer. Additionally, the Office 365 Service Status Page now says:

SP77334 - SharePoint Features - Service degradation

Service degradation - Sep 14, 2016 11:23 AM

Current Status: We're investigating system logs to determine the source of the issue.

User Impact: Users may be unable to execute SharePoint 2013 workflows. Additionally, users may be unable to view SharePoint 2010 workflows.

Scope of Impact: A few customers have reported this issue, and our analysis indicates that this issue may potentially affect any of your users attempting to execute or view SharePoint workflows.

Start Time: Monday, September 12, 2016, at 12:56 PM UTC

Next Update by: Wednesday, September 14, 2016, at 5:30 PM UTC

I'll bet you a dollar that what they see in the system logs indicates some sort of auth failure! 😉 Today might be a good day to do work that doesn't require SharePoint Online.

Original: There seems to be a major problem with SharePoint Online today that we first observed yesterday evening. I'm seeing it here and a few customers have reported it already, although the Office 365 service status dashboard currently shows no issues at the time of this writing. When you try to launch Instant Consulting Practice or Instant Legal Practice you may see the following error. Note that this is our error screen; other apps will behave differently, but all apps on our tenants are currently affected.

Basically, what that means is that SharePoint is sending an empty request when you try to launch the app instead of including the values an app needs to connect to SharePoint. If you are affected by this, please open a support ticket with Microsoft.

–Doug Ware

Comparing InstantQuick SharePoint Provisioning and OfficeDev PnP

The purpose of this post is to help you understand when to use InstantQuick SharePoint Provisioning and when to use OfficeDev PnP-PowerShell by contrasting the two. At the outset I will admit I am biased as I created the former, but I also use the latter. And, not to put the conclusion before the comparison, I think both have a place in your toolbox. Each PowerShell module gets its functionality by wrapping a set of class libraries that you can use independently of the module in your own solutions. A comparison of the PowerShell modules is therefore also a comparison of the class libraries they wrap, which are likewise open source and located on GitHub.

Where InstantQuick SharePoint Provisioning Shines – Templated Provisioning

PnP and IQ each offer the ability to create provisioning templates by reading from an existing site and to deploy the resulting templates to other SharePoint sites. PnP has a number of other very useful and granular features. At the time of this writing PnP offers 170 (!) individual PowerShell commands. Most of these are not for creating or applying provisioning templates, but are instead for administration and manipulation of individual elements of a SharePoint tenant or site. Examples include such diverse commands as Send-SPOMail, Enable-SPOFeature, and Get-SPOTimeZoneId. Most of the commands wrap types from the SharePoint Client Object Model, which are extended by OfficeDevPnP.Core.dll.

I'll talk about these PnP commands in general in the next section. In this section I'm comparing the primary PnP commands dedicated to templating and template provisioning: Get-SPOProvisioningTemplate and Apply-SPOProvisioningTemplate.

PnP has two different package formats and there are a few other commands in PnP for the management and conversion of the templates themselves that are outside the scope of this comparison as they don’t affect the actual provisioning functionality.

In contrast IQ has only 24 PowerShell commands, is considerably less granular, and is more or less focused on creating, installing, and uninstalling provisioning templates. IQ offers two types of templates: AppManifest and SiteDefinition. An AppManifest describes fields, content types, lists, etc., and a SiteDefinition defines a hierarchy of one or more Webs, each with a corresponding AppManifest. To keep this comparison simple, I'll focus on only the two commands that most closely match PnP: Get-WebCreator and Install-IQAppManifest.

For a more detailed look at the IQ commands in action, see the samples on GitHub. For this comparison I am using the Board of Directors Site sample. If you'd like to play along at home, use IQ to create a site from this sample, use PnP to create a template from the sample site, and then apply the PnP template to another site for comparison – do not use the same site collection or the items will collide!

Get-SPOProvisioningTemplate Versus Get-WebCreator

Each of these commands creates a template by reading from a site that contains customizations. Get-SPOProvisioningTemplate determines what to include by comparing the site to an XML document embedded in the PnP stack, whereas Get-WebCreator compares the site with the customizations to another site to produce a delta. Here at InstantQuick we prefer the latter because we don't have to maintain a static template as Microsoft changes SharePoint between versions, but primarily because we treat AppManifests as modules we combine to produce larger customizations. For example, our Practice Manager apps are composed of many different manifests, some of which are shared, making customization and maintenance much easier. To create a custom solution for a client we can install a base solution to a site, modify it, and extract the delta as a new package. To deploy it, we install the AppManifests in order so that the delta takes precedence over the original.

Big Difference #1 – Support for Pages with Web Parts

One side-effect of the PnP approach is that there are different installers for different versions of SharePoint. At present there are three: SP2013, SP2016, and SharePoint Online. This can be a pain if you work with more than one version of SharePoint. The core PnP solution uses build configurations and conditional compilation to deal with the differences between versions, whereas IQ has one build configuration with conditional logic and heuristics to decide how to behave at runtime based on the server's version.

If you are using a version of PnP compiled for a version where the Client Object Model doesn't support a particular operation, for example reading a Web Part definition from a page in SharePoint 2013, PnP will simply skip those steps. Among IQ's conditional logic is code that will fall back to older SOAP-based APIs as necessary to read Web Parts and 2010-style workflows.

PnP therefore doesn’t support Web Part, Wiki, or Publishing Pages that contain Web Parts for anything but the newest versions of SharePoint and (as far as I can tell) doesn’t support SharePoint 2010 workflows at all.

Big Difference #2 – Site Specific Fixups

Both Get-SPOProvisioningTemplate and Get-WebCreator will include the source site’s home page in the template by default. The screen shots below are from sites created by IQ and PnP respectively. The PnP version was created from a template produced by using Get-SPOProvisioningTemplate against the IQ version and then applying the template to a web in a different site collection.

The second site’s home page looks pretty good, but unfortunately all of the URLs in the template point to the original site. This includes the navigation links, the image’s source URL, etc.

Get-WebCreator tokenizes a variety of things including URLs, list IDs, and group IDs in links, fields, web parts, and files. Install-IQAppManifest substitutes the correct values during provisioning and does things in the proper order when necessary. Both Get-SPOProvisioningTemplate and Apply-SPOProvisioningTemplate simply read the values and reproduce them on the target without modification.

Example one from IQ

Example two from PnP


Big Difference #3 – Creation of Complete Templates

If you are following along at home, there are a number of other differences you will quickly notice: the Board Events list has a workflow that is missing in the PnP site, the Meeting Minutes library has a custom document template that is missing in the PnP version, and several of the lists have custom view pages that are missing in the PnP version. To be fair, Get-WebCreator won't include all of these files by default as part of the differencing process, but it will if you use the -Options switch to specify the lists and libraries with items you wish to include. You can also extend a created template using IQ commands such as Get-FileCreatorAndFolders and Get-ListCreatorListItems.

PnP has the ability to extend the provider with custom handlers for such situations, and the Office PnP samples repository has samples for just about anything you might want to do, but to get a complete template generally means you will have to understand PnP at a fairly deep level and be willing to write a bunch of code.

Where OfficeDev PnP-PowerShell Shines – Formal and AdHoc Admin Scripting

If you are responsible for a SharePoint environment and you use PowerShell, I have no reservations saying that you should be using OfficeDev PnP-PowerShell.

You need to add a file to 1000 sites? Add-SPOFile
You need to clean up some dodgy Custom Action that’s breaking a site? Remove-SPOCustomAction
You need to ….? I could go on all day because PnP has 170 really useful commands!

I could wax poetic all day about the good things in PnP, but it isn't my project. 😊

–Doug Ware



The InstantQuick SharePoint Provisioning Engine is Now Open Source and on GitHub

The InstantQuick SharePoint Provisioning stack is the easiest to use and most complete SharePoint provisioning library currently available. It is the core engine used in the InstantQuick line of products and can read from and provision to SharePoint 2013, SharePoint 2016, and SharePoint Online.

This repository includes the .NET class libraries we use at InstantQuick and a companion PowerShell module.

Minimal Path to Awesome

  1. Download and Install the PowerShell Module – Setup
  2. Visit the wiki
  3. Pick one of the three samples and follow the instructions

About the Project

InstantQuick SharePoint Provisioning predates Office PnPCore by a couple of years and differs in that it is designed to be a complete and turnkey provisioning engine that is easy to use with minimal setup, as opposed to being an extensible demonstration project of the SharePoint Client Object Model and CSOM development patterns. It offers more features out of the box for provisioning, including the ability to read and provision Web Part Pages, Wiki Pages, Publishing Pages, and 2010-style workflows against versions of SharePoint that do not support the latest SharePoint Client Object Model APIs by falling back to older APIs as needed.

If it sounds like we are bashing the PnPCore stuff, we aren’t. This project even includes some of its (properly attributed) code! If you are looking for a great library to extend, it might well be a better choice. But we think this one is likely to satisfy most scenarios with less setup and without the need to extend the base functionality (or even understand how it uses the API’s).

As with the Microsoft Patterns and Practices library, InstantQuick SharePoint Provisioning can generate templates by comparing a customized site to a base site. Unlike the PnP engine, you can easily include any file in the site (including publishing pages and page layouts) without writing code or otherwise extending the library. It also has the capability to provision site hierarchies and to install and/or remove multiple template manifests as a single operation.


InstantQuick SharePoint Provisioning can read and recreate the following out of the box

  • Webs and subwebs
  • Fields
  • Content Types
  • Lists and Libraries with or without custom views
  • List items
  • Documents
  • Folders
  • Web Part Pages
  • Wiki Pages
  • Publishing Pages
  • Master Pages
  • Page layouts
  • Display templates
  • Composed looks and themes
  • Other arbitrary file types with or without document properties
  • Feature activation and deactivation
  • Permission levels
  • Groups
  • Role assignments (item permissions)
  • Top and left navigation
  • Document templates
  • 2010 Workflows
  • Managed metadata fields and list item values
  • Site, Web, and List custom actions
  • AppWeb navigation surfaces
  • Remote Event Receivers
  • …and more

SharePoint Sandbox Rescue Services Available!

Normally, I would not do this, but the clock is running…

If you have critical sandbox solutions that must be fixed before Microsoft pulls the plug on sandbox code and renders them unusable, please contact me for a consultation immediately! This week we assembled a team of experts who understand sandbox solutions and how to migrate most scenarios. Naturally, our availability is finite, so please don't wait until the last minute to reach out.

Good luck!
–Doug Ware

Six Things to Know About the Newly Announced SharePoint Framework

Today at The Future of SharePoint event in San Francisco Microsoft made several important announcements and released quite a bit of information. Without a doubt, the new SharePoint Framework was the biggest bombshell. As a longtime SharePoint MVP and founder of an ISV that sells SharePoint add-ins and tools, I was fortunate to be in Redmond early in the process and more recently to offer feedback. In fact, I and my associate, fellow MVP Dan Attis, actually got to spend a few days playing with the new bits at a recent Dev Kitchen in exchange for our feedback. Here are my top six big picture thoughts on the new stuff.

#1: You Are Not Screwed

One thing that was immediately noticeable in my recent interactions with the SharePoint team is that they are excited. They believe in this stuff and they want to share their new awesomeness with you. They think you’ll love it as they do! For many of us though, the most likely initial reaction is to be very disturbed. It’s the kind of feeling I imagine a homeowner gets when she receives the Notice of Eminent Domain for the road widening that will destroy her front yard.

So, the first thing you need to know is that the existing UI upon which your house stands is not scheduled to be bulldozed. SharePoint 2016 is not shipping with the new experience. The features it does ship with reach the end of mainstream support in 2021 and extended support in 2026. Those dates represent the earliest possible end of support.

But what about SharePoint Online?

There as well, you can expect to be able to use the current experience for a long time to come. The hybrid story depends upon it, but more importantly Microsoft has huge customers that are in the process of moving to SharePoint Online from on-prem environments based on the current UI.

In fact, if you have significant investments in custom solutions based on the current UI, you should be feeling a sense of relief. Most such solutions have some level of dependency on UI things Microsoft has said should not be depended upon and we all live with some level of fear that we will wake up one day to find one of the dependencies changed and our solution is broken. Microsoft’s new approach makes this much less likely.

The only reason to be concerned is if you are an ASP.Net Web Forms developer who hasn’t made moves to learn modern web development – in which case it is well past time you hit the books. I promise that it is easier than what you are doing now which is why the rest of us have moved on.

#2: It Restores a Key Value Proposition of SharePoint as a Platform

A big reason for SharePoint’s success is that it is based on ASP.Net Web Forms. Ten years ago, a big chunk of enterprise web developers used ASP.Net as their primary tool. This meant that SharePoint development was pretty approachable to many enterprise developers. Since then though, times have changed. New developers aren’t learning ASP.Net Web Forms. What was a big strength has now become a big weakness and traditional SharePoint development skills are niche skills in today’s job market.

This is a problem for everyone in SharePoint land that needs developers, including Microsoft. There are people working as developers on the SharePoint platform who were not old enough to use the kitchen stove when ASP.Net Web Forms came out!

The new SharePoint Framework means that SharePoint development is now in step with mainstream web development once again, except this time you have a lot more choices. It also matters less what choices you make because the framework offers a better way to isolate your solution while also being as deeply integrated as is required.

Furthermore, the new framework embraces the fact that you need the ability to participate in the page’s real DOM. It no longer forces the use of things like iframes and weird URLs. You don’t need to be clever or go against the grain as is often required now to build the types of customizations required by the business you are in.

#3: Open Processes Work Better

I’ve been focused on SharePoint for a decade and I’ve been an MVP for the last five years. For most of that time, what Microsoft intended to deliver was completely set in stone by the time anyone outside the SharePoint team knew what was coming. As a stakeholder, the best you could do was complain loudly in hopes that the thing would be better in three years when the next version came out. In some areas, the Office development teams have fully embraced open source software and open development. In this case they opened up on the core platform for the first time and got feedback at several points in the process. I think the result will be much closer to ‘good’ out of the gate than I had previously come to expect because this time around I’ve actually seen them change their minds based on feedback from outside their own team!

I admire the SharePoint team for how far they have come in this regard. Opening yourself up to criticism is not easy. Personally, I hate it.

I could sense the stress at the Dev Kitchen. It must have been terrifying for each of them. At dinner on the last night, after what was clearly a successful event I could sense the relief. They’d deliberately put themselves through the wringer and survived, but they all looked like they were going to fall asleep!

Make no mistake – it is still early. The first releases of this stuff are guaranteed to have holes and annoying flaws, but things should improve quickly because…

#4: The Biggest Issue with Modern SharePoint as a Platform is Being Fixed

In my opinion, the single biggest factor in the limited success of sandbox solutions and the first versions of the app/add-in model was not that each lacked or blocked functionality that was easily accessible in farm solutions. No, the biggest issue with modern SharePoint as a platform was that the SharePoint development team did not use either model themselves. They had no real skin in the game and no internal incentive to fix what was always somebody else’s problem.

The SharePoint Framework does not just align SharePoint development with modern web development. It also more closely aligns the SharePoint development team with the developers who are customizing SharePoint. Unless they make a habit of cheating, the really big flaws are going to get fixed.

#5: Microsoft Needs Your Help

All that said, there is still some cause for concern. The document library preview release left many scratching their heads. The fact that they dropped something like that into first release without even putting it on the roadmap was hard to understand. And the absence of support for fundamental features we and PnP rely upon, such as script custom actions or jsLink, was just dumb.

Please note that I know dumb when I see it. As a very experienced and well-respected software developer and business owner I work in dumb like other artists work in clay or oil.

As an outsider it’s fun to speculate where that sort of dumb comes from. Is there a powerful manager that hates PnP and the rest of us so much that they would withhold a couple key lynchpins to see it fail? That would make a great story, but still, it is hard to believe one person could generate that amount of dumb. It probably required a whole team!

I believe that, for a long time, the SharePoint team generally thought that the use of either of these missing features was a hack that should be discouraged because script injection was bad and therefore they were bad. Perhaps now that SharePoint is embracing script injection, people still subconsciously want to kill those features but don’t realize their ‘flaws’ are now seen as virtues. This is a sort of diffuse cloud of dumb that is common in groups with a long history, even where all of the individuals are very smart.

Or maybe the dumb wasn’t dumb at all. Maybe the people focused on building the new document library had a narrow focus that didn’t include any integration scenarios.

Whatever the reason, feedback is to dumb as sunshine is to nasty germs. Their development process is now built to generate lots of feedback before delivering and to act upon it. You can help by getting involved and giving lots of feedback in the public Yammer network and on SharePoint’s User Voice page.

Speaking of User Voice and the missing document library features…

#6: It’s not Vaporware

The Dev Kitchen I attended had several accomplished SharePoint development experts, but it also included a good number of people from ISVs who didn’t have much, if any, experience working with SharePoint. I saw people working on OS X, Linux, and Windows. There were a number of scripted hands-on labs, but we also had free time to implement our own solutions. Over the course of three days I saw several really impressive samples from this diverse bunch of people.

As for myself, I took a moderately complex sample that uses AngularJS and Bootstrap for pluggable components and wrapped it in the new stuff. It took very little time, and the end result worked in both the current UI and in the new UI. I should probably mention that I was pretty unlucky and was in pain from a kidney stone during much of the event! Even in my diminished and pitiful state I was able to work completely outside the scripted labs, and everything worked!

Powering SharePoint customizations…