Node Sass could not find a binding for your current environment


Now, this is a strange one. Suddenly, my Task Runner Explorer in VS2015 stopped loading Sass with this error:
Node Sass could not find a binding for your current environment

Not the right solutions

VS gives you a helpful suggestion along with the error: “Run `npm rebuild node-sass` to build the binding for your current environment.”
So you do that and everything is fine, right? WRONG. It doesn’t help.

Also, the node-sass troubleshooting guide has a section about “Debugging installation issues”. So you try that. Doesn’t help either.


But then, at the bottom of that same guide, you see this: “Using node-sass with Visual Studio 2015 Task Runner”.
Now you’re in luck. Switching the external web tools to use the system-wide installed versions does actually help. After following the description on the linked page, Task Runner Explorer loads your Sass module again!

Determine if a .NET assembly was built for x86 or x64 with PowerShell

An alternative to using the CorFlags CLI tool to determine the processor architecture of a .dll file is to use PowerShell and reflection:

C:\> [reflection.assemblyname]::GetAssemblyName("${pwd}\TargetAssembly.dll") | fl

Name : TargetAssembly
Version :
CultureInfo :
CodeBase : file:///C:/path/to/file/...
EscapedCodeBase : file:///C:/path/to/file/...
ProcessorArchitecture : MSIL
Flags : PublicKey
HashAlgorithm : SHA1
VersionCompatibility : SameMachine
KeyPair :
FullName : TargetAssembly, Version=, Culture=neut...

Other values for ProcessorArchitecture are:

  • Amd64: A 64-bit AMD processor only.
  • Arm: An ARM processor.
  • IA64: A 64-bit Intel processor only.
  • MSIL: Neutral with respect to processor and bits-per-word.
  • None: An unknown or unspecified combination of processor and bits-per-word.
  • X86: A 32-bit Intel processor, either native or in the Windows on Windows environment on a 64-bit platform (WOW64).

(see MSDN: ProcessorArchitecture Enumeration)

Rename a team project on VSO with Git source control

As other companies do, we also start our projects with an internal project name. Ours might not be as ambitious as some out there (e.g. “Project Spartan”) but the subject applies anyway. There’s a moment in every project when you want to go public, even if it’s an entirely internal project. In this case, ‘public’ might relate to the company instead of the whole wide world.

At this stage, you have to decide on the product name. After a lengthy discussion with the entire team, you either have a cool name (e.g. “UPETO”) or you decide to go with the project name you had from the start. If the former is the case, you probably want to reflect the new name across all documentation, SharePoint sites, namespaces and source control.

If your project is hosted on VSO, there is a simple solution. Basically, you

  • go to your project admin page in VSO and
  • in the project’s overview tab edit the name.

Then, you have to update all existing references to the Git repositories. If you already use VS 2015, it’s as easy as updating the remotes in the repository settings. If you still use VS 2013, as I did when I renamed our project, then the easiest way is to use the command prompt:

  • open a developer command prompt
  • go to the location of your local clone of the repository
  • issue this command:
    git remote set-url origin <new remote url>

Microsoft published a guide on this topic as well.

But be aware that the command they posted in their guide is wrong. Be sure to use the one I posted here, i.e. “git remote set-url” and not “git set-url remote”.
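As a quick sanity check from the command prompt (the account and repository names below are placeholders):

```shell
# inspect the current remote
git remote -v

# point origin at the renamed project (placeholder URL)
git remote set-url origin https://myaccount.visualstudio.com/DefaultCollection/_git/NewProjectName

# confirm the change took effect
git remote -v
```

The second `git remote -v` should list the new URL for both fetch and push.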

Code on and have fun!

Git remote branch not visible in Visual Studio

Multiple people working on the same code has its pitfalls. One of them is seeing remote branches in Visual Studio when using Git as source control. If someone else created a branch you don’t see it immediately in Visual Studio. Even hitting the refresh button doesn’t help.
Refresh button not enough

What you do is go to “Unsynced commits” and hit “Fetch”. Usually, you would do that to fetch remote changes. But it also helps to discover new remote branches.

After a second or two, you’ll see the missing remote branch and can create a new local one, as you usually would.
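Under the hood, the Fetch button simply runs git fetch. From a command prompt, the same discovery looks like this (the branch name is a placeholder):

```shell
# download new commits and remote-tracking branches from origin
git fetch origin

# list remote branches, including any freshly discovered ones
git branch -r

# create a local branch that tracks the new remote branch
git checkout -b feature-x origin/feature-x
```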

Code on and have fun!

Boot Windows from a VHD (Native Boot)

Virtual disk images (VHDs) can not only be used in a virtual machine; modern computers can also boot directly into an OS that is installed on a virtual disk.

After creating a virtual disk and installing Windows on it (e.g. by bootstrapping from an ISO file), the VHD needs to be registered with the boot loader. The following PowerShell script takes care of all the steps needed to do so:

param(
    [Parameter(Mandatory=$true)]
    [ValidateScript({Test-Path $_ -PathType 'Leaf'})]
    [string]$SourceVhd
)

Import-Module hyper-v

# Some variables
$targetDiskPath = Get-Item $SourceVhd

# Mount disk
Mount-VHD $targetDiskPath.FullName
$driveLetter = (Get-DiskImage -ImagePath $targetDiskPath.FullName | Get-Disk | Get-Partition | Get-Volume | Where-Object {$_.FileSystemLabel -ne "System Reserved"}).DriveLetter

# Add boot entry
$vmWindowsPath = "$($driveLetter):\Windows"
$exeBcdboot = Join-Path $vmWindowsPath "System32\bcdboot.exe"

# pass the Windows path and the switch as separate arguments
& $exeBcdboot $vmWindowsPath /addlast

# Unmount disk
Dismount-VHD $targetDiskPath.FullName

Just call this script from an elevated PowerShell prompt and pass the path to a VHD that already has Windows installed. After restarting the computer, an additional boot entry is displayed.


Bootstrap a Virtual Machine based on a Windows Install Medium

On the TechNet Script Center there is a great PowerShell script that handles the basic steps to create a sysprepped VHDX: WIM2VHD for Windows 8 and Windows 8.1. It works for Windows 7/Server 2008 R2, Windows 8/8.1/Server 2012/R2 and even for Windows 10 installation media.

To create a VHDX based on “Windows_8.1-en.iso”, for example, just open a PowerShell prompt and type:

.\Convert-WindowsImage.ps1 -SourcePath C:\Path\To\Image\Windows_8.1-en.iso -VHDFormat VHDX -VHDPartitionStyle GPT -VHDPath C:\Path\to\VirtualDisk\Windows81.vhdx -Edition "Professional" -Verbose

After that, you can use the “Hyper-V Manager” to create a new VM, attach this disk, boot into the brand-new system and configure it accordingly.





Nancy host in Azure Worker Role

For a lightweight back-end service we were looking for a suitable solution on Microsoft Azure. As we are building an Azure Cloud Service, there are basically two possibilities: Web Role or Worker Role; the main difference between the two being the presence of IIS. So, one way to go would be to build a Web API on a Web Role. This, however, seemed too heavy for a simple service which essentially provides data input/output capabilities without much business logic. Enter Nancy.

Our solution is to create a Nancy host on a Worker Role. This is a pretty lightweight solution, maybe topped only by a custom implementation. Here’s how we do it:

In the Worker Role project we need the NuGet packages

  • Nancy
  • Nancy.Hosting.Self

In the Worker Role’s entry point class

public class WorkerRole : RoleEntryPoint

we need

using Nancy.Hosting.Self;


Then, in the Run method we open the NancyHost like this:

private void Run(CancellationToken cancellationToken)
{
     const string endpointName = "Endpoint1";
     var roleInstance = RoleEnvironment.CurrentRoleInstance;
     var internalEndpoint = roleInstance.InstanceEndpoints[endpointName].IPEndpoint;
     var uri = new Uri("http://" + internalEndpoint);

     using (var host = new NancyHost(uri))
     {
          host.Start();
          while (!cancellationToken.IsCancellationRequested)
          {
               // keep the role alive until cancellation is requested
               Thread.Sleep(1000);
          }
     }
}
  • line 3: The name of the endpoint defined in the Worker Role’s endpoint settings.
  • lines 4 & 5: Get the endpoint to open the host on.
  • line 8: Create the NancyHost on the Role endpoint.
  • line 10: Start the host.

To react to external calls we add a NancyModule:

namespace BackEnd.Host
{
    public class IndexModule : NancyModule
    {
        public IndexModule(IRepository repository)
        {
            Get["/ReturnData"] = parameters => Response.AsJson(repository.GetData());
        }
    }
}

This enables other roles in the Cloud Service to call our Nancy host at the address “/ReturnData” to get the required data. The data access logic we defer to a repository, which we inject into the module. For this constructor injection to work, we need to register the implementation. To achieve this, we create a custom bootstrapper and take advantage of TinyIoC’s AutoRegister method (as long as we have only one implementation of IRepository in the assembly).

public class Bootstrapper : DefaultNancyBootstrapper
{
     protected override void ConfigureApplicationContainer(TinyIoCContainer container)
     {
          container.AutoRegister(new[] { typeof(IRepository).Assembly });
     }
}

For more details on Nancy have a look at the documentation on GitHub.

Code on and have fun!

Register and log on to a Web API 2 from MVC – part 2

In part 1, we talked about how to register a user on a Web API 2 from an MVC application. In this post, we are taking the next step: logging in.

Log in

The endpoint to call on the API for logging in is:

http://yourAPIaddress/token

That’s right, the endpoint for logging in is the token endpoint, i.e. logging in actually means requesting an access token. To call the endpoint and get a token, we use a library provided by Thinktecture. To get the library, install the NuGet package

Thinktecture.IdentityModel.Client
Then, after adding

using Thinktecture.IdentityModel.Client;

you have access to the client:

var client = new OAuth2Client(new Uri("http://yourAPIaddress/token"));
TokenResponse tokenResponse = await client.RequestResourceOwnerPasswordAsync(userName, password);

Now, we inspect the received TokenResponse:

if (!tokenResponse.IsError)
{
    var tokenExpiresBy = DateTime.Now.AddSeconds(tokenResponse.ExpiresIn);
    string accessToken = tokenResponse.AccessToken;
}

With this access token, we can now send authorized requests to the API:

using (var client = new HttpClient())
{
    // attach the token so the API authorizes the request
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
    return await client.GetAsync(new Uri("http://yourAPIaddress/someRestrictedResource"));
}

Code on and have fun!

Register and log on to a Web API 2 from MVC – part 1

With the current ASP.NET Web Application project templates in Visual Studio 2013, you get a very nice starting point for some authentication scenarios, e.g. logging on to a Web API using local or external accounts. Unfortunately, the templates, as well as most examples you find on the Web, interact with the Web API directly, i.e. by calling a Web page on the Web API itself.
In our applications, we have a different situation: we log on to the Web API from a separate MVC application which serves as the client interface. Assume we have a Web API with the default setup from the VS template.


Essentially, what you need to do for registering a new user is call the API at this endpoint:

http://yourAPIaddress/api/account/register

So, let’s post to that endpoint:

using (var client = new HttpClient())
{
   await client.PostAsync(new Uri("http://yourAPIaddress/api/account/register"), content);
}

Now, what’s the “content”?

var dict = new Dictionary<string, string>
           {
               {"Email", model.Email},
               {"Password", model.Password},
               {"ConfirmPassword", model.ConfirmPassword}
           };
var content = new FormUrlEncodedContent(dict);

If the HttpResponseMessage returned from our post signals success, the user has been registered successfully. But, you’ll probably want to add additional steps before a user is able to log on to the API, for example email verification. Regarding this topic, there is a good article describing the general setup on the official ASP.NET page: Create a secure ASP.NET MVC 5 web app with log in, email confirmation and password reset. We’ll cover our adaptation of it in a future blog post.

Code on and have fun!

Use Autofac in MVC app

We all have our favourites when it comes to dependency injection containers. For apps, Autofac has really grown on me.

So, for those who want to give it a try, here’s how you set up your MVC 5 application to use Autofac as a dependency injection container:

  1. Install the Autofac.Mvc5 nuget package.
  2. In Global.asax, add
    using Autofac;
    using Autofac.Integration.Mvc;
  3. In the Application_Start method, add
    var builder = new ContainerBuilder();
    builder.RegisterControllers(typeof(MvcApplication).Assembly);
    builder.RegisterAssemblyTypes(<a referenced assembly>).AsImplementedInterfaces();
    var container = builder.Build();
    DependencyResolver.SetResolver(new AutofacDependencyResolver(container));

Let’s see what’s going on here:

  • line 2: Register all controllers of the MVC application itself. This is needed because we are going to replace the dependency resolver later on.
  • line 3: Register all types implementing an interface in a referenced assembly, e.g. a shared library.
  • line 5: Set autofac as the dependency resolver for the application.

Code on and have fun!