If you’ve read my earlier post on authentication of actions invoked in a Microsoft Teams MessageCard, you’ve probably seen that the only useful information we get in the user’s token is the Object Id (`oid`).

{
  "iat": 1560799130,
  "ver": "STI.ExternalAccessToken.V1",
  "appid": "48afc8dc-f6d2-4c5f-bca7-069acd9cc086",
  "sub": "bc6c3ca0-5acd-4cd4-b54c-f9c83925e7e3",
  "appidacr": "2",
  "acr": "0",
  "tid": "4b1fa0f3-862b-4951-a3a8-df1c72935c79",
  "oid": "b26c3c10-5fad-4cd3-b54c-f9283922e7e2",
  "iss": "https://substrate.office.com/sts/",
  "aud": "https://serverlessdevops.azurewebsites.net",
  "exp": 1560800030,
  "nbf": 1560799130
}

While this is nice, it doesn’t really tell us much.

However, because we have the object id, we can use it to query Azure Active Directory, check who this user actually is, and implement some authorization logic based on that information.

When I was searching for a workable piece of code describing how to access AAD and retrieve users from it, the information I found was…not very useful. In the end, I found something workable and I’ll be sharing my solution in this post.

How to set up my application in AAD

In order to do something inside AAD, you need to have an identity over there. Since we’re creating an application (console or API), we need to create an Application Registration.

To do this, navigate to your Azure Active Directory blade inside the Azure Portal and create a new `App registration`. I’ve called mine `ConsoleGraph` because I’m creating a console application to query my AAD.

overview of app registration

On the overview page, you can see both the `Directory (tenant) ID` and `Application (client) ID`. You’ll be needing these later on.

Next thing you need to do is add a new secret to this application. It doesn’t matter much what you call this secret; just be sure to remember/copy the value, as it’s very important and you won’t be seeing it again inside the Azure Portal.

application secret blade

Now that you have all of the details for this application, you still need to make sure this application has the appropriate permissions to query the Azure Active Directory.

Navigate to the `API permissions` blade. Over there, you should be able to add the permission to `Read directory data` from the AAD.

api permissions overview

Your application, `ConsoleGraph` in this sample, will now be able to read all data from the AAD. If you’re very keen on security, you might want to strip down the permissions a bit, but this is good enough for me.

How to connect my application to AAD

Making a connection to AAD was the trickiest part for me. You have to create an `ActiveDirectoryClient` and an `AuthenticationContext` which is able to acquire tokens from AAD.

While this all works, you HAVE to know which settings to use for each specific property. Making a typo or messing up a setting will result in you not receiving a valid token, and the error messages aren’t very useful (to me).

Put your application settings in a configuration file

The details of the `Application registration` should be put inside some configuration file, like an app.config file. In a production system, you might want to store the secret values in a more secure system, like Azure KeyVault. For this sample project the app.config is good enough.

Having added the details to your app.config file, your `appSettings` will probably look something like this.

<appSettings>
	<add key="TenantId" value="b1f9cb25-7c7a-4ecd-96c1-513c2b42c350"/>
	<add key="TenantName" value="myTentantName.onmicrosoft.com"/>
	<add key="ClientId" value="d82c0c6a-8c14-4c42-8aca-60c79fcfc9b4"/>
	<add key="ClientSecret" value="27?_MOh_qM633Hcccct;cw:@*$9ojcsNxve)rYI"/>
</appSettings>

I’ve created a small `Settings` class where all of these values are loaded in the appropriate format.

using System.Configuration;

internal class Settings
{
    public const string ResourceUrl = "https://graph.windows.net";
    public static string TenantId => ConfigurationManager.AppSettings["TenantId"];
    public static string TenantName => ConfigurationManager.AppSettings["TenantName"];
    public static string ClientId => ConfigurationManager.AppSettings["ClientId"];
    public static string ClientSecret => ConfigurationManager.AppSettings["ClientSecret"];
    public static string AuthString => "https://login.microsoftonline.com/" + TenantName;
}

Getting the right values for the `AuthString` and the `ResourceUrl` was the hardest part. The posts I found on the internet weren’t very helpful, as each post used different values and it wasn’t very clear to me what they were for. Eventually, I found the values above to work for me.

Connect to AAD

Connecting to AAD is fairly straightforward.
As I mentioned, you need to create an `ActiveDirectoryClient` and use an `AuthenticationContext` in order to retrieve valid tokens.

I’ve used the following block of code to connect to Azure Active Directory and retrieve data from it.

public static ActiveDirectoryClient GetActiveDirectoryClientAsApplication()
{
    Uri servicePointUri = new Uri(Settings.ResourceUrl);
    Uri serviceRoot = new Uri(servicePointUri, Settings.TenantId);
    ActiveDirectoryClient activeDirectoryClient = new ActiveDirectoryClient(
        serviceRoot,
        async () => await AcquireTokenAsyncForApplication());
    return activeDirectoryClient;
}

private static async Task<string> AcquireTokenAsyncForApplication()
{
    AuthenticationContext authenticationContext = new AuthenticationContext(Settings.AuthString, false);

    ClientCredential clientCred = new ClientCredential(Settings.ClientId, Settings.ClientSecret);
    AuthenticationResult authenticationResult =
        await authenticationContext.AcquireTokenAsync(
            Settings.ResourceUrl,
            clientCred);
    string token = authenticationResult.AccessToken;
    return token;
}

How to retrieve data from AAD

By using the Active Directory helper method from the code block above you’re able to query everything inside your AAD.

The `ActiveDirectoryClient` can query all of AAD, including the users.
I’ve used it myself to iterate through all of the users and print them one per line. You can also use the client to retrieve a specific user by querying on the `ObjectId`, which will result in a single user.

var client = AuthenticationHelper.GetActiveDirectoryClientAsApplication();

try
{
	var users = await client.Users.OrderBy(user => user.DisplayName).ExecuteAsync();
	var foundUser = await client.Users.Where(user => user.ObjectId == "d62d8c6a-dc69-46c1-99c4-36cd672f0c12").ExecuteAsync();
	foreach (var user in users.CurrentPage)
	{
		Console.WriteLine(user.DisplayName + " " + user.ObjectId);
	}
}
catch (Exception exception)
{
	Console.WriteLine(exception);
}

By using this `ActiveDirectoryClient`, you can now start to authorize users based on their details, such as a role, group, or e-mail address.
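
As an example, here’s a minimal sketch of such a check, reusing the `AuthenticationHelper` from above. The `IsAuthorizedAsync` method and the allow-list of e-mail addresses are hypothetical names I made up for this illustration, not part of the sample repository.

// Hypothetical allow-list of users who may invoke the endpoint.
private static readonly string[] AllowedEmailAddresses = { "jan@jan-v.nl" };

public static async Task<bool> IsAuthorizedAsync(string objectIdFromToken)
{
    var client = AuthenticationHelper.GetActiveDirectoryClientAsApplication();

    // Look up the user behind the object id we received in the token.
    var result = await client.Users.Where(user => user.ObjectId == objectIdFromToken).ExecuteAsync();
    var user = result.CurrentPage.FirstOrDefault();

    // Unknown object id, or a user who isn’t on the allow-list: deny the request.
    return user != null
           && AllowedEmailAddresses.Contains(user.UserPrincipalName, StringComparer.OrdinalIgnoreCase);
}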

If you’re interested in a ready-to-go sample, you can check out my GitHub repository containing all the details and a working console application.

I hope this post helps others, because it has surely taken up too much of my time to figure out exactly what I needed to do in order to retrieve user data from AAD.

Being able to create Message Cards or Actionable Messages in Microsoft Teams via a Logic App or an Azure Function is great. Especially if you can use this to invoke logic on your API and update the message in the Teams channel.

However, you don’t want everyone to invoke a management API endpoint you’ve exposed to ‘do stuff’ in your cloud environment. Normally, you’d want to authenticate the user pressing the button (read: invoking the endpoint).

Lucky for us, this is very doable when invoking the endpoint via a Teams MessageCard/Actionable Message.

The token

Because Microsoft Teams is part of the Office 365 suite, you will be logged in as a user on the tenant. Therefore, the software has a user context and is able to pass this along to your API via a JWT Bearer token.

If you log in to the web client of Microsoft Teams (https://teams.microsoft.com) with your favorite browser you’ll be able to find the token which belongs to you.

In order to test this, I’ve created a new MessageCard in my Teams channel with 1 `potentialAction` which will invoke an Azure Function.

messagecard with AuthorizationTest button

If you open up the network tab of your browser’s Developer Tools and press the AuthorizationTest button you’ll see the request is made to a Teams endpoint called `executeAction` with a bearer token in the `Authorization` header.

request in network tab to executeAction

When decoding this token at https://jwt.io/ you’ll see a lot of details which match with your Office 365 user.

{
  "aud": "https://api.spaces.skype.com",
  "iss": "https://sts.windows.net/4b1fa0f3-862b-4951-a3a8-df1c72935c79/",
  "iat": 1560882424,
  "nbf": 1560882424,
  "exp": 1560886324,
  "acct": 0,
  "acr": "1",
  "aio": "AVQAq/8LBACA8+mMRGmy37A7sPouo42hawcsCtG7iqUz//lmEAUCmK67lc2GmhtZIA2LM+1nw18wtIeREMejFpXpmH7uUsKbZGQYV3vyRRmlH7guw3JTBuk=",
  "amr": [
    "pwd",
    "mfa"
  ],
  "appid": "5e3ce6f0-2b1f-4285-8a4b-75ec7a757346",
  "appidacr": "0",
  "family_name": "de Vries",
  "given_name": "Jan",
  "ipaddr": "211.107.84.235",
  "name": "Jan de Vries",
  "oid": "b26c3c10-5fad-4cd3-b54c-f9283922e7e2",
  "puid": "10037FFF9443BDEA",
  "scp": "user_impersonation",
  "sub": "U02i9QRWudZWzrZeQzhaPLpgsGo0go4qjBk5A8Qv1-g",
  "tid": "4c1fc0f3-8c2b-4c51-c3a8-df3c72936c79",
  "unique_name": "jan@jan-v.nl",
  "upn": "jan@jan-v.nl",
  "uti": "avUcwdSBc0SXZfbcANocAA",
  "ver": "1.0"
}

My original assumption was that this would be the token which is also sent to your backend API. I was ready to use this information to authenticate the user and authorize whether they were allowed to access the specific endpoint.

This assumption, however, is incorrect!

The token you’ll receive in your API has the following content.

{
  "iat": 1560799130,
  "ver": "STI.ExternalAccessToken.V1",
  "appid": "48afc8dc-f6d2-4c5f-bca7-069acd9cc086",
  "sub": "bc6c3ca0-5acd-4cd4-b54c-f9c83925e7e3",
  "appidacr": "2",
  "acr": "0",
  "tid": "4b1fa0f3-862b-4951-a3a8-df1c72935c79",
  "oid": "b26c3c10-5fad-4cd3-b54c-f9283922e7e2",
  "iss": "https://substrate.office.com/sts/",
  "aud": "https://serverlessdevops.azurewebsites.net",
  "exp": 1560800030,
  "nbf": 1560799130
}

As you can see, the content is very different. This (much) smaller token is still useful, as it contains the `tid`, which is your tenant identifier, and the `oid`, which is the object identifier of the user who pressed the button.
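
To give an idea of how to get hold of these two values in your API, here’s a minimal sketch using the `System.IdentityModel.Tokens.Jwt` package. Note that `ReadJwtToken` only decodes the token, it does not validate it; validation is covered in the next section. The `rawBearerToken` variable is a hypothetical placeholder for the incoming token without the `Bearer ` prefix.

// Decode (not validate!) the incoming token and read the tenant and object identifiers.
var jwt = new JwtSecurityTokenHandler().ReadJwtToken(rawBearerToken);
var tenantId = jwt.Claims.First(claim => claim.Type == "tid").Value;
var objectId = jwt.Claims.First(claim => claim.Type == "oid").Value;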

How to validate

On GitHub, you can find an (old) repository containing some sample code which you can use to validate the incoming bearer token from Teams. This repository can be found over here: https://github.com/OfficeDev/o365-actionable-messages-utilities-for-dotnet

The validation logic can be found in the `ActionableMessageTokenValidator`.

var o365OpenIdConfig = await _configurationManager.GetConfigurationAsync(new CancellationToken());
var result = new ActionableMessageTokenValidationResult();

var parameters = new TokenValidationParameters
{
  ValidateIssuer = true,
  ValidIssuers = new[] { O365OpenIdConfiguration.TokenIssuer },
  ValidateAudience = true,
  ValidAudiences = new[] { targetServiceBaseUrl },
  ValidateLifetime = true,
  ClockSkew = TimeSpan.FromMinutes(TokenTimeValidationClockSkewBufferInMinutes),
  RequireSignedTokens = true,
  IssuerSigningKeys = o365OpenIdConfig.SigningKeys
};

ClaimsPrincipal claimsPrincipal;
var tokenHandler = new JwtSecurityTokenHandler();

try
{
  // This will validate the token's lifetime and the following claims:
  // 
  // iss
  // aud
  //
  SecurityToken validatedToken;
  claimsPrincipal = tokenHandler.ValidateToken(token, parameters, out validatedToken);
}
catch (Exception)
{
  // The original validator catches token validation exceptions here and marks the result as failed.
  return result;
}

What we’re doing here is creating the validation parameters and performing the actual validation. The `O365OpenIdConfiguration` contains some constants which hold true for every MessageCard action.

If using an Azure Function for your API endpoint, your token validation code might look similar to the following piece of code.

[FunctionName("AuthorizationTest")]
public static async Task Run(
	[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] 
	HttpRequest request,
	ILogger log)
{
	log.LogInformation($"Executing {nameof(AuthorizationTest)}.");
			
	var bearerToken = request.Headers["Authorization"].ToString();
	var baseUrl = $"{request.Scheme}{Uri.SchemeDelimiter}{request.Host.Value}";

	var validationResult = new ActionableMessageTokenValidationResult();
	try
	{
		var tokenValidator = new ActionableMessageTokenValidator();
		validationResult = await tokenValidator.ValidateTokenAsync(bearerToken.Replace("Bearer ", ""), baseUrl);
	}
	catch(Exception ex)
	{
		log.LogError(ex, "Validation failed");
	}
	// the rest of your logic...

}

After the bearer token has been validated, you might want to add some additional logic to check whether the request was made from a tenant you trust and/or whether the user (Object Id) matches someone who is allowed to invoke the endpoint.
If you want to do this, you’ll have to create some authorization logic for it yourself.
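
As an illustration, here’s a minimal sketch of what such authorization logic could look like. The trusted tenant id and allowed object id below are simply the example values from the token shown earlier; replace them with whatever rule makes sense for your environment.

// Example allow-lists for the authorization check; hypothetical, not part of the sample repository.
private const string TrustedTenantId = "4b1fa0f3-862b-4951-a3a8-df1c72935c79";
private static readonly HashSet<string> AllowedObjectIds =
    new HashSet<string>(StringComparer.OrdinalIgnoreCase)
    {
        "b26c3c10-5fad-4cd3-b54c-f9283922e7e2"
    };

private static bool IsAuthorized(string tenantId, string objectId)
{
    // Only accept requests coming from our own tenant and from known users.
    return string.Equals(tenantId, TrustedTenantId, StringComparison.OrdinalIgnoreCase)
           && AllowedObjectIds.Contains(objectId);
}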

The full code sample which I’ve used in my Azure Function can be found in my ServerlessDevOps GitHub repository.

It has taken us some time to figure all of this out, but it’s working like a charm now!

Azure Functions are great! HTTP triggered Azure Functions are also great, but there’s one downside: all HTTP triggered Azure Functions are publicly available. While this might be useful in a lot of scenarios, it’s also quite possible you don’t want ‘strangers’ hitting your public endpoints all the time.

One way you can solve this is by adding a small bit of authentication on your Azure Functions.

For HTTP triggered Functions you can specify the level of authority one needs in order to execute them. There are five levels to choose from: `Anonymous`, `Function`, `Admin`, `System`, and `User`. When using C# you can specify the authorization level in the HttpTrigger attribute; you can, of course, also specify it in the function.json file. If you want a Function to be accessible by anyone, the following piece of code will work because the authorization level is set to Anonymous.

[FunctionName("Anonymous")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)]
    HttpRequestMessage req,
    ILogger logger)
{
    // your code

    return req.CreateResponse(HttpStatusCode.OK);
}

If you want to use any of the other levels, just change the AuthorizationLevel enum value to the level of access you want. I’ve created a sample project on GitHub containing several Azure Functions with different authorization levels, so you can test out the differences yourself. Keep in mind that when running Azure Functions locally, the authorization attribute is ignored and you can call any Function, no matter which level is specified.

Whenever a level other than Anonymous is used, the caller has to specify an additional parameter in the request in order to get authorized to use the Azure Function. Adding this additional parameter can be done in two different ways.
The first one is adding a `code` parameter to the querystring whose value contains the secret necessary for calling the Azure Function. When using this approach, a request can look like this: https://[myFunctionApp].azurewebsites.net/api/myFunction?code=theSecretCode

Another approach is to add this secret value in a header of your request. If this is your preferred way, you can add a header called `x-functions-key` whose value has to contain the secret code to access the Azure Function.

There are no real pros or cons to either of these approaches; it depends on your solution’s needs and how you want to integrate these Functions.
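
For illustration, here’s a minimal sketch of both options using `HttpClient`; the Function App URL and key are placeholders for your own values.

var client = new HttpClient();

// Option 1: pass the key via the 'code' querystring parameter.
var viaQueryString = await client.GetAsync(
    "https://myFunctionApp.azurewebsites.net/api/myFunction?code=theSecretCode");

// Option 2: pass the key via the 'x-functions-key' request header.
var request = new HttpRequestMessage(HttpMethod.Get, "https://myFunctionApp.azurewebsites.net/api/myFunction");
request.Headers.Add("x-functions-key", "theSecretCode");
var viaHeader = await client.SendAsync(request);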

Which level should I choose?

Well, it depends.

The Anonymous level is pretty straightforward, so I’ll skip this one.

The User level is also fairly simple, as it’s not supported (yet). From what I understand, this level can and will be used in the future in order to support custom authorization mechanisms (EasyAuth). There’s an issue on GitHub which tracks this feature.

The Function level should be used if you want to give some other system (or user) access to this specific Azure Function. You will need to create a Function Key which the end-user/system will have to specify in the request they are making to the Azure Function. If this Function Key is used for a different Azure Function, it won’t get accepted and the caller will receive a 401 Unauthorized.

The only ones left are the Admin and System levels. Both are fairly similar: they work with so-called Host Keys. These Host Keys work for all Azure Functions deployed in the same Function App. This is different from the earlier mentioned Function Keys, which only work for one specific function.
The main difference between the two is that System only works with the _master host key, while the Admin level should also work with all of the other defined host keys. I’m writing the word ‘should’ because I couldn’t get this level to work with any of the other host keys; both only appeared to work with the defined _master key. This might have been a glitch at the time, but it’s good to investigate once you get started with this.

How do I set up those keys?

Sadly, you can’t specify them via an ARM template (yet?). My guess is this will never be possible, as it’s something you want to manage yourself per environment. So how to proceed? Well, head to the Azure Portal and check out the Azure Function you want to view or create keys for.

You can manage the keys for each Azure Function in the portal and even create new ones if you like.


I probably don’t have to mention this, but just to be on the safe side: you don’t want to distribute these keys to a client-side application. The reason is pretty obvious: the key is sent in the request, so it isn’t a secret anymore. Anyone can inspect the request and see which key is sent to the Azure Function.
You only want to use these keys (both Function Keys and Host Keys) when making requests between server-side applications. This way your keys are never exposed to the outside world, which minimizes the chance of a key getting compromised. If for some reason a key does become compromised, you can Renew or Revoke it.

One gotcha!

There’s one gotcha when creating an HTTP Triggered Function with the Admin authorization level. You can’t prefix these Functions with the term ‘Admin’. For example, when creating a function called ‘AdministratorActionWhichDoesSomethingImportant’ you won’t be able to deploy and run it. You’ll receive an error stating there’s a routing conflict.

[21-8-2018 19:07:41] The following 1 functions are in error:
[21-8-2018 19:07:41] AdministratorActionWhichDoesSomethingImportant : The specified route conflicts with one or more built in routes.

Or, when navigating to the Function in the portal, you’ll see a similar error message pop up.


Probably something you want to know in advance, before designing your complete API with Azure Functions.

There’s a relatively new feature available in Azure called Managed Service Identity (MSI). What it does is create an identity for a service instance in the Azure AD tenant, which in turn can be used to access other resources within Azure. This is a great feature, because you no longer have to create and maintain identities for your applications yourself. All of this management is handled for you when using a System Assigned Identity. There’s also an option to use User Assigned Identities, which work a bit differently.

Because I’m an Azure Function fanboy and want to store my secrets within Azure Key Vault, I was wondering if I was able to configure MSI via an ARM template and access the Key Vault from an Azure Function without specifying an identity by myself.

As with most things, setting this up is rather easy once you know what to do.

The ARM template

The documentation states you can add an `identity` property to your Azure App Service in order to enable MSI.

"identity": {
    "type": "SystemAssigned"
}

This setting is everything you need in order to create a new service principal (identity) within the Azure Active Directory. This new identity has the exact same name as your App Service, so it should be easy to identify.

If you want to check out yourself if everything worked, you can check the AAD Audit Log. It should have a couple of lines stating a new service principal has been created.


You can also check out the details of what has happened by clicking on those lines.


Not very interesting, until something is broken or needs debugging.

An easier method to check if your service principal has been created is by checking the Enterprise Applications within your AAD tenant. If your deployment has been successful, there’s an application with the same name as your App Service.


Step two in your ARM template

After having added the identity to the App Service, you now have access to the `tenantId` and `principalId` which belong to this identity. These properties are necessary in order to give your App Service identity access to the Azure Key Vault. If you’re familiar with Key Vault, you probably know there are some Access Policies you have to define in order to get access to specific areas in the Key Vault.

Figuring out how to retrieve these new App Service properties was, for me, the hardest part of this whole post. Eventually I figured it out, thanks to an answer on Stack Overflow. What I ended up doing is retrieving a reference to the App Service in the `accessPolicies` block of the Key Vault resource and using its `identity.tenantId` and `identity.principalId`.

"accessPolicies": [
{
  "tenantId": "[reference(concat('Microsoft.Web/sites/', parameters('webSiteName')), '2018-02-01', 'Full').identity.tenantId]",
  "objectId": "[reference(concat('Microsoft.Web/sites/', parameters('webSiteName')), '2018-02-01', 'Full').identity.principalId]",
  "permissions": {
    "keys": [],
    "secrets": [
      "get"
    ],
    "certificates": [],
    "storage": []
  }
}],

Easy, right? Well, if you’re an ARM-template guru probably.

Now deploy your template again and you should be able to see your service principal being added to the Key Vault access policies.


Because we’ve specified the identity has access to retrieve (GET) secrets, in theory we are now able to use the Key Vault.

Retrieving data from the Key Vault

This is actually the easiest part. There’s a piece of code you can copy from the documentation pages, because it just works!

var azureServiceTokenProvider = new AzureServiceTokenProvider();
var keyvaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
            
var secretValue = await keyvaultClient.GetSecretAsync($"https://{myVault}.vault.azure.net/", "MyFunctionSecret");
            
return req.CreateResponse(HttpStatusCode.OK, $"Hello World! This is my secret value: `{secretValue.Value}`.");

The above piece of code retrieves a secret from the Key Vault and shows it in the response of the Azure Function. When calling the Function in a browser, you’ll see the secret value in the response.


The `KeyVaultTokenCallback` is exclusively meant for use with Key Vault (hence the name). If you want to use MSI with other Azure services, you will need to use the `GetAccessTokenAsync` method in order to retrieve an access token for that service.
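
For example, here’s a minimal sketch of retrieving a token for the Azure Resource Manager API with the same `AzureServiceTokenProvider`; the resource URI is just one possible target.

var azureServiceTokenProvider = new AzureServiceTokenProvider();

// Request an access token for another Azure service, in this case Azure Resource Manager.
var accessToken = await azureServiceTokenProvider.GetAccessTokenAsync("https://management.azure.com/");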

So, that’s all there is to it in order to make your Azure Function or Azure environment a bit safer with these managed identities.
If you want to check out the complete source code, it’s available on GitHub.

I totally recommend using MSI, because it’ll make your code, software, and templates much safer and more secure.

I’m in the process of adding an ARM template to an open source project I’m contributing to. All of this was pretty straightforward, until I needed to add some secrets and connection strings to the project.

While it’s totally possible to integrate these secrets in your ARM parameter file or in your continuous deployment pipeline, I wanted to do something a bit more advanced and secure. Of course, Azure Key Vault comes to mind! I’ve already used this in some of my other ASP.NET projects and Azure Functions, so nothing new here.

The thing is, the projects I’ve worked on always retrieved the secrets from Key Vault like in the following example:

"adminPassword": {
    "reference": {
        "keyVault": {
        "id": "/subscriptions/<subscription-id>/resourceGroups/examplegroup/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "examplesecret"
    }
}

While this isn’t a bad thing per se, I don’t like having the `subscription-id` hardcoded in this configuration, especially when doing open source development. Because other people can’t access my Key Vault, they’ll run into trouble when deploying this template. Therefore, I started investigating whether this subscription id can be added dynamically.

Introducing the Dynamic Id

Lucky for us, the ARM team has us covered! By changing the earlier mentioned configuration a bit, you’re able to use the function `subscription().subscriptionId` in order to get your own subscription id.

"adminPassword": {
    "reference": {
        "keyVault": {
        "id": "[resourceId(subscription().subscriptionId,  parameters('vaultResourceGroup'), 'Microsoft.KeyVault/vaults', parameters('vaultName'))]"
        },
        "secretName": "[parameters('secretName')]"
    }
},

The downside, though: this doesn’t work in your parameter file!

It also doesn’t work in your normal ARM template!

So what’s left? Well, using ARM templates in combination with nested templates! Nested templates are the key to using this dynamic id. Nested templates aren’t something I enjoy using, because it’s easy to get lost in all of those open files.

Well, enough sample configuration for now; let’s see what this looks like in an actual file.

{
    "apiVersion": "2015-01-01",
    "name": "nestedTemplate",
    "type": "Microsoft.Resources/deployments",
    "properties": {
        "mode": "Incremental",
        "templateLink": {
            "uri": "[concat(parameters('templateBaseUri'), 'my-nested-template.json')]",
            "contentVersion": "1.0.0.0"
        },
        "parameters": {
            "resourcegroup": {
                "value": "[parameters('resourcegroup')]"
            },
            "hostingPlanName": {
                "value": "[parameters('hostingPlanName')]"
            },
            "skuName": {
                "value": "[parameters('skuName')]"
            },
            "skuCapacity": {
                "value": "[parameters('skuCapacity')]"
            },
            "websiteName": {
                "value": "[parameters('websiteName')]"
            },
            "vaultName": {
                "value": "[parameters('vaultName')]"
            },
            "mySuperSecretValueForTheAppService": {
                "reference": {
                    "keyVault": {
                        "id": "[resourceId(subscription().subscriptionId,  parameters('resourcegroup'), 'Microsoft.KeyVault/vaults', parameters('vaultName'))]"
                    },
                    "secretName": "MySuperSecretValueForTheAppService"
                }
            }
        }
    }
}

In order to use the dynamic id, you have to add it to the `parameters`-section of the nested template resource. Anywhere else in the process is too early or too late to retrieve those values. Ask me how I know…

The observant reader might also notice me using the `templateLink` property with a URI inside.

"templateLink": {
    "uri": "[concat(parameters('templateBaseUri'), 'my-nested-template.json')]",
    "contentVersion": "1.0.0.0"
}

This is because you can only use these functions when the nested template is located in a (public) remote location. This is another reason why I don’t really like this approach: linking to a remote location means you can’t use the templates which are located inside the artifact package you are deploying. There is an issue on the feedback portal asking to support local file locations, but it’s not implemented (yet).

For now we just have to copy the template(s) to a remote location during the CI build process (or do some template-extraction-and-publication magic in the deployment pipeline). Whenever the CD pipeline runs, it’ll use the templates which were pushed to this remote location. Sounds like a lot of work? That’s because it is!

You might wonder what this nested template looks like. Well, it looks a lot like a ‘normal’ template.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "resourcegroup": {
            "type": "string"
        },
        "hostingPlanName": {
            "type": "string",
            "minLength": 1
        },
        "skuName": {
            "type": "string",
            "defaultValue": "F1",
            "allowedValues": [
                "F1",
                "D1",
                "B1",
                "B2",
                "B3",
                "S1",
                "S2",
                "S3",
                "P1",
                "P2",
                "P3",
                "P4"
            ],
            "metadata": {
                "description": "Describes plan's pricing tier and instance size. Check details at https://azure.microsoft.com/en-us/pricing/details/app-service/"
            }
        },
        "skuCapacity": {
            "type": "int",
            "defaultValue": 1,
            "minValue": 1,
            "metadata": {
                "description": "Describes plan's instance count"
            }
        },
        "websiteName": {
            "type": "string"
        },
        "vaultName": {
            "type": "string"
        },
        "mySuperSecretValueForTheAppService": {
            "type": "securestring"
        }
    },
    "variables": {},
    "resources": [{
            "apiVersion": "2015-08-01",
            "name": "[parameters('hostingPlanName')]",
            "type": "Microsoft.Web/serverfarms",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "HostingPlan"
            },
            "sku": {
                "name": "[parameters('skuName')]",
                "capacity": "[parameters('skuCapacity')]"
            },
            "properties": {
                "name": "[parameters('hostingPlanName')]"
            }
        },
        {
            "apiVersion": "2015-08-01",
            "name": "[parameters('webSiteName')]",
            "type": "Microsoft.Web/sites",
            "location": "[resourceGroup().location]",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverFarms/', parameters('hostingPlanName'))]"
            ],
            "tags": {
                "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "empty",
                "displayName": "Website"
            },
            "properties": {
                "name": "[parameters('webSiteName')]",
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
            },
            "resources": [{
                "name": "appsettings",
                "type": "config",
                "apiVersion": "2015-08-01",
                "dependsOn": [
                    "[resourceId('Microsoft.Web/Sites/', parameters('webSiteName'))]"
                ],
                "tags": {
                    "displayName": "appSettings"
                },
                "properties": {
                    "MySuperSecretValueForTheAppService": "[parameters('mySuperSecretValueForTheAppService')]"
                }
            }]
        }
    ],
    "outputs": {}
}

This nested template is responsible for creating an Azure App Service with an Application Setting containing the secret we retrieved from Azure Key Vault in the main template. Pretty straightforward, especially if you’ve worked with ARM templates before.

If you want to see the complete templates & solution, check out my GitHub repository with these sample templates.

The deployment

All of this configuration is fun and games, but does it actually do the job?

Only one way to find out, and that’s setting up a proper deployment pipeline! I’m most familiar with VSTS, so that’s the tool I’ll be using.

Create a new Release, add a new artifact pointing to the location of your templates, and create a new environment.

For testing purposes, this environment only needs to have a single step based on the `Create or Update Resource Group`-task.

In this task you will need to select the ARM template file, along with the parameters file you want to use. All of the parameters I don’t want to specify in the parameters file, or want to override, I’m placing in the `Override template parameters` section. Most important is the `templateBaseUri` parameter, which contains the base URI of the location where the nested template(s) are stored.



It makes sense to override this setting, as it’s quite possible you don’t want to use the GitHub location here, but rather some location in a public blob container created by your CI build.

Now save this pipeline and queue a release.

If all goes well, the deployment will fail with a `KeyVaultParameterReferenceNotFound` error.

At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.
Details:
BadRequest: {
"error": {
"code": "KeyVaultParameterReferenceNotFound",
"message": "The specified KeyVault '/subscriptions/[subscription-id]/resourceGroups/nested-template-sample/providers/Microsoft.KeyVault/vaults/nested-template-vault' could not be found. Please see https://aka.ms/arm-keyvault for usage details."
}
} undefined
Task failed while creating or updating the template deployment.


This makes sense as we’re trying to retrieve a secret from the Azure Key Vault which doesn’t exist yet!

If you head down to the Azure Portal and check out the resource group, you’ll notice both the resource group and the Key Vault have been created.


The only thing which we need to do is add the `MySuperSecretValueForTheAppService` to the Key Vault.


Once it’s added we can try the release again. All steps should be green now.


You can verify in the resource group that both the hosting plan and the App Service have now been created.


Zooming in on the Application Settings of the App Service you’re also able to see the secret value which has been retrieved from Azure Key Vault!


Proof the dynamic id is working when combined with a nested template!

Too bad a `securestring` is still rendered in plain text on the portal, but that’s a completely different issue.


It has taken me quite some time to figure out all of the above steps, probably because I’m no CI/CD expert. I hope this post will also help others who aren’t experts on the matter.