For our automated deployments we have several Azure Organizational accounts in place. These are created within the Azure Active Directory.

Because these accounts are meant for services, we don’t want them to inherit the default password policy, which forces a password renewal every X days. Lucky for us, you can configure this via PowerShell. A short how-to is written on MSDN.

The thing that isn’t written (or referenced) over there is how to run the MSOL cmdlets.

I kept getting the message `The term 'Set-MsolUser' is not recognized`. By searching a bit on this error I found a thread on the Office365 community forums where someone mentioned the “Microsoft Online Service Module for Windows PowerShell”. This set me searching in the right direction. Apparently you need to install an extra PowerShell module on your system in order to use the MSOL cmdlets. These cmdlets are part of the Office365 and Exchange Online services. A page with download links is provided by Microsoft Support. It links to the Microsoft Online Service Sign-in Assistant for IT Professionals and the Azure Active Directory Module for Windows PowerShell (32-bit and 64-bit).
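Once both installers have finished, the cmdlets become available after importing the module. As a quick sanity check (MSOnline is the module name the Azure Active Directory Module registers on your system):

# Check whether the module is available on this machine.
Get-Module -ListAvailable MSOnline
# Import it so the Msol cmdlets can be used in the current session.
Import-Module MSOnline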

Once installed, you are finally able to use the MSOL cmdlets. Keep in mind though, you have to connect to the MSOL services first using the connection cmdlet.

Connect-MsolService -Credential $azureADCredentials

After connecting to the service, you can change the service account’s password behavior to `Password Never Expires`.

For reference, this is the script I’ve used when changing the service account password policies:

function Set-CustomerAzureSubscription($subscriptionName)
{
    # $azureSubscriptionUsername and $azureSubscriptionPassword are defined elsewhere in the script.
    $azureSubscriptionSecurePassword = ConvertTo-SecureString -String $azureSubscriptionPassword -AsPlainText -Force
    $azureCredentials = New-Object System.Management.Automation.PSCredential($azureSubscriptionUsername, $azureSubscriptionSecurePassword)

    Get-AzureAccount
    Add-AzureAccount -Credential $azureCredentials
    Get-AzureSubscription | ForEach-Object { Write-Host "Customer subscription: $($_.SubscriptionName)." }

    Write-Host "Selecting $($subscriptionName) as default Customer subscription."
    Select-AzureSubscription -SubscriptionName "$($subscriptionName)"
}

function Set-PasswordNeverExpiresForServiceAccounts($serviceAccountUsername, $serviceAccountPassword)
{
    $azureADCredentialsSecurePassword  = ConvertTo-SecureString -String $serviceAccountPassword -AsPlainText -Force
    $azureADCredentials = New-Object System.Management.Automation.PSCredential($serviceAccountUsername, $azureADCredentialsSecurePassword)
    Write-Host "Connecting to MSOL"
    Connect-MsolService -Credential $azureADCredentials

    Write-Host "Password never expires status of $($serviceAccountUsername)"
    Get-MsolUser -UserPrincipalName $serviceAccountUsername | Select PasswordNeverExpires
    Write-Host "Setting password never expires status of $($serviceAccountUsername) to 'true'"
    Set-MsolUser -UserPrincipalName $serviceAccountUsername -PasswordNeverExpires $true
    Write-Host "Password never expires status of $($serviceAccountUsername)"
    Get-MsolUser -UserPrincipalName $serviceAccountUsername | Select PasswordNeverExpires
}

Set-CustomerAzureSubscription $devSubscription
Set-PasswordNeverExpiresForServiceAccounts $devServiceAccount $devPassword

Set-CustomerAzureSubscription $accSubscription
Set-PasswordNeverExpiresForServiceAccounts $accServiceAccount $accPassword

Set-CustomerAzureSubscription $prodSubscription
Set-PasswordNeverExpiresForServiceAccounts $prodServiceAccount $prodPassword
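As a quick check afterwards you can list the status for all accounts while still connected to MSOL. A minimal sketch; Get-MsolUser without parameters returns all users in the directory:

Get-MsolUser | Select-Object UserPrincipalName, PasswordNeverExpires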

The past couple of days I’ve had the pleasure of introducing a Git server as the new version control system at my customer. I’ve already used GitHub and BitBucket in the past, which work like a charm with tools like GitHub for Windows and SourceTree. Because I’m used to these tools, I really wanted to use them at my day job as well.

Because we chose to use SSH keys as a validation mechanism, I had to create one. Once you know how to do this it’s quite easy, but until two days ago I didn’t have a clue.

Lucky for me there’s a nice tutorial on how to create SSH keys on the GitHub help pages. Atlassian has also provided an extensive help document with a couple of steps. In retrospect I think the Atlassian help page is the most useful when working with local Git servers. However, these help documents don’t take the usage of SourceTree into account, so you will need some extra steps to get everything working.

For future reference, I’ll describe the steps I had to take below.

First thing you want to do is install Git, Git Extensions and, if you haven’t already, SourceTree.

After doing so you can start Git Bash from the Windows context menu in Windows Explorer.


By default your SSH keys are stored in the ~/.ssh directory. For Windows users, the ~ sign means your home directory, so C:\Users\Jan\.ssh.
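If you’re not sure whether you already have keys in place, you can list the contents of this directory first:

ls -al ~/.ssh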

To create a new SSH key you can type the following in Git Bash.

ssh-keygen -t rsa -C "your_email@example.com"

After running this command, the tool will ask you for a filename to save the SSH key in. I didn’t go with the default name, because I thought it would be wise to use a different SSH key for every customer I’m working with, also because I will probably use a different e-mail address per customer. So I went for id_rsa_customer.
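If you already know the filename up front, you can skip the prompt and pass it directly with the -f flag:

ssh-keygen -t rsa -C "your_email@example.com" -f ~/.ssh/id_rsa_customer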

The tool will also prompt you to enter a passphrase. If you want to be secure, do provide one here. I didn’t (it also accepts blanks), because you have to provide this passphrase every time you want to do something with the server (pull, push) via Git Bash. This isn’t very secure, so I won’t advise doing the same.
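If you change your mind later on, you don’t need to generate a new key; the -p flag of ssh-keygen lets you add or change the passphrase of an existing key:

ssh-keygen -p -f ~/.ssh/id_rsa_customer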

Your new SSH private key will now be created in the .ssh folder on your system.

Now you have to add this new SSH key to the SSH Agent of your system. In Git Bash you can do this by executing the following commands.

ssh-agent -s
ssh-add ~/.ssh/id_rsa_customer

This works some of the time, but not always. The next time I started Git Bash I received an error when adding the key to the SSH Agent: “Could not open a connection to your authentication agent.”

A Stack Overflow post helped me out here. If this also happens to you, you have to execute the following:

eval `ssh-agent -s`
ssh-add ~/.ssh/id_rsa_customer

This will make sure the SSH agent is started. The answer on the Stack Overflow post also contains a couple of links which provide some more in-depth detail about this issue.
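The reason the eval variant works is that ssh-agent doesn’t change your current shell by itself; it only prints the shell commands which point your session to the agent, and eval executes them. The output looks something like this (the socket path and PIDs will differ on your machine):

SSH_AUTH_SOCK=/tmp/ssh-abc12345/agent.1234; export SSH_AUTH_SOCK;
SSH_AGENT_PID=1235; export SSH_AGENT_PID;
echo Agent pid 1235;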

Once you are done setting up your private key and the SSH agent, it’s time to get the public key which you will be able to add on your account on the Git server.

The public key is saved in a file in the same location as the private key, but with the extension .pub added to it. In my case it’s id_rsa_customer.pub. Copy this key to the clipboard like so:

clip < ~/.ssh/id_rsa_customer.pub

Or open it in your favorite text editor and copy it from there. Now add this key to your profile on the Git server and you are set to start using Git via Git Bash.
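To verify the key is picked up before cloning anything, you can test the connection from Git Bash. The hostname below is a made-up example; use your own Git server:

ssh -T git@your.gitserver.local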

The next time you start Git Bash you will have to add the private key to the SSH Agent again. This isn’t very user friendly, but there’s a way around it. The Atlassian help page (step 5) provides a script which can help you out.

Create (or modify) a .bashrc file in your user directory (~/ or C:\Users\[username]\).

Add the following to the file via your favorite text editor.

SSH_ENV=$HOME/.ssh/environment
# start the ssh-agent
function start_agent {
    echo "Initializing new SSH agent..."
    # spawn ssh-agent
    /usr/bin/ssh-agent | sed 's/^echo/#echo/' > "${SSH_ENV}"
    echo succeeded
    chmod 600 "${SSH_ENV}"
    . "${SSH_ENV}" > /dev/null
    /usr/bin/ssh-add ~/.ssh/id_rsa_customer
}

if [ -f "${SSH_ENV}" ]; then
     . "${SSH_ENV}" > /dev/null
     ps -ef | grep ${SSH_AGENT_PID} | grep ssh-agent$ > /dev/null || {
        start_agent;
    }
else
    start_agent;
fi

I don’t really understand everything that happens in this script, but I do know it automatically starts the SSH Agent and registers the provided SSH key when I start Git Bash. If you want to ‘reset’ these settings, just delete the .ssh/environment file and the script will create it once again.

We still haven’t talked about SourceTree yet. That’s because I first wanted to have Git Bash set up to work properly.

The SSH keys we have created with Git Bash don’t work with SourceTree out of the box; you’ll have to convert them first. Choose Import SSH Keys in the Tools menu.


A PuTTY Key Generator screen will pop up.


Load your existing private key into it and save it again with the Save private key button. Make sure not to overwrite the original file, because Git Bash still uses that one. Just save it as something like id_rsa_customer.ppk.

Now you are able to add this ppk SSH key to SourceTree.

In SourceTree you’ll have to start the SSH Agent before using SSH keys.


You’ll notice a small icon appears in your taskbar.


Press the View Keys option and a small window will appear. You probably won’t have any keys added yet, so press the Add Key option and choose your newly created ppk key file.

If everything happened successfully, you are now able to connect to your Git server using both Git Bash and SourceTree.

After closing and restarting SourceTree, the SSH Agent is stopped and your key isn’t added automatically. Lucky for us developers, Atlassian has added a nice option for this. By filling out the SSH Client Configuration in the Options screen you can have SourceTree automatically start the SSH Agent and load your default SSH key.


Now you are all set to start working with Git.

While creating the PowerShell scripts for automatic deployment of the project’s Azure environment I discovered there are multiple Azure PowerShell modules.

When you want to manage a single resource, such as storage accounts, websites, databases, virtual machines, and media services, you need the (default) Azure module. However, when you need to manage resource groups, you will need the AzureResourceManager module.

This is useful information if you want to deploy new Azure websites with a specific hosting plan, like Basic or Standard. To create such websites, the Get-AzureResourceGroup command is necessary. If you use PowerShell ISE you will notice this command isn’t available by default. In order to make it available, run the following:

Switch-AzureMode -Name AzureResourceManager

Doing so will activate the AzureResourceManager module and you will have a couple of different commands available.

If you want to see which commands are available within this module, run this command:

Get-Command -Module AzureResourceManager | Get-Help | Format-Table Name, Synopsis

Switching back to the ‘normal’ Azure module is also very easy. You just switch the AzureMode back again.

Switch-AzureMode -Name AzureServiceManagement

After switching back, all your normal commands are back again.

Keep in mind: if you need both modules, you will need to switch between the AzureModes within your script as well!
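A minimal sketch of what that can look like; the resource group name is made up, and New-AzureResourceGroup is only available in AzureResourceManager mode:

# Resource group work happens in AzureResourceManager mode.
Switch-AzureMode -Name AzureResourceManager
New-AzureResourceGroup -Name "MyResourceGroup" -Location "West Europe"

# Switch back for the 'classic' service management cmdlets.
Switch-AzureMode -Name AzureServiceManagement
Get-AzureSubscription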

There are quite a few Azure cmdlets made available by Microsoft. All of this sweetness can be installed on your system via the Web Platform Installer. After installing these modules you can start managing your Azure subscription in PowerShell scripts.
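To check whether the installation succeeded, you can ask PowerShell which Azure modules it can find. A quick sketch:

Get-Module -ListAvailable Azure* | Format-Table Name, Version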

Most of the stuff for managing your Azure subscription is implemented in these Azure cmdlets. One of the things which isn’t implemented (yet) is managing the Service Bus namespaces in your subscription. It is possible to create, delete and get a Service Bus namespace with the New-AzureSBNamespace, Remove-AzureSBNamespace and Get-AzureSBNamespace cmdlets, but that’s all you get. As you will probably understand, this isn’t enough if you want to deploy your complete environment via a PowerShell script.

Luckily for us, we have the ability to use all of the .NET libraries and assemblies on the system. When you search online you will probably find some articles describing how to create Service Bus queues in C# by using the NamespaceManager class. I’ve written some PowerShell which uses this class to create queues in your subscription.

#The NamespaceManager class lives in the Microsoft.ServiceBus assembly, so load it first.
#The path below is an assumption; point it to the Microsoft.ServiceBus.dll on your own system.
Add-Type -Path "..\packages\Microsoft.ServiceBus.dll"

#First, create a new service bus namespace. This doesn't return the newly created object.
New-AzureSBNamespace -Name $servicebusNamespace -Location $locationWestEUDataCenter -CreateACSNamespace $true
#Get the newly created service bus namespace, so we can do stuff with the information.
$azureServicebus = Get-AzureSBNamespace -Name $servicebusNamespace

#We need a token provider for proper credentials.
$tokenProvider = [Microsoft.ServiceBus.TokenProvider]::CreateSharedSecretTokenProvider("owner", $azureServicebus.DefaultKey)
#The uri of the namespace.
$namespaceUri = [Microsoft.ServiceBus.ServiceBusEnvironment]::CreateServiceUri("sb", $servicebusNamespace, "")
#Now we can finally create a NamespaceManager which has the power to create new queues.
$namespaceManager = New-Object Microsoft.ServiceBus.NamespaceManager $namespaceUri,$tokenProvider

Write-Host "Creating the queues" -ForegroundColor Green -BackgroundColor Black
#Creating the queues should work by now.
$namespaceManager.CreateQueue($nameOfTheServiceBusQueue)

If you want to start over, you can just delete the complete namespace and run the script again. This can be done with the following command.

Remove-AzureSBNamespace -Name $servicebusNamespace -Force

This script works on my machine, but you do need to import your subscription first. How to do this is explained all over the web, but I’ll add it here for reference.

Adding your subscription can be done with the commands below.

#This will download the settings file.
Get-AzurePublishSettingsFile
#This will import the downloaded settings file.
Import-AzurePublishSettingsFile -PublishSettingsFile "..\theDownloadedSettingsFile.publishsettings"

At this moment the scripts above are working properly. When, or if, Microsoft publishes new cmdlets to manage the Service Bus, I would recommend using those, as they are probably a lot safer compared to self-made scripts.

The project I’m working on at the moment has a lot of analytics data. This means there are a lot of inserts and updates in the database, and queries have to be fast! At the moment all of this is hosted on a single MS SQL Server, which does a pretty decent job. Still, this seems like a perfect scenario to introduce a noSQL database, especially as we are migrating to the cloud to improve performance of the application as a whole.

After having reviewed a couple of noSQL databases, our weapon of choice became MongoDB. It’s popular, fast and there are a couple of providers which offer it as a hosted solution.
While reviewing MongoDB I discovered you not only need to change your ‘schema’, but also the way you query the database. I spent quite some hours finding out why the performance was so incredibly slow, so it seems like a good idea to share my findings.

To make life easier, the most important objects for setting up a connection to a MongoDB store are thread safe. This is great, as you don’t have to set up a connection to MongoDB every time you want to use it. In my testing code I’ve set up some static properties to keep the connection alive.

//A sketch of the surrounding class; the static properties are shared by all instances.
public class MongoDal
{
	private static MongoDB.Driver.MongoClient Client { get; set; }
	private static string DatabaseName { get; set; }
	private static MongoDB.Driver.MongoServer MongoServer { get; set; }
	private static MongoDB.Driver.MongoDatabase MongoDatabase { get; set; }

	public MongoDal()
	{
		var connectionString = ConfigurationManager.AppSettings["MongoConnectionString"];
		if (MongoDal.Client == null)
		{
			MongoDal.Client = new MongoDB.Driver.MongoClient(connectionString);
		}
		if (MongoDal.DatabaseName == null)
		{
			MongoDal.DatabaseName = ConfigurationManager.AppSettings["MongoDatabase"];
		}
		if (MongoDal.MongoServer == null)
		{
			MongoDal.MongoServer = Client.GetServer();
		}
		if (MongoDal.MongoDatabase == null)
		{
			MongoDal.MongoDatabase = MongoServer.GetDatabase(MongoDal.DatabaseName);
		}
	}
}

Keep in mind, all the code shown in this post is used for testing. Don’t use it in a real production scenario as it still needs a lot of tweaking and tuning.

Coming from a traditional SQL background, I figured it would be a good idea to get hold of a collection and run a query on it. With a little help from the MongoDB documentation I figured the method should look a bit like this:

public IQueryable<T> Query<T>(string collection, IEnumerable<Guid> shouldBeIn, string fieldName)
{
	var dataModelCollection = MongoDatabase.GetCollection<T>(collection);
	//GetValue is a small helper of mine which converts the Guids to the BsonValues Query.In expects.
	var query = MongoDB.Driver.Builders.Query.In(fieldName, this.GetValue(shouldBeIn));
	var findResult = dataModelCollection.Find(query);
	return findResult.AsQueryable();
}

Based on the information I could find in the documentation, this seems like the correct way to query a collection, right?

Wrong!

Using this code will give you terrible performance when you actually execute the query (for example with .ToList()). I have written multiple tests using this code and it was much slower than the queries we had defined for SQL. This is strange, as the SQL queries were quite complex and the noSQL query was rather simple.

Some figures: the SQL test took about 14302 milliseconds to execute (running the query 1000 times). Using the above code to query MongoDB took about 144604 milliseconds. That’s 10 times slower! It’s possible my document structure isn’t optimal, but that wouldn’t explain such a big difference in results. Something had to be off.

Having spent several hours discovering my error, I finally found a different way to query MongoDB. Creating a DataContext-wannabe class appeared to be the solution to the performance problems I was facing. Creating such a class is easy; just create a bunch of properties which look like this:

public IQueryable<SomeData> SomeData
{
	get
	{
		return new MongoQueryable<SomeData>(new MongoQueryProvider(MongoDal.MongoDatabase.GetCollection<SomeData>("SomeData")));
	}
}
public IQueryable<AwesomeOtherData> AwesomeOtherData
{
	get
	{
		return new MongoQueryable<AwesomeOtherData>(new MongoQueryProvider(MongoDal.MongoDatabase.GetCollection<AwesomeOtherData>("AwesomeOtherData")));
	}
}
//etc...

Every property corresponds to a collection you want to use in your code. Because every property is an IQueryable<T>, you can run LINQ queries against these properties, so I have changed my testing code to use them. The Query<T> method could now be implemented like this:

mongoDal.SomeData.Where(s => someIdCollection.Contains(s.SomeId));

Keep in mind you can only use LINQ operators which are supported by the MongoDB driver. I discovered the .GroupBy() method doesn’t work on these properties, for example.
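A possible workaround, as an untested sketch on my part: materialize the filtered results first and do the grouping in memory with LINQ to Objects.

//Assumption: pull the filtered documents to the client first, then group in memory.
var grouped = mongoDal.SomeData
	.Where(s => someIdCollection.Contains(s.SomeId))
	.ToList()
	.GroupBy(s => s.SomeId);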

After having implemented this all over my testing code, I ran the MongoDB test again. The results were staggering! The test now took only 5916 milliseconds; that’s less than half the time of the MS SQL test.

Keep in mind, I haven’t changed anything in the MongoDB store. I just changed the way I’m searching through the collections. Apparently it’s not really efficient to query through a MongoCursor. Using a MongoQueryable is probably the best way to run queries on a collection. I have stepped through the MongoDB C# driver code a bit and discovered that when returning the results of a MongoCursor, it’s waiting for server responses most of the time. I haven’t stepped through the MongoQueryable code (yet), but it probably handles data retrieval in a different way.