Category: Windows Azure

Windows Azure: which affinity group or region is best for you?

A nice way to see which Azure affinity group to use (e.g. South America, North America or Asia) is to download the tool below and run checks from where your clients will be based.

http://research.microsoft.com/en-us/downloads/5c8189b9-53aa-4d6a-a086-013d927e15a7/default.aspx

Once you have it installed, add your storage accounts and then get started.

[screenshot: configuring the latency test]

Above, we will test from Sydney, Australia to our UAT environment in America.

Let's click “Run”.

While the test executes, it is a good time to plan a date, make a cup of coffee or write some JScript for your open source projects.

Results:

Sydney to North America

[screenshots: latency test results, Sydney to North America]

Sydney to South East Asia (Singapore)

[screenshots: latency test results, Sydney to South East Asia]

Conclusion

For us, South East Asia was far better (for a web site, download matters more than upload), and the proof was in the pudding when we measured web site response times with MVC MiniProfiler.

However, this is not the ultimate conclusion. These response times will vary depending on the time of day; when Asia is awake and the US is asleep it could be the other way round. So test at different times of day and pick the affinity group or region that is best for you!
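If you want to script this kind of spot check yourself, below is a minimal PowerShell sketch that times a small blob download from each candidate region. The storage URLs are hypothetical placeholders; point them at a tiny test blob in your own accounts, and schedule the script at different times of day to see the variance.

# Time a small download from each candidate region (URLs are placeholders).
$endpoints = @{
    "North America"   = "http://myuatus.blob.core.windows.net/test/ping.txt"
    "South East Asia" = "http://myuatasia.blob.core.windows.net/test/ping.txt"
}
$client = New-Object System.Net.WebClient
foreach ($region in $endpoints.Keys)
{
    # Measure-Command captures the wall-clock time of the download
    $elapsed = Measure-Command { $client.DownloadData($endpoints[$region]) | Out-Null }
    Write-Host ("{0}: {1:N0} ms" -f $region, $elapsed.TotalMilliseconds)
}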


Windows Azure SDK 1.6 – CSX folder output breaking change… again

Background

Automated deployments with SDK 1.6 have been broken with TeamCity:

  • The location of the csx folder in the build output changed.
  • csrun.exe moved to the emulator folder.

Details

When using the MSBuild targets, the location of the csx folder that CSRun.exe needs for automated deployments has changed. What is worse, the old csx folder location is not cleaned up and is still partially populated, so to the untrained eye it looks like it is still valid!

Also note that you will need to change the path to csrun.exe, as it has moved.

Old Location

CloudProject\bin\%BuildConfiguration%\CloudProject.csx

New Location

CloudProject\csx\%BuildConfiguration%

Why does the old location still exist with partial files? Not sure… but it was the cause of my new error when I tried to deploy:

The compute emulator had an error: Can’t locate service descriptions

[screenshot: compute emulator error]

Here is the new location with all the files:

[screenshot: new csx folder with all files]

TeamCity Artefacts fix

Notice the new relative path to the project is \csx\release and not bin\release\MyProject.csx.

[screenshot: TeamCity artifact paths]
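In a build script, the fix-up for SDK 1.6 boils down to two path changes. Here is a rough PowerShell sketch; the emulator path is where SDK 1.6 put csrun.exe on my machine (verify it on yours), and the project names are placeholders:

$buildConfiguration = "Release"
$oldCsx = "CloudProject\bin\$buildConfiguration\CloudProject.csx"  # pre-1.6 output, now stale
$newCsx = "CloudProject\csx\$buildConfiguration"                   # SDK 1.6 output
$csrun  = "C:\Program Files\Windows Azure Emulator\emulator\csrun.exe"

# Delete the partially populated old folder so nobody deploys from it by mistake
if (Test-Path $oldCsx) { Remove-Item $oldCsx -Recurse -Force }

& $csrun "/run:$newCsx;CloudProject\ServiceConfiguration.cscfg"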

Summary

This is not the first time that we have had breaking changes. In 1.5, a lot of the MSBuild target names changed; they were fine the way they were. Sometimes I just do not understand changes that did not really need to be made.

So, can anyone explain why the folder that used to be at bin\MyProjectName.csx was moved a directory level up from bin and renamed to just csx? It seems to be a change we could really do without, or is there some grand scheme that will make it so exciting in the future… who knows?

Automating Windows Azure Deployments leveraging TeamCity and PowerShell

Hi Guys,

Introduction

We will cover:

  • Overview to configure multiple build projects on TeamCity
  • Configure one of the build projects to deploy to the Azure Cloud
  • Automated deployments to the Azure Cloud for Web and Worker Roles
  • Leverage the MSBuild target templates to automate generation of Azure package files
  • An alternative solution for generating the Azure package files
  • Using configuration transformations to manage settings e.g. UAT, Dev, Production

A colleague of mine, Tatham Oddie, and I currently use TeamCity to automatically deploy our Azure/MVC3/Neo4j based project to the Azure cloud. Let's see how this can be done with relative ease, fully automated. The focus here is on PowerShell scripts that use the Cerebrata Azure Management Cmdlets, which can be found here: http://www.cerebrata.com/Products/AzureManagementCmdlets/Default.aspx

The PowerShell script included here will automatically undeploy and redeploy your Azure service, and will even wait until all the role instances are in the ready state.

I will leave you to check those cmdlets out; they are worth every penny.

Now let's see how we get the deployment working.

The basic idea is that you have a continuous integration build configured on the TeamCity build server. You then configure the CI build to generate artifacts, which are the build outputs that other build projects can consume; for example, you can take the artifacts from the CI build and run functional or integration test builds that are totally separate from the CI build. The idea is that your functional and integration tests NEVER interfere with the CI build and its unit tests, keeping CI builds fast and efficient.

Prerequisites on Build Server

  • TeamCity Professional Version 6.5.1
  • Cloud Subscription Certificate with Private key is imported into the User Certificate Store for the Team City service account
  • Cerebrata CMDLETS

TeamCity – Continuous Integration Build Project

OK, let's take a quick look at my CI build that spits out the Azure packages.

[screenshot: CI build overview]

As we can see above, the CI build creates an Artifact called AzurePackage.

[screenshot: AzurePackage artifact]

The way we generate these artifacts is very easy: in the settings for the CI build project, we set up the artifacts path.

[screenshot: artifact paths setting]

e.g. MyProject\MyProject.Azure\bin\%BuildConfiguration%\Publish => AzurePackage

So, let's look at the build steps to configure.

[screenshot: build steps]

As we can see below, we just say where MSBuild is run from and where the unit test DLLs are.

[screenshot: MSBuild and unit test runner settings]

Cool, now we need to set up the artifacts and configuration.

We just specify that we want a release build.

[screenshot: build configuration parameter]

Ok, now we need to tell our Azure Deployment project to have a dependency on the CI project we configured above.

TeamCity – UAT Deployment Build Project

So let's now check out the UAT Deployment project.

[screenshot: UAT Deployment project overview]

This project has a dependency on the CI build, and we will configure all the build parameters so it can connect to your Azure storage and service for automatic deployments. Once we are done here, we will have a look at the PowerShell script we use to automatically deploy to the cloud; the script supports un-deploying an existing deployment slot, with retry attempts, before deploying a new one.

OK, let's check the following for the UAT deployment project.

[screenshots: UAT deployment build step configuration]

The above screenshot shows the command that executes the PowerShell script; the parameters (%whatever%) resolve from the build parameters in step 6 of the screenshot above.

Here is the command in copy/paste friendly form. Of course, if you are using some other database then you do not need the Neo4j stuff.

-AzureAccountName “%AzureAccountName%” -AzureServiceName “%AzureServiceName%” -AzureDeploymentSlot “%AzureDeploymentSlot%” -AzureAccountKey “%AzureAccountKey%” -AzureSubscriptionId “%AzureSubscriptionId%” -AzureCertificateThumbprint “%AzureCertificateThumbprint%” -PackageSource “%AzurePackageDependencyPath%MyProject.Azure.cspkg” -ConfigSource “%AzurePackageDependencyPath%%ConfigFileName%” -DeploymentName “%build.number%-%build.vcs.number%” -Neo4jBlobName “%Neo4jBlobName%” -Neo4jZippedBinaryFileHttpSource “%Neo4jZippedBinaryFileHttpSource%”

This is the input for a deploy-package.cmd file, which is in our source repository.

[screenshot: deploy-package.cmd in the source repository]

Now, we also need to tell the deployment project to use the artifact from our CI project, so we set up an artifact dependency as shown below in the dependencies section. Also notice how we use a wildcard to get all files from AzurePackage (AzurePackage/**); these will be the package files.

[screenshot: artifact dependencies]

Notice above that I also have a snapshot dependency; this forces the UAT deployment to use the SAME source code that the CI build project used.

So, the parameters are as follows.

[screenshot: build parameters]

PowerShell Deployment Scripts

The deployment scripts consist of three files; remember, I am assuming you have installed the Cerebrata Azure Management Cmdlets.

OK, let's look at the Deploy-Package.cmd file. I would like to express my gratitude to Jason Stangroome (http://blog.codeassassin.com) for this.

Jason wrote: “This tiny proxy script just writes a temporary PowerShell script containing all the arguments you’re trying to pass to let PowerShell interpret them and avoid getting them messed up by the Win32 native command line parser.”

@echo off
setlocal
set tempscript=%temp%\%~n0.%random%.ps1
echo $ErrorActionPreference="Stop" >"%tempscript%"
echo ^& "%~dpn0.ps1" %* >>"%tempscript%"
powershell.exe -command "& \"%tempscript%\""
set errlvl=%ERRORLEVEL%
del "%tempscript%"
exit /b %errlvl%

OK, now here is the PowerShell code, Deploy-Package.ps1. I will leave you to read through what it does. In summary, it demonstrates:

  • Un-Deploying a service deployment slot
  • Deploying a service deployment slot
  • Using the certificate store to retrieve the certificate for service connections via a certificate thumbprint – the user certificate store of the service account that TeamCity runs under
  • Uploading Blobs
  • Downloading Blobs
  • Waiting until the new deployment is in a ready state

#requires -version 2.0
param (
	[parameter(Mandatory=$true)] [string]$AzureAccountName,
	[parameter(Mandatory=$true)] [string]$AzureServiceName,
	[parameter(Mandatory=$true)] [string]$AzureDeploymentSlot,
	[parameter(Mandatory=$true)] [string]$AzureAccountKey,
	[parameter(Mandatory=$true)] [string]$AzureSubscriptionId,
	[parameter(Mandatory=$true)] [string]$AzureCertificateThumbprint,
	[parameter(Mandatory=$true)] [string]$PackageSource,
	[parameter(Mandatory=$true)] [string]$ConfigSource,
	[parameter(Mandatory=$true)] [string]$DeploymentName,
	[parameter(Mandatory=$true)] [string]$Neo4jZippedBinaryFileHttpSource,
	[parameter(Mandatory=$true)] [string]$Neo4jBlobName
)

$ErrorActionPreference = "Stop"

if ((Get-PSSnapin -Registered -Name AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue) -eq $null)
{
	throw "AzureManagementCmdletsSnapIn missing. Install them from Https://www.cerebrata.com/Products/AzureManagementCmdlets/Download.aspx"
}

Add-PSSnapin AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue

function AddBlobContainerIfNotExists ($blobContainerName)
{
	Write-Verbose "Finding blob container $blobContainerName"
	$containers = Get-BlobContainer -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	$deploymentsContainer = $containers | Where-Object { $_.BlobContainerName -eq $blobContainerName }

	if ($deploymentsContainer -eq $null)
	{
		Write-Verbose  "Container $blobContainerName doesn't exist, creating it"
		New-BlobContainer $blobContainerName -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	}
	else
	{
		Write-Verbose  "Found blob container $blobContainerName"
	}
}

function UploadBlobIfNotExists{param ([string]$container, [string]$blobName, [string]$fileSource)

	Write-Verbose "Finding blob $container\$blobName"
	$blob = Get-Blob -BlobContainerName $container -BlobPrefix $blobName -AccountName $AzureAccountName -AccountKey $AzureAccountKey

	if ($blob -eq $null)
	{
		Write-Verbose "Uploading blob $blobName to $container/$blobName"
		Import-File -File $fileSource -BlobName $blobName -BlobContainerName $container -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	}
	else
	{
		Write-Verbose "Found blob $container\$blobName"
	}
}

function CheckIfDeploymentIsDeleted
{
	$triesElapsed = 0
	$maximumRetries = 10
	$waitInterval = [System.TimeSpan]::FromSeconds(30)
	Do
	{
		$triesElapsed+=1
		[System.Threading.Thread]::Sleep($waitInterval)
		Write-Verbose "Checking if deployment is deleted, current retry is $triesElapsed/$maximumRetries"
		$deploymentInstance = Get-Deployment `
			-ServiceName $AzureServiceName `
			-Slot $AzureDeploymentSlot `
			-SubscriptionId $AzureSubscriptionId `
			-Certificate $certificate `
			-ErrorAction SilentlyContinue

		if($deploymentInstance -eq $null)
		{
			Write-Verbose "Deployment is now deleted"
			break
		}

		if($triesElapsed -ge $maximumRetries)
		{
			throw "Checking if deployment deleted has been running longer than 5 minutes, it seems the delployment is not deleting, giving up this step."
		}
	}
	While($triesElapsed -le $maximumRetries)
}

function WaitUntilAllRoleInstancesAreReady
{
	$triesElapsed = 0
	$maximumRetries = 60
	$waitInterval = [System.TimeSpan]::FromSeconds(60)
	Do
	{
		$triesElapsed+=1
		[System.Threading.Thread]::Sleep($waitInterval)
		Write-Verbose "Checking if all role instances are ready, current retry is $triesElapsed/$maximumRetries"
		$roleInstances = Get-RoleInstanceStatus `
			-ServiceName $AzureServiceName `
			-Slot $AzureDeploymentSlot `
			-SubscriptionId $AzureSubscriptionId `
			-Certificate $certificate `
			-ErrorAction SilentlyContinue
		$roleInstancesThatAreNotReady = $roleInstances | Where-Object { $_.InstanceStatus -ne "Ready" }

		if ($roleInstances -ne $null -and
			$roleInstancesThatAreNotReady -eq $null)
		{
			Write-Verbose "All role instances are now ready"
			break
		}

		if ($triesElapsed -ge $maximumRetries)
		{
			throw "Checking if all roles instances are ready for more than one hour, giving up..."
		}
	}
	While($triesElapsed -le $maximumRetries)
}

function DownloadNeo4jBinaryZipFileAndUploadToBlobStorageIfNotExists{param ([string]$blobContainerName, [string]$blobName, [string]$HttpSourceFile)
	Write-Verbose "Finding blob $blobContainerName\$blobName"
	$blobs = Get-Blob -BlobContainerName $blobContainerName -ListAll -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	$blob = $blobs | findstr $blobName

	if ($blob -eq $null)
	{
	    Write-Verbose "Neo4j binary does not exist in blob storage. "
	    Write-Verbose "Downloading file $HttpSourceFile..."
		$temporaryneo4jFile = [System.IO.Path]::GetTempFileName()
		$WebClient = New-Object -TypeName System.Net.WebClient
		$WebClient.DownloadFile($HttpSourceFile, $temporaryneo4jFile)
		UploadBlobIfNotExists $blobContainerName $blobName $temporaryneo4jFile
	}
}

Write-Verbose "Retrieving management certificate"
$certificate = Get-ChildItem -Path "cert:\CurrentUser\My\$AzureCertificateThumbprint" -ErrorAction SilentlyContinue
if ($certificate -eq $null)
{
	throw "Couldn't find the Azure management certificate in the store"
}
if (-not $certificate.HasPrivateKey)
{
	throw "The private key for the Azure management certificate is not available in the certificate store"
}

Write-Verbose "Deleting Deployment"
Remove-Deployment `
	-ServiceName $AzureServiceName `
	-Slot $AzureDeploymentSlot `
	-SubscriptionId $AzureSubscriptionId `
	-Certificate $certificate `
	-ErrorAction SilentlyContinue
Write-Verbose "Sent Delete Deployment Async, will check back later to see if it is deleted"

$deploymentsContainerName = "deployments"
$neo4jContainerName = "neo4j"

AddBlobContainerIfNotExists $deploymentsContainerName
AddBlobContainerIfNotExists $neo4jContainerName

$deploymentBlobName = "$DeploymentName.cspkg"

DownloadNeo4jBinaryZipFileAndUploadToBlobStorageIfNotExists $neo4jContainerName $Neo4jBlobName $Neo4jZippedBinaryFileHttpSource

Write-Verbose "Azure Service Information:"
Write-Verbose "Service Name: $AzureServiceName"
Write-Verbose "Slot: $AzureDeploymentSlot"
Write-Verbose "Package Location: $PackageSource"
Write-Verbose "Config File Location: $ConfigSource"
Write-Verbose "Label: $DeploymentName"
Write-Verbose "DeploymentName: $DeploymentName"
Write-Verbose "SubscriptionId: $AzureSubscriptionId"
Write-Verbose "Certificate: $certificate"

CheckIfDeploymentIsDeleted

Write-Verbose "Starting Deployment"
New-Deployment `
	-ServiceName $AzureServiceName `
	-Slot $AzureDeploymentSlot `
	-PackageLocation $PackageSource `
	-ConfigFileLocation $ConfigSource `
	-Label $DeploymentName `
	-DeploymentName $DeploymentName `
	-SubscriptionId $AzureSubscriptionId `
	-Certificate $certificate

WaitUntilAllRoleInstancesAreReady

Write-Verbose "Completed Deployment"

Automating the Cloud Package Files without using CSPack and CSRun explicitly

We need to edit the cloud project file so that MSBuild can create the cloud package files: it will then automatically run cspack for you, and the output can be consumed as artifacts and hence by other build projects. This bakes package generation into the MSBuild process without explicitly using cspack.exe and csrun.exe, resulting in fewer scripts; otherwise you would need a separate PowerShell script just to package the cloud project files.

Below are the changes for the .ccproj file of the cloud project. Notice the condition: we generate these package files ONLY if the build runs outside of Visual Studio, which keeps the packages from being created on every build and keeps the development build fast. For the condition below to fire, you will need to build the project from the command line using MSBuild.

Here are the config entries for the project file.

  <PropertyGroup>
    <CloudExtensionsDir Condition=" '$(CloudExtensionsDir)' == '' ">$(MSBuildExtensionsPath)\Microsoft\Cloud Service\1.0\Visual Studio 10.0\</CloudExtensionsDir>
  </PropertyGroup>
  <Import Project="$(CloudExtensionsDir)Microsoft.CloudService.targets" />
  <Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" />
  <Target Name="AzureDeploy" AfterTargets="Build" DependsOnTargets="CorePublish" Condition="'$(BuildingInsideVisualStudio)'!='True'">
  </Target>

e.g.

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe MyProject.sln /p:Configuration=Release

[screenshot: MSBuild command-line output]

Configuration Transformations

You can also leverage configuration transformations so that you can have configurations for each environment. This is discussed here:

http://blog.alexlambert.com/2010/05/using-visual-studio-configuration.html

In a nutshell, you can have something like this in place, which means you can then have separate deployment config files, e.g.

ServiceConfiguration.cscfg

ServiceConfiguration.uat.cscfg

ServiceConfiguration.prod.cscfg

Just use the following config in the .ccproj file.

 <Target Name="ValidateServiceFiles"
          Inputs="@(EnvironmentConfiguration);@(EnvironmentConfiguration->'%(BaseConfiguration)')"
          Outputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')">
    <Message Text="ValidateServiceFiles: Transforming %(EnvironmentConfiguration.BaseConfiguration) to %(EnvironmentConfiguration.Identity).tmp via %(EnvironmentConfiguration.Identity)" />
    <TransformXml Source="%(EnvironmentConfiguration.BaseConfiguration)" Transform="%(EnvironmentConfiguration.Identity)"
     Destination="%(EnvironmentConfiguration.Identity).tmp" />

    <Message Text="ValidateServiceFiles: Transformation complete; starting validation" />
    <ValidateServiceFiles ServiceDefinitionFile="@(ServiceDefinition)" ServiceConfigurationFile="%(EnvironmentConfiguration.Identity).tmp" />

    <Message Text="ValidateServiceFiles: Validation complete; renaming temporary file" />
    <Move SourceFiles="%(EnvironmentConfiguration.Identity).tmp" DestinationFiles="%(EnvironmentConfiguration.Identity).transformed.cscfg" />
  </Target>
  <Target Name="MoveTransformedEnvironmentConfigurationXml" AfterTargets="AfterPackageComputeService"
          Inputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')"
          Outputs="@(EnvironmentConfiguration->'$(OutDir)Publish\%(filename).cscfg')">
    <Move SourceFiles="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')" DestinationFiles="@(EnvironmentConfiguration->'$(OutDir)Publish\%(filename).cscfg')" />
  </Target>

Here is a sample ServiceConfiguration.uat.cscfg that will then leverage the transformations. Note the transformation for the web and worker role sections. Our worker role is Neo4jServerHost and the web role is just called Web.

<?xml version="1.0"?>
<sc:ServiceConfiguration
    xmlns:sc="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sc:Role name="Neo4jServerHost" xdt:Locator="Match(name)">
    <sc:ConfigurationSettings>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Storage connection string" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Drive connection string" value="DefaultEndpointsProtocol=http;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Neo4j DBDrive override Path" value=""/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="UniqueIdSynchronizationStoreConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
    </sc:ConfigurationSettings>
  </sc:Role>
  <sc:Role name="Web" xdt:Locator="Match(name)">
    <sc:ConfigurationSettings>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="UniqueIdSynchronizationStoreConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
    </sc:ConfigurationSettings>
  </sc:Role>
</sc:ServiceConfiguration>

Manually executing the script for testing

Prerequisites:

  • You will need to install the Cerebrata Azure Management CMDLETS from: https://www.cerebrata.com/Products/AzureManagementCmdlets/Download.aspx
  • If you are running 64 bit version, you will need to follow the readme file instructions contained with the AzureManagementCmdlets, as it requires manual copying of files. If you followed the default install, this readme will be in C:\Program Files\Cerebrata\Azure Management Cmdlets\readme.pdf
  • You will need to install the Certificate and Private Key (Which must be marked as exportable) to your User Certificate Store. This file will have an extension of .pfx. Use the Certificate Management Snap-In, for User Account Store. The certificate should be installed in the personal folder.
  • Once the certificate is installed, you should note the certificate thumbprint, as this is used as one of the parameters when executing the PowerShell script. Ensure you remove all the spaces from the thumbprint when using it in the script! (A quick way to list thumbprints is shown after this list.)
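If you are not sure which thumbprint to use, a one-liner like this (run under the TeamCity service account) lists what is in the personal store, with the spaces already stripped:

# List thumbprints and subjects from the current user's personal store
Get-ChildItem cert:\CurrentUser\My |
    ForEach-Object { "{0}  {1}" -f ($_.Thumbprint -replace " ", ""), $_.Subject }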

1) First up, you’ll need to make your own “MyProject.Azure.cspkg” file. To do this, run:

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe MyProject.sln /p:Configuration=Release

(Adjust paths as required.)

You’ll now find a package waiting for you at “C:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg”.
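A quick sanity check that the package really exists before moving on (adjust the path to your checkout):

Test-Path "C:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg"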

2) Make sure you have the required management certificate installed on your machine (including the private key).

3) Now you’re ready to run the deployment script.

It requires quite a lot of parameters. The easiest way to find them is just to copy them from the last output log on TeamCity.

The file you will need to manually execute is Deploy-Package.ps1.

The Deploy-Package.ps1 file has input parameters that need to be supplied. Below is the list of parameters and description.

Note: These values can change in the future, so ensure you do not rely on this example below.

AzureAccountName: The Windows Azure Account Name e.g. MyProjectUAT

AzureServiceName: The Windows Azure Service Name e.g. MyProjectUAT

AzureDeploymentSlot: Production or Staging e.g. Production

AzureAccountKey: The Azure Account Key: e.g. youraccountkey==

AzureSubscriptionId: The Azure Subscription Id e.g. yourazuresubscriptionId

AzureCertificateThumbprint: The certificate thumbprint you noted down when importing the pfx file e.g. YourCertificateThumbprintWithNoWhiteSpaces

PackageSource: Location of the .cspkg file e.g. C:\Code\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg

ConfigSource: Location of the Azure configuration files .cscfg e.g. C:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\ServiceConfiguration.uat.cscfg

DeploymentName: This can be a friendly name of the deployment e.g. local-uat-deploy-test

Neo4jBlobName: The name of the blob file containing the Neo4j binaries in zip format e.g. neo4j-community-1.4.M04-windows.zip

Neo4jZippedBinaryFileHttpSource: The http location of the Neo4j zipped binary files e.g. https://mydownloads.com/mydownloads/neo4j-community-1.4.M04-windows.zip?dl=1

-Verbose: An additional parameter that gives verbose output, which is useful when developing and testing the script; just append -Verbose to the end of the command.

Below is an example executed on my machine; it will be different on yours, so use it as a guideline only:

<code title=Sample Deployment Execution>

.\Deploy-Package.ps1 -AzureAccountName MyProjectUAT `

-AzureServiceName MyProjectUAT `

-AzureDeploymentSlot Production `

-AzureAccountKey youraccountkey== `

-AzureSubscriptionId yoursubscriptionid `

-AzureCertificateThumbprint yourcertificatethumbprint `

-PackageSource “c:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg” `

-ConfigSource “c:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\ServiceConfiguration.uat.cscfg” `

-DeploymentName local-uat-deploy-test -Neo4jBlobName neo4j-community-1.4.1-windows.zip `

-Neo4jZippedBinaryFileHttpSource https://mydownloads.com/mydownloads/neo4j-community-1.4.1-windows.zip?dl=1 -Verbose

</code>

Note: When running the PowerShell command with the 64 bit version of the scripts, ensure you are running the PowerShell version that you fixed up per the readme file from Cerebrata; do not rely on the default shortcut links in the Start menu!
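A quick way to confirm which flavour of PowerShell you are in:

# 8 means a 64-bit PowerShell process, 4 means 32-bit (WOW64)
[IntPtr]::Size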

Summary

Well, I hope this helps you automate Azure deployments to the cloud; it is a great way to keep UAT happy with agile deployments that meet the goals of every sprint.

If you do not like the way we generate the package files above, you can choose to use CSRun and CSPack explicitly. I have already prepared this script; below is the code for you to use.

#requires -version 2.0
param (
	[parameter(Mandatory=$false)] [string]$ArtifactDownloadLocation
)

$ErrorActionPreference = "Stop"

$installPath= Join-Path $ArtifactDownloadLocation "..\AzurePackage"
$azureToolsPackageSDKPath="c:\Program Files\Windows Azure SDK\v1.4\bin\cspack.exe"
$azureToolsDeploySDKPath="c:\Program Files\Windows Azure SDK\v1.4\bin\csrun.exe"

$csDefinitionFile="..\..\Neo4j.Azure.Server\ServiceDefinition.csdef"
$csConfigurationFile="..\..\Neo4j.Azure.Server\ServiceConfiguration.cscfg"

$webRolePropertiesFile = ".\WebRoleProperties.txt"
$workerRolePropertiesFile=".\WorkerRoleProperties.txt"

$csOutputPackage="$installPath\Neo4j.Azure.Server.csx"
$serviceConfigurationFile = "$installPath\ServiceConfiguration.cscfg"

$webRoleName="Web"
$webRoleBinaryFolder="..\..\Web"

$workerRoleName="Neo4jServerHost"
$workerRoleBinaryFolder="..\..\Neo4jServerHost\bin\Debug"
$workerRoleEntryPointDLL="Neo4j.Azure.Server.dll"

function StartAzure{
	"Starting Azure development fabric"
	& $azureToolsDeploySDKPath /devFabric:start
	& $azureToolsDeploySDKPath /devStore:start
}

function StopAzure{
	"Shutting down development fabric"
	& $azureToolsDeploySDKPath /devFabric:shutdown
	& $azureToolsDeploySDKPath /devStore:shutdown
}

#Example: cspack Neo4j.Azure.Server\ServiceDefinition.csdef /out:.\Neo4j.Azure.Server.csx /role:$webRoleName;$webRoleName /sites:$webRoleName;$webRoleName;.\$webRoleName /role:Neo4jServerHost;Neo4jServerHost\bin\Debug;Neo4j.Azure.Server.dll /copyOnly /rolePropertiesFile:$webRoleName;WebRoleProperties.txt /rolePropertiesFile:$workerRoleName;WorkerRoleProperties.txt
function PackageAzure()
{
	"Packaging the azure Web and Worker role."
	& $azureToolsPackageSDKPath $csDefinitionFile /out:$csOutputPackage /role:$webRoleName";"$webRoleBinaryFolder /sites:$webRoleName";"$webRoleName";"$webRoleBinaryFolder /role:$workerRoleName";"$workerRoleBinaryFolder";"$workerRoleEntryPointDLL /copyOnly /rolePropertiesFile:$webRoleName";"$webRolePropertiesFile /rolePropertiesFile:$workerRoleName";"$workerRolePropertiesFile
	if (-not $?)
	{
		throw "The packaging process returned an error code."
	}
}

function CopyServiceConfigurationFile()
{
	"Copying service configuration file."
	copy $csConfigurationFile $serviceConfigurationFile
}

#Example: csrun /run:.\Neo4j.Azure.Server.csx;.\Neo4j.Azure.Server\ServiceConfiguration.cscfg /launchbrowser
function DeployAzure{param ([string] $azureCsxPath, [string] $azureConfigPath)
	"Deploying the package"
    & $azureToolsDeploySDKPath $csOutputPackage $serviceConfigurationFile
	if (-not $?)
	{
		throw "The deployment process returned an error code."
	}
}

Write-Host "Beginning deploy and configuration at" (Get-Date)

PackageAzure
StopAzure
StartAzure
CopyServiceConfigurationFile
DeployAzure $csOutputPackage $serviceConfigurationFile

# Give it 60s to boot up neo4j
[System.Threading.Thread]::Sleep(60000)

# Hit the homepage to make sure it's warmed up
(New-Object System.Net.WebClient).DownloadString("http://localhost:8080") | Out-Null

Write-Host "Completed deploy and configuration at" (Get-Date)

Note: if you are using .NET 4.0, which I am sure you all are, you will need to provide the role properties text files for the web role and worker role with these entries.

WorkerRoleProperties.txt
TargetFrameWorkVersion=v4.0
EntryPoint=Neo4j.Azure.Server.dll

WebRoleProperties.txt
TargetFrameWorkVersion=v4.0

Thanks to Tatham Oddie for contributing and coming up with such great ideas for our builds.
Cheers

Romiko

Neo4j and gremlin plugin install guide

29/08/2011: OBSOLETE – Now baked into the Core of Neo4j.

Hi,

I was having some difficulties getting the Gremlin query plugin working correctly with the Neo4j server which we will host on a Windows Azure VM.

Below are the steps to get this working nicely.

Firstly you will of course need to have Neo4j running. Then all we need to do is install the following:

Java JDK – Here is my version
java version “1.6.0_26”
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

JDK is needed to compile the plugin.

Maven 2.2.1 – mvn is used to compile the Gremlin plugin. (There are compilation errors when compiling the snapshot with Maven 3.0.3 at the time of writing, which is why I use 2.2.1.)
http://www.apache.org/dyn/closer.cgi/maven/binaries/apache-maven-2.2.1-bin.zip

Neo4j gremlin plugin
https://github.com/neo4j/gremlin-plugin

I like to set up environment variables for the Neo4j server folder, JAVA_HOME and the Maven location.

[screenshot: environment variables]
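If you would rather script the variables than click through the System dialog, something like this works; the paths are examples from my machine, so adjust them to yours:

# Persist the variables for the current user
[Environment]::SetEnvironmentVariable("JAVA_HOME",  "C:\Program Files\Java\jdk1.6.0_26", "User")
[Environment]::SetEnvironmentVariable("NEO4J_HOME", "C:\neo4j\neo4j-community-1.4",      "User")
[Environment]::SetEnvironmentVariable("M2_HOME",    "C:\apache-maven-2.2.1",             "User")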

Once done we can then compile the plugin and copy it into the Neo4j plugins folder.

mvn clean package
copy target\neo4j-gremlin-plugin-0.1-SNAPSHOT.jar %NEO4J_HOME%\plugins
%NEO4J_HOME%\bin\neo4j.bat restart

Compiled version of the plugin.

[screenshot: compiled plugin jar]

Here we can see the plugin in the folder.

[screenshot: Neo4j plugins folder]

Now, to ensure Neo4j has the plugin, we can execute a curl command to check the extension is installed:

C:\Users\Romiko>curl localhost:7474/db/data/
{
  "relationship_index" : "http://localhost:7474/db/data/index/relationship",
  "node" : "http://localhost:7474/db/data/node",
  "relationship_types" : "http://localhost:7474/db/data/relationship/types",
  "extensions_info" : "http://localhost:7474/db/data/ext",
  "node_index" : "http://localhost:7474/db/data/index/node",
  "reference_node" : "http://localhost:7474/db/data/node/0",
  "extensions" : {
    "GremlinPlugin" : {
      "execute_script" : "http://localhost:7474/db/data/ext/GremlinPlugin/graphd
b/execute_script"
    }
  }
}

As we can see above, the REST result from the server includes the GremlinPlugin extension. In fact, we can now do an HTTP POST Gremlin query to get nodes from the object graph in the database.

e.g.

I want to see if there is a node at the second level, i.e. whether node 1 has an outgoing relationship of type RELATED_TO.

g.v(1).outE('RELATED_TO')

Now we need to URL encode this.

+g.v(1).outE(%27RELATED_TO%27)
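If you do not want to encode by hand, PowerShell can do it for you; a quick sketch (HttpUtility.UrlEncode encodes the quotes but leaves the parentheses alone, matching the form above):

Add-Type -AssemblyName System.Web
[System.Web.HttpUtility]::UrlEncode("g.v(1).outE('RELATED_TO')")
# should print: g.v(1).outE(%27RELATED_TO%27)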

curl -d "script=+g.v(1).outE(%27RELATED_TO%27)" http://localhost:7474/db/data/ext/GremlinPlugin/graphdb/execute_script

As we can see, the output is:

C:\Users\Romiko>curl -d “script=+g.v(1).outE(%27RELATED_TO%27)” http://localhost

:7474/db/data/ext/GremlinPlugin/graphdb/execute_script

[ {

“start” : “http://localhost:7474/db/data/node/1″,

“data” : {

},

“self” : “http://localhost:7474/db/data/relationship/23″,

“property” : “http://localhost:7474/db/data/relationship/23/properties/{key}”,

“properties” : “http://localhost:7474/db/data/relationship/23/properties”,

“type” : “RELATED_TO”,

“extensions” : {

},

“end” : “http://localhost:7474/db/data/node/2″

} ]

We have relationship 23 connecting node 1 to node 2. (Remember the reference node is node 0, then node 1.)

The above result can be confirmed in the Gremlin console, which is baked into Neo4j as of June 2011.

Note: the Gremlin extension that now ships in the Neo4j server\lib folder is not for REST API queries; you still need this plugin!

  • gremlin> g.v(1).outE(‘RELATED_TO’)
  • ==> e[23][1-RELATED_TO->2]
  • gremlin>

Here is a screenshot of the gremlin console now baked into Neo4j.

[screenshots: Gremlin console baked into Neo4j]

Hope this gets you started with Neo4j and the Gremlin plugin query language.

I might look at building a custom IQueryable expression translator, so we can use LINQ to query a Gremlin based API. Might be fun to do, but first I need to learn more about Gremlin and Neo4j.

There is a fluent API for gremlin queries you can leverage as a .Net client:

NuGetPackage:
Source Code at:

Cheers

Azure Blob Storage Helper Class and Shared Access Policy

Hi,

I have created a simple helper class that can be used to upload blobs to a public or private container. It also allows you to grant users temporary access at the blob level for 2 days, which is nice when you want to provide a download link that will expire.

Below are the interface and the class.

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;



namespace Common.Azure
{
    public interface IStorage
    {

        string ContainerThumbnails { get; }
        string ContainerPhotos { get; }

        CloudStorageAccount StorageAccountInfo { get; set; }
        CloudBlobClient BlobClient { get; set; }

        CloudBlob UploadBlob(string blobUri, Stream stream, string containerName, bool isPublic);
        string GetSharedAccessSignatureToDownloadBlob(string blobUri, string containerName, string userName);
    }
}

 

using System;
using System.Configuration;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

namespace Common.Azure
{
    public class Storage : IStorage
    {
        public string ContainerThumbnails
        {
            get { return "photothumbnails"; }
        }
        public string ContainerPhotos
        {
            get { return "photos"; }
        }


        public CloudStorageAccount StorageAccountInfo { get; set; }
        public CloudBlobClient BlobClient { get; set; }

        public Storage()
        {

            CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
            {
                if (RoleEnvironment.IsAvailable)
                    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
                else
                    configSetter(ConfigurationManager.AppSettings[configName]);
            });

            StorageAccountInfo = CloudStorageAccount.FromConfigurationSetting("StorageConnectionString");
            BlobClient = StorageAccountInfo.CreateCloudBlobClient();
        }



        public CloudBlob UploadBlob(string blobUri, Stream stream, string containerName, bool isPublic)
        {
            var container = BlobClient.GetContainerReference(containerName);
            container.CreateIfNotExist();

            if (isPublic)
            {
                var permissions = new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Container
                };
                container.SetPermissions(permissions);
            }
            else
            {
                var permissions = new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Off
                };
                container.SetPermissions(permissions);
            }


            var blob = container.GetBlockBlobReference(blobUri);
            blob.UploadFromStream(stream);
            return blob;
        }

        public string GetSharedAccessSignatureToDownloadBlob(string blobUri, string containerName, string userName)
        {
            var container = BlobClient.GetContainerReference(containerName);
            container.CreateIfNotExist();
            var blob = container.GetBlockBlobReference(blobUri);




              
            var containeraccess= new SharedAccessPolicy();
            containeraccess.Permissions = SharedAccessPermissions.Read;

            var blobaccess = new SharedAccessPolicy
                                 {
                                     SharedAccessExpiryTime = DateTime.UtcNow.AddDays(2)
                                 };

            var perm = new BlobContainerPermissions
                           {
                               PublicAccess = BlobContainerPublicAccessType.Off
                           };
            perm.SharedAccessPolicies.Clear();
            perm.SharedAccessPolicies.Add(userName, containeraccess);

            container.SetPermissions(perm, new BlobRequestOptions());

            return blob.GetSharedAccessSignature(blobaccess, userName);
        }
    }
}

Now, in a MVC 3 controller, I can call the helper class when an order is submitted and removed from the shopping cart:

orderDetails.DownloadLink = storage.GetSharedAccessSignatureToDownloadBlob(photo.Photo_Url,
                                                                          "photos",
                                                                          cartItem.Username);

 

That’s all there is to it; then for the download link, you just render the shared access URL.

   //
        //Get: /Download/
        [HttpGet]
        public ActionResult Download(int photoId)
        {
            var userName = HttpContext.User.Identity.Name;
            OrderDetail order = _storeDb.OrderDetails.Include("Photo").Include("Order").First(x => x.PhotoId == photoId && x.Order.Username == userName);

            if (order == null)
                return View("InvalidPhoto");

            order.DownloadCount += 1;
            _storeDb.SaveChanges();

            string path = order.Photo.Photo_Url + order.DownloadLink;
            return base.Redirect(path);

        }

e.g.

http://myblobs.com/containername/blobname.jpg?se=2011-02-22T01%3A07%3A20Z&sr=b&si=romiko&sig=PsfUXcJtWRoWBvIiz%2FvHoUJnYF2D70%2B3CdlBbn9SiOM%3D
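You can sanity-check a generated link from PowerShell: it should download fine now and be rejected with a 403 once the two-day window passes. The URL below is just the example above:

$sasUrl = "http://myblobs.com/containername/blobname.jpg?se=2011-02-22T01%3A07%3A20Z&sr=b&si=romiko&sig=PsfUXcJtWRoWBvIiz%2FvHoUJnYF2D70%2B3CdlBbn9SiOM%3D"
try   { (New-Object System.Net.WebClient).DownloadData($sasUrl) | Out-Null; "Link is live" }
catch { "Link rejected (expired or bad signature): $($_.Exception.Message)" }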

Using Autofac, MVC3, SQLMembershipProvider and Entity Framework in Windows Azure and SQL Azure

We will cover some configuration issues when deploying MVC3/Razor web apps to the cloud.

MVC Configuration

To get MVC 3/Razor working with Azure, follow the MVC3 assembly reference instructions (see the “MVC3/Razor and Azure SDK 1.3” section below).

Once you have a SQL Azure instance running, you will need to connect to it from SQL Server Management Studio. Ensure you set the default database in the connection options (it defaults to master, so change it to your database name, else it will not connect).

Configuration e.g. Database Connection Strings, SMTP settings etc

When we are developing on a local machine without the development fabric, e.g. F5 from the web project and not from the cloud project, the configuration information will be coming from the web.config or app.config (worker role).

So we need to ensure that ALL cloud configuration information for connection strings and other configuration values is stored at the PACKAGE level, i.e. ServiceConfiguration.cscfg.

However, when you add key/value pairs to the ServiceConfiguration.cscfg you will need to define them in the ServiceConfiguration.csdef file as well, but without the values.

So let's get this set up.

——————————————————————web.config———————————————————————–

  <connectionStrings>
    <add name="ApplicationServices" connectionString="Data Source=.;Initial Catalog=Readify.Romiko;Integrated Security=True;MultipleActiveResultSets=True" providerName="System.Data.SqlClient" />
    <add name="RomikoEntities" connectionString="metadata=res://*/Repository.Romiko.csdl|res://*/Repository.Romiko.ssdl|res://*/Repository.Romiko.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=Readify.Romiko;Integrated Security=True;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" />
  </connectionStrings>

——————————————————————————————————————————————————–

As we can see above, we have the EF connection string and the SqlMembershipProvider connection string.

Excellent, now let's define these key/value pairs in the ServiceConfiguration.csdef.

Now, I use these connections from both a worker and a web role, so you need to define them in the respective sections:

——————————————————————ServiceConfiguration.csdef ————————————————-

<ServiceDefinition name="Readify.Romiko.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="Readify.Romiko.Web">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>
    <ConfigurationSettings>
      <Setting name="RomikoEntities" />
      <Setting name="ApplicationServices" />
    </ConfigurationSettings>
  </WebRole>
  <WorkerRole name="Readify.Romiko.Worker">
    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>
    <ConfigurationSettings>
      <Setting name="RomikoEntities" />
      <Setting name="SMTPServer" />
      <Setting name="SMTPServerPort" />
      <Setting name="SMTPUser" />
      <Setting name="SMTPPassword" />
    </ConfigurationSettings>
  </WorkerRole>
</ServiceDefinition>

———————————————————————————————————————————————————

As you can see above, I only need the EF connection string in the worker role, as I do not use forms authentication there! Also notice there are no values set; that is done in the .cscfg file:

——————————————————————ServiceConfiguration.cscfg —————————————————–

<Role name="Readify.Romiko.Web">
   <Instances count="1" />
   <ConfigurationSettings>
     <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
     <Setting name="RomikoEntities" value="metadata=res://*/Repository.Romiko.csdl|res://*/Repository.Romiko.ssdl|res://*/Repository.Romiko.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=ifx2adecne.database.windows.net;Initial Catalog=Romiko;Persist Security Info=True;User ID=RomikoApp;Password=St0rmyCloud@pp1&quot;" />
     <Setting name="ApplicationServices" value="Data Source=ifx2adecne.database.windows.net;Initial Catalog=Romiko;Persist Security Info=True;User ID=RomikoApp;Password=St0rmyCloud@pp1" />
   </ConfigurationSettings>
 </Role>
 <Role name="Readify.Romiko.Worker">
   <Instances count="1" />
   <ConfigurationSettings>
     <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
     <Setting name="RomikoEntities" value="metadata=res://*/Repository.Romiko.csdl|res://*/Repository.Romiko.ssdl|res://*/Repository.Romiko.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=ifx2adecne.database.windows.net;Initial Catalog=Romiko;Persist Security Info=True;User ID=RomikoApp;Password=St0rmyCloud@pp1&quot;" />
     <Setting name="SMTPServer" value="smtp.mail.Romiko.com" />
     <Setting name="SMTPServerPort" value="25" />
     <Setting name="SMTPUser" value="Romiko@romiko.net" />
     <Setting name="SMTPPassword" value="derbynew" />
   </ConfigurationSettings>
 </Role>

———————————————————————————————————————————————————

Configuration Classes

Excellent, now we need a way to tell the runtime when to load data from the Azure configuration or the standard .NET configuration files.

All we do is create a helper class and an interface; the interface will be used for IoC injection later with Autofac.

[class diagram: IConfigurationManager with GetDatabaseConnectionString, AuthenticationProviderConnectionString and GetSmtpSettings]

Now, the two concrete classes just fetch config info from separate managers:

 public class ConfigurationManagerAzure : IConfigurationManager
    {
        public string GetDatabaseConnectionString()
        {

            return RoleEnvironment.GetConfigurationSettingValue("RomikoEntities");

        }

        public string AuthenticationProviderConnectionString()
        {

            return RoleEnvironment.GetConfigurationSettingValue("ApplicationServices");

        }


        public SmtpSettings GetSmtpSettings()
        {
            var smtpSettings = new SmtpSettings();

            smtpSettings.SmtpServer = RoleEnvironment.GetConfigurationSettingValue("SMTPServer");
            smtpSettings.SmtpServerPort = RoleEnvironment.GetConfigurationSettingValue("SMTPServerPort");
            smtpSettings.SmtpUser = RoleEnvironment.GetConfigurationSettingValue("SMTPUser");
            smtpSettings.SmtpPassword = RoleEnvironment.GetConfigurationSettingValue("SMTPPassword");

            return smtpSettings;
        }
    }

 

 public class ConfigurationManagerLocal : IConfigurationManager
    {
        public string GetDatabaseConnectionString()
        {

            return System.Configuration.ConfigurationManager.ConnectionStrings["RomikoEntities"].ToString();

        }

        public string AuthenticationProviderConnectionString()
        {

            return System.Configuration.ConfigurationManager.ConnectionStrings["ApplicationServices"].ToString();

        }


        public SmtpSettings GetSmtpSettings()
        {
            var smtpSettings = new SmtpSettings();

            smtpSettings.SmtpServer = System.Configuration.ConfigurationManager.AppSettings["SMTPServer"];
            smtpSettings.SmtpServerPort = System.Configuration.ConfigurationManager.AppSettings["SMTPServerPort"];
            smtpSettings.SmtpUser = System.Configuration.ConfigurationManager.AppSettings["SMTPUser"];
            smtpSettings.SmtpPassword = System.Configuration.ConfigurationManager.AppSettings["SMTPPassword"];


            return smtpSettings;
        }
    }

This is perfect; now we can load the concrete class dynamically via Autofac, so in the Global.asax.cs file we have something like this:

AutoFac (Web Role)

protected void Application_Start()
       {

           RegisterGlobalFilters(GlobalFilters.Filters);
           RegisterRoutes(RouteTable.Routes);


           var builder = new ContainerBuilder();
           builder.RegisterControllers(Assembly.GetExecutingAssembly());


           if (RoleEnvironment.IsAvailable)
               builder.RegisterType<ConfigurationManagerAzure>().AsImplementedInterfaces();
           else
               builder.RegisterType<ConfigurationManagerLocal>().AsImplementedInterfaces();

           builder.RegisterType<RomikoEntities>();
           builder.RegisterType<AccountMembershipService>().AsImplementedInterfaces();
           builder.RegisterType<Membership.AzureMembershipProvider>();
           builder.RegisterType<FormsAuthenticationService>().AsImplementedInterfaces();

           builder.Register(x => System.Web.Security.Membership.Provider).ExternallyOwned();

           Container = builder.Build();
           DependencyResolver.SetResolver(new AutofacDependencyResolver(Container));

       }

So, from the above, we use RoleEnvironment.IsAvailable to detect whether it is running in the cloud, and then load the correct class types into the container.

Just ensure your controllers and helper classes have constructors for injecting the IConfigurationManager and entities, e.g.

Controller:

public HomeController(RomikoEntities entities)
        {
            this.entities = entities;
        }

Helper Class injection setup:

public class AlertManager
   {
       private readonly IConfigurationManager configurationManager;
       private readonly RomikoEntities entities;

       public AlertManager(
           IConfigurationManager configurationManager,
           RomikoEntities entities)
       {
           this.configurationManager = configurationManager;
           this.entities = entities;
       }

 

EF partial class extended

To get EF to use the connection string from the injected configuration class, we have this partial class:

public partial class RomikoEntities
    {
        public RomikoEntities(IConfigurationManager configurationManager)
            : this(configurationManager.GetDatabaseConnectionString())
        {

        }
    }

As we can see, EF has a constructor overload that we can call to pass in the connection string.

AccountController and SQLMembershipProvider

Now, we need to make some modifications to the default account controller that comes with the MVC project template.

First of all, we will ensure Autofac registers a method call:

   builder.Register(x => System.Web.Security.Membership.Provider).ExternallyOwned();

This is used in the AccountModel class:

I modified the AccountController.cs to have the following constructor injection possible:

private  IFormsAuthenticationService formsService;
private  IMembershipService membershipService;

public AccountController(IMembershipService membershipService, IFormsAuthenticationService formsService)
{
    this.MembershipService = membershipService;
    this.FormsService = formsService;
}

public IMembershipService MembershipService
{
    get { return membershipService; }
    set { membershipService = value; }
}

public IFormsAuthenticationService FormsService
{
    get { return formsService; }
    set { formsService = value; }
}

 

Then we need to modify the AccountModel class to get the correct constructor parameters, since the provider is configured via the web.config and it expects a parameterless constructor:

<membership>
  <providers>
    <clear />
    <add name="AspNetSqlMembershipProvider" type="Readify.Romiko.Web.Membership.AzureMembershipProvider" connectionStringName="ApplicationServices" enablePasswordRetrieval="false" enablePasswordReset="true" requiresQuestionAndAnswer="false" requiresUniqueEmail="false" maxInvalidPasswordAttempts="5" minRequiredPasswordLength="6" minRequiredNonalphanumericCharacters="0" passwordAttemptWindow="10" applicationName="/" />
  </providers>
</membership>

So, we have a custom MembershipProvider that inherits from SqlMembershipProvider:

AzureMembershipProvider

using System.Collections.Specialized;
using System.Reflection;
using System.Web.Security;
using Autofac;
using Readify.Romiko.Common.Configuration;

namespace Readify.Romiko.Web.Membership
{
    public class AzureMembershipProvider : SqlMembershipProvider
    {
        private readonly IConfigurationManager configuration;

        public AzureMembershipProvider()
            :this((MvcApplication.Container.Resolve<IConfigurationManager>()))
        {
        }

        public AzureMembershipProvider(IConfigurationManager configuration)
            
        {
            this.configuration = configuration;
        }

        public override void Initialize(string name, NameValueCollection config)
        {
            base.Initialize(name, config);

            var connectionString = configuration.AuthenticationProviderConnectionString();
            var connectionStringField = typeof(SqlMembershipProvider).GetField("_sqlConnectionString", BindingFlags.Instance | BindingFlags.NonPublic);
            if (connectionStringField != null) connectionStringField.SetValue(this, connectionString);
        }
    }
}

 

Notice above that we keep the constructor parameterless, but chain it to one that uses Autofac to resolve the configuration.

Note: MvcApplication is the class name in the Global.asax.cs file! Also, we keep a static reference to the container.

Excellent, so now we can inject configuration. However, the AccountModel.cs file needs some pimping as well; remember this resolver:

builder.Register(x => System.Web.Security.Membership.Provider).ExternallyOwned();

Well, we need to implement this in the AccountModel so it is available; again, we will resolve it like so:

In AccountModel.cs there is a class called AccountMembershipService; we will modify the constructor to do some resolving:

public class AccountMembershipService : IMembershipService
    {
        private readonly MembershipProvider _provider;

        public AccountMembershipService()
            : this(MvcApplication.Container.Resolve<MembershipProvider>())
        {
        }

        public AccountMembershipService(MembershipProvider provider)
        {
            _provider = provider ?? System.Web.Security.Membership.Provider;
        }
……..

So, as you can see, we just use a parameterless constructor pattern, and within that constructor we resolve types and method/property calls.

So when a MembershipProvider is resolved, the container returns the value of Membership.Provider.

Autofac install with Nuget

In VS2010, go to View -> Other Windows -> Package Manager Console, and install Autofac into the various projects:

Note the default project, ensure you install to the respective projects:

[screenshot: Package Manager Console default project]

Command to install is:

PM> install-package Autofac.Mvc3

You can read more about this here:

http://code.google.com/p/autofac/wiki/Mvc3Integration

SQL Azure Membership Role Database Setup

This is interesting, as you will need custom scripts that are Azure friendly; e.g. SQL Azure does not support USE statements! Makes sense from a security perspective.

Updated ASP.net scripts for use with Microsoft SQL Azure

http://support.microsoft.com/kb/2006191/en-us
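Once you have downloaded the Azure-friendly scripts, you can run them against SQL Azure with sqlcmd. The server, database and credentials below are placeholders, and the file names come from the KB download, so adjust to suit:

& sqlcmd -S tcp:yourserver.database.windows.net -d YourDatabase -U YourUser@yourserver -P YourPassword -i InstallCommon.sql
& sqlcmd -S tcp:yourserver.database.windows.net -d YourDatabase -U YourUser@yourserver -P YourPassword -i InstallMembership.sql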

 

AutoFac and Worker Roles:

This is easy to set up; just put the logic in the OnStart method of the WorkerRole.cs file.

  public override bool OnStart()
        {
            ServicePointManager.DefaultConnectionLimit = 12;

            //AutoFac Container
            var builder = new ContainerBuilder();

            if (RoleEnvironment.IsAvailable)
                builder.RegisterType<ConfigurationManagerAzure>().AsImplementedInterfaces();
            else
                builder.RegisterType<ConfigurationManagerLocal>().AsImplementedInterfaces();

            builder.RegisterType<AlertManager>();
            builder.RegisterType<RomikoCloudEntities>();

            container = builder.Build();

            return base.OnStart();
        }

As you can see, the AlertManager constructor has all the interfaces defined:

 public class AlertManager
    {
        private readonly IConfigurationManager configurationManager;
        private readonly RomikoCloudEntities entities;

        public AlertManager(
            IConfigurationManager configurationManager,
            RomikoCloudEntities entities)
        {
            this.configurationManager = configurationManager;
            this.entities = entities;
        }

So, I hope this gets you going with MVC3, Azure and all the bells and whistles that it comes with.

MVC3/Razor and Azure SDK 1.3

You can deploy MVC3 to Azure; ensure you have all of these referenced with Copy Local set to true:

  • Microsoft.Web.Infrastructure
  • System.Web.Helpers
  • System.Web.Mvc
  • System.Web.Razor
  • System.Web.WebPages
  • System.Web.WebPages.Deployment
  • System.Web.WebPages.Razor

The best way to debug after a deployment is to have the following in the web.config:

<system.web>
  <customErrors mode="Off" />
  <compilation debug="true" targetFramework="4.0">

 

Obviously do not have this in production!

TFS and Windows Azure SDK 1.3 Development

Hi,

I am working on a project with Team Foundation Server, which loves to make the web.config file read-only. When this occurs and I try to run my MVC3 ASP.NET web role, I get this error:

"The communication object, System.ServiceModel.Channels.ServiceChannel, cannot be used for communication because it is in the Faulted state."

It is not the best error in the world; however, the development fabric deploys the application and needs write access to your web.config file, so ensure it is not read-only.
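The workaround is simply to clear the read-only flag before running; from PowerShell, in the web project folder:

# TFS sets +R on checked-in files; clear it so the development fabric can write
Set-ItemProperty -Path .\web.config -Name IsReadOnly -Value $false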

I never have this issue with Bitbucket and Mercurial.

Honestly, I think this is a bug in SDK 1.3, as I have this problem with other projects like Windows Phone 7 and Azure connectivity; what happens is it keeps playing with the machine key!

<machineKey decryption="AES" decryptionKey=blahblahblah