Automate #Azure Blob Snapshot backups with @Cerebrata

Hi,

Leveraging the Cerebrata cmdlets for Azure, we can easily back up our blob containers via snapshots. This is particularly useful for page blobs that are accessed randomly, i.e. VHDs on Cloud Drive.

Here is how the snapshot script works:

#requires -version 2.0
param (
	[parameter(Mandatory=$true)] [string]$AzureAccountName,
	[parameter(Mandatory=$true)] [string]$AzureAccountKey,
	[parameter(Mandatory=$true)] [array]$BlobContainers
)

$ErrorActionPreference = "Stop"

if ((Get-PSSnapin -Registered -Name AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue) -eq $null)
{
	throw "AzureManagementCmdletsSnapIn missing. Install them from Https://www.cerebrata.com/Products/AzureManagementCmdlets/Download.aspx"
}

Add-PSSnapin AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue

function SnapShotBlobContainer 
{
	param ( $containers, $blobContainerName )
	Write-Host "Starting snapshot $blobContainerName"

	$container = $containers | Where-Object { $_.BlobContainerName -eq $blobContainerName }

	if ($container -eq $null)
	{
		Write-Host  "Container $blobContainerName doesn't exist, skipping snapshot"
	}
	else
	{
		Write-Host "Found blob container $blobContainerName"
		Checkpoint-BlobContainer -Name $container.BlobContainerName -SaveSnapshotInformation -AccountName $AzureAccountName -AccountKey $AzureAccountKey
		Write-Host "Snapshot complete for $blobContainerName"
	}
}

$containers = Get-BlobContainer -AccountName $AzureAccountName -AccountKey $AzureAccountKey
foreach($container in $BlobContainers)
{
	SnapShotBlobContainer $containers $container
}

Then just call the script with the parameters. Remember, an array of items is passed in like this:

-BlobContainers:@('container1', 'container2') -AzureAccountName romikoTown -AzureAccountKey blahblahblahblahblehblooblowblab==


Appfabric Topics–Pub/Sub Messaging Service Bus

Overview

We will build a very simple pub/sub messaging system using the Azure SDK 1.6. I built this for a demo, so for production-ready solutions you will need to alter how you define your message types, e.g. using interfaces.

Below is an idea of how topics work. They are basically similar to queues, except that a topic is broken down into subscriptions: the publisher pushes messages to the topic (just like you would with a queue) and each subscriber subscribes to a subscription.

[Image: a topic fanning out to multiple subscriptions]

So, in the above, we can see a publisher/subscriber messaging bus.

Filtering

Another nice feature is that you can attach metadata to messages via an IDictionary&lt;string, object&gt; property, e.g. a MessageType entry.

This means you can then create subscription rule filters based on properties that are on the message. All messages are cast to BrokeredMessage before they are sent on the bus, and generics are used on the subscriber to cast the message back to its original type.

The sample code I have provided has filtering on the Customer subscription, which only takes messages with MessageType = 'BigOrder'.

Domain Modelling Rules

I like the idea of the service bus filtering ONLY on MessageType and NEVER on business domain rules, e.g. Quantity > 100. Why? The service bus should never hold rules that belong to a business domain. All logic for deciding what TYPE of message it is should be done before it is pushed onto the bus. If you stick to this, you will keep your service bus lean and mean.
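To illustrate, the classification can live with the publisher; a minimal sketch, assuming a hypothetical threshold of 100 (the sample publisher below simply hard-codes the MessageType instead):

private static MessageType ClassifyOrder(TransformerOrder order)
{
    // The business rule stays in the publisher, never in a subscription filter.
    // The threshold of 100 is an assumption for illustration only.
    return order.Quantity > 100 ? MessageType.BigOrder : MessageType.SmallOrder;
}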

The Code

I have a sample application you can download from:

https://bitbucket.org/romiko/appfabricpubsubdemo

or clone it with Mercurial:

I have also included asynchronous operations. Remember that Azure AppFabric topics only support partial transactions; if you want full transaction support, check out NServiceBus. I highly recommend using NServiceBus with AppFabric for fully transactional and guaranteed message delivery for .NET publishers and subscribers.

hg clone https://bitbucket.org/romiko/appfabricpubsubdemo

Sample output

Below are screenshots of what you should see when running the code, after updating the AccountDetails.cs file with your AppFabric account details.

[Image: console output from the publisher and subscribers]

Notice:

  • One publisher, sending a small and a big order
  • Two subscribers that get both messages
  • One subscriber only getting the big order, via the subscription filtering

Subscription

You will need an Azure subscription to try out the demo. Once you have the subscription, update the AccountDetails.cs file with your credentials: Namespace, Name and Key. These can all be found in the Azure management portal; check the default key.
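The code references an AccountDetails class that is not shown below; a minimal sketch of what it might look like, with placeholder values (the Logic namespace is an assumption):

namespace Logic
{
    // Placeholder credentials: replace with the Namespace, issuer Name
    // and Key from the Azure management portal.
    public static class AccountDetails
    {
        public const string Namespace = "yournamespace";
        public const string Name = "owner";
        public const string Key = "yourdefaultkey==";
    }
}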

[Image: Azure portal, Service Bus namespaces]

On the right, the pane will show the properties for the namespace, including the management keys, which you can use to get started with the default name of owner; or you can play with access control and create a user account and key.


Service Bus Explorer

I recommend you check out Service Bus Explorer.

Here I use the explorer tool to see the message filters on the customer subscription, which only takes big orders.

[Image: Service Bus Explorer showing the filter on the Customer subscription]

Message Factory

using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

namespace Logic
{
    public class MyMessagingFactory : IDisposable
    {
        private MessagingFactory factory;
        public NamespaceManager NamespaceManager { get; set; }

        public MyMessagingFactory()
        {
            var credentials =
               TokenProvider.CreateSharedSecretTokenProvider
                   (AccountDetails.Name, AccountDetails.Key);

            var serviceBusUri = ServiceBusEnvironment.CreateServiceUri
                ("sb", AccountDetails.Namespace, string.Empty);

            factory  = MessagingFactory.Create
                (serviceBusUri, credentials);

            NamespaceManager = new NamespaceManager(serviceBusUri, credentials);
        }


        public TopicClient GetTopicPublisherClient()
        {
            var topicClient =
                factory.CreateTopicClient("romikostopictransgrid");

            return topicClient;
        }

        public SubscriptionClient GetTopicSubscriptionClient(SubscriptionName subscription)
        {
            var topicSubscriptionClient =
                factory.CreateSubscriptionClient("romikostopictransgrid", subscription.ToString(), ReceiveMode.ReceiveAndDelete);


            return topicSubscriptionClient;
        }

        public void Dispose()
        {
            factory.Close();
        }
    }
}

Publisher

using System;
using Microsoft.ServiceBus.Messaging;
using Messages;

namespace Logic.Publish
{
    public class PublisherClient
    {
        public void SendTransformerOrder(TopicClient topicClient)
        {
            const string format = "Publishing message for {0}, Quantity {1} Transformer {2}";
            var orderIn1 = new TransformerOrder
                               {
                    Name = "Transgrid",
                    Transformer = "300kv, 50A",
                    Quantity = 5,
                    MessageType = MessageType.SmallOrder
                };

            var orderInMsg1 = new BrokeredMessage(orderIn1);
            orderInMsg1.Properties["MessageType"] = orderIn1.MessageType.ToString();
            Console.WriteLine(format, orderIn1.Name, orderIn1.Quantity, orderIn1.Transformer);

            topicClient.Send(orderInMsg1);

            var orderIn2 = new TransformerOrder
                               {
                    Name = "Transgrid",
                    Transformer = "250kv, 50A",
                    Quantity = 200,
                    MessageType = MessageType.BigOrder
                };

            var orderInMsg2 = new BrokeredMessage(orderIn2);
            orderInMsg2.Properties["MessageType"] = orderIn2.MessageType.ToString();

            orderInMsg2.Properties["Quatity"] = orderIn2.Quantity;




            Console.WriteLine(format, orderIn2.Name, orderIn2.Quantity, orderIn2.Transformer);

            //topicClient.Send(orderInMsg2);
            topicClient.BeginSend(orderInMsg2, a => 
                Console.WriteLine(string.Format("\r\nMessage published async, completed is: {0}.", a.IsCompleted)), 
                topicClient);

        }
    }
}

Subscriber

using System;
using System.Linq;
using Microsoft.ServiceBus.Messaging;
using Messages;

namespace Logic.Subscription
{
    public class SubscriptionManager
    {
        public static void CreateSubscriptionsIfNotExists(string topicPath, MyMessagingFactory factory)
        {
            var sales = SubscriptionName.Sales.ToString();
            if (!factory.NamespaceManager.SubscriptionExists(topicPath, sales))
                factory.NamespaceManager.CreateSubscription(topicPath, sales);

            var customer = SubscriptionName.Customer.ToString();
            if (!factory.NamespaceManager.SubscriptionExists(topicPath, customer))
            {
                var rule = new RuleDescription
                {
                    Name = "bigorder",
                    Filter = new SqlFilter(string.Format("MessageType = '{0}'", MessageType.BigOrder))
                };
                factory.NamespaceManager.CreateSubscription(topicPath, customer, rule);
            }

            var inventory = SubscriptionName.Inventory.ToString();
            if (!factory.NamespaceManager.SubscriptionExists(topicPath, inventory))
                factory.NamespaceManager.CreateSubscription(topicPath, inventory);
        }

        public static void ShowRules(string topicPath, MyMessagingFactory factory)
        {
            var currentRules = factory.NamespaceManager.GetRules(topicPath, SubscriptionName.Customer.ToString()).ToList();

            Console.WriteLine(string.Format("Rules for subscription: {0}", "Customer"));
            foreach (var result in currentRules)
            {
                var filter = (SqlFilter)result.Filter;
                Console.Write(string.Format("RuleName: {0}\r\n Filter: {1}\r\n", result.Name, filter.SqlExpression));
            }
        }
    }
}

using System;
using System.Threading;
using Microsoft.ServiceBus.Messaging;
using Messages;

namespace Logic.Subscription
{
    public class Subscriber
    {
        public void ReceiveTransformerOrder(SubscriptionClient client)
        {
            GetMessages(client);
        }

        private static void GetMessages(SubscriptionClient client)
        {
            //var orderOutMsg = client.Receive(TimeSpan.FromSeconds(5));
            client.BeginReceive(ReceiveDone, client);
        }

        public static void ReceiveDone(IAsyncResult result)
        {
            var subscriptionClient = result.AsyncState as SubscriptionClient;
            if (subscriptionClient == null)
            {
                Console.WriteLine("Async Subscriber got no data.");
                return;
            }

            var brokeredMessage = subscriptionClient.EndReceive(result);

            if (brokeredMessage != null)
            {
                var messageId = brokeredMessage.MessageId;
                var orderOut = brokeredMessage.GetBody<
                    TransformerOrder>();

                Console.WriteLine("Thread: {0}{6}" +
                                  "Receiving orders for subscriber: {1}{6}" +
                                  "Received MessageId: {2}{6}" +
                                  "Quantity: {3}{6}" +
                                  "Transformer:{4} for {5}{6}", 
                                  Thread.CurrentThread.ManagedThreadId,
                                  subscriptionClient.Name, messageId,
                                  orderOut.Quantity, orderOut.Transformer, orderOut.Name, Environment.NewLine);
            }
            subscriptionClient.Close();
        }
    }
}

Message

using System.Runtime.Serialization;

namespace Messages
{
    [DataContract]
    public class TransformerOrder
    {
        [DataMember]
        public string Name { get; set; }

        [DataMember]
        public string Transformer { get; set; }

        [DataMember]
        public int Quantity { get; set; }

        [DataMember]
        public string Color { get; set; }

        [DataMember]
        public MessageType MessageType{ get; set; }
    }
}
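
The sample also references MessageType and SubscriptionName enums that are not shown above; a minimal sketch, inferred from the values used in the code (the namespaces are assumptions):

namespace Messages
{
    // Used both on the message body and as the MessageType metadata property.
    public enum MessageType
    {
        SmallOrder,
        BigOrder
    }
}

namespace Logic
{
    // One value per subscription created by SubscriptionManager.
    public enum SubscriptionName
    {
        Sales,
        Customer,
        Inventory
    }
}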

Sample Publisher

using System;
using Logic;
using Logic.Publish;
using Logic.Subscription;

namespace Publisher
{
    class Program
    {
        const string TopicPath = "romikostopictransgrid";
        static void Main()
        {

            using (var factory = new MyMessagingFactory())
            {
                SubscriptionManager.CreateSubscriptionsIfNotExists(TopicPath, factory);
                SubscriptionManager.ShowRules(TopicPath, factory);
                PublishMessage(factory);
            }
        }

        private static void PublishMessage(MyMessagingFactory factory)
        {
            var queue = factory.GetTopicPublisherClient();
            var publisher = new PublisherClient();
            publisher.SendTransformerOrder(queue);
            Console.WriteLine("Published Messages to bus.");
            Console.WriteLine(Environment.NewLine);
            Console.WriteLine("Press any key to publish again");
            Console.ReadLine();
            PublishMessage(factory);
        }

    }
}

Sample Subscriber

using System;
using System.Threading;
using Logic;
using Logic.Subscription;

namespace SubscriberCustomer
{
    class Program
    {
        static void Main()
        {
            using (var factory = new MyMessagingFactory())
            while (true)
            {
                SubscribeToMessages(factory);
                Thread.Sleep(TimeSpan.FromMilliseconds(500));
            }
        }

        private static void SubscribeToMessages(MyMessagingFactory factory)
        {
            var subscriptionCustomer = factory.GetTopicSubscriptionClient(SubscriptionName.Customer);
            var subscriber = new Subscriber();
            subscriber.ReceiveTransformerOrder(subscriptionCustomer);
        }
    }
}

Windows #Azure configuration transformations

Hi,

I needed to update the transformations today to support different csdef files for UAT/Test etc. I found myself forgetting the process, so I thought it would be a good idea to log the entries needed.

What I wanted was a way to disable New Relic in our performance environment, since we do not have a license key.

Since we use Azure startup tasks to run batch jobs before the worker role starts, it made sense to create a task environment variable that my batch script can check to decide whether it should install New Relic, e.g.

if "%MYSTORYENV%" == "PERF" goto :EOF

So, in the above, my batch file startup.cmd will skip installing New Relic if the environment variable is PERF. However, we need to set this value in the csdef file.

So we go to the BASE ServiceDefinition.csdef file and add this entry:

<sd:Startup>
      <sd:Task commandLine="Startup.cmd" executionContext="elevated" taskType="background">
        <sd:Environment>
          <sd:Variable name="EMULATED">
            <sd:RoleInstanceValue xpath="/RoleEnvironment/Deployment/@emulated" />
          </sd:Variable>
          <sd:Variable name="MYSTORYENV" value="DEV" />
        </sd:Environment>
      </sd:Task>
    </sd:Startup>

Notice that I have qualified all my csdef entries with a namespace prefix (sd:); this is important for the transformations to work.

OK, the next step is to create a transformation file:

https://gist.github.com/1777060
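
I won't reproduce the gist here, but the perf transform is along these lines; a sketch only, assuming the base csdef above and a worker role named MyWorkerRole (adjust both names to your project):

<?xml version="1.0"?>
<sd:ServiceDefinition name="MyService"
    xmlns:sd="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"
    xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sd:WorkerRole name="MyWorkerRole" xdt:Locator="Match(name)">
    <sd:Startup>
      <sd:Task commandLine="Startup.cmd" executionContext="elevated" taskType="background" xdt:Locator="Match(commandLine)">
        <sd:Environment>
          <!-- Override the DEV value from the base csdef with PERF -->
          <sd:Variable name="MYSTORYENV" value="PERF" xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
        </sd:Environment>
      </sd:Task>
    </sd:Startup>
  </sd:WorkerRole>
</sd:ServiceDefinition>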

Now that we have this transform, we will need to edit the cloud project (.ccproj) file. Please see below the parts added.

Item Groups

<ItemGroup>
<ServiceDefinition Include="ServiceDefinition.csdef" />
<ServiceConfiguration Include="ServiceConfiguration.cscfg" />
</ItemGroup>
<ItemGroup>
<EnvironmentDefinition Include="ServiceDefinition.uat.csdef">
<BaseConfiguration>ServiceDefinition.csdef</BaseConfiguration>
</EnvironmentDefinition>
<EnvironmentDefinition Include="ServiceDefinition.perf.csdef">
<BaseConfiguration>ServiceDefinition.csdef</BaseConfiguration>
</EnvironmentDefinition>
<EnvironmentConfiguration Include="ServiceConfiguration.uat.cscfg">
<BaseConfiguration>ServiceConfiguration.cscfg</BaseConfiguration>
</EnvironmentConfiguration>
<EnvironmentConfiguration Include="ServiceConfiguration.perf.cscfg">
<BaseConfiguration>ServiceConfiguration.cscfg</BaseConfiguration>
</EnvironmentConfiguration>
<None Include="@(EnvironmentConfiguration)" />
<None Include="@(EnvironmentDefinition)" />
</ItemGroup>

Notice I have the None includes at the bottom, so I can see these files in Visual Studio. I also have transformations for cscfg files, hence why you see them here.

Targets Validation

<Target Name="ValidateServiceFiles"
		Inputs="@(EnvironmentConfiguration);@(EnvironmentConfiguration->'%(BaseConfiguration)');@(EnvironmentDefinition);@(EnvironmentDefinition->'%(BaseConfiguration)')"
		Outputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg');@(EnvironmentDefinition->'%(Identity).transformed.csdef')">

	<Message Text="ValidateServiceFiles: Transforming %(EnvironmentConfiguration.BaseConfiguration) to %(EnvironmentConfiguration.Identity).tmp via %(EnvironmentConfiguration.Identity)" />
	<TransformXml Source="%(EnvironmentConfiguration.BaseConfiguration)" Transform="%(EnvironmentConfiguration.Identity)" Destination="%(EnvironmentConfiguration.Identity).tmp" />
	<Message Text="ValidateServiceFiles: Transformation complete; starting validation" />

	<Message Text="ValidateServiceFiles: Transforming %(EnvironmentDefinition.BaseConfiguration) to %(EnvironmentDefinition.Identity).tmp via %(EnvironmentDefinition.Identity)" />
	<TransformXml Source="%(EnvironmentDefinition.BaseConfiguration)" Transform="%(EnvironmentDefinition.Identity)" Destination="%(EnvironmentDefinition.Identity).tmp" />
	<Message Text="ValidateServiceFiles: Transformation complete; starting validation" />

	<ValidateServiceFiles ServiceDefinitionFile="@(ServiceDefinition)" ServiceConfigurationFile="%(EnvironmentConfiguration.Identity).tmp" />
	<ValidateServiceFiles ServiceDefinitionFile="%(EnvironmentDefinition.Identity).tmp" ServiceConfigurationFile="@(ServiceConfiguration)" />
	<Message Text="ValidateServiceFiles: Validation complete; renaming temporary file" />

	<Move SourceFiles="%(EnvironmentConfiguration.Identity).tmp" DestinationFiles="%(EnvironmentConfiguration.Identity).transformed.cscfg" />
	<Move SourceFiles="%(EnvironmentDefinition.Identity).tmp" DestinationFiles="%(EnvironmentDefinition.Identity).transformed.csdef" />
</Target>

Notice above I have them for BOTH CSCFG and CSDEF files!

Move transforms to the app.publish folder for azure packaging

<Target Name="MoveTransformedEnvironmentConfigurationXml" AfterTargets="AfterPackageComputeService" Inputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')" Outputs="@(EnvironmentConfiguration->'$(OutDir)app.publish\%(filename).cscfg')">
<Move SourceFiles="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')" DestinationFiles="@(EnvironmentConfiguration->'$(OutDir)app.publish\%(filename).cscfg')" />
<Move SourceFiles="@(EnvironmentDefinition->'%(Identity).transformed.csdef')" DestinationFiles="@(EnvironmentDefinition->'$(OutDir)app.publish\%(filename).csdef')" />
</Target>

Summary

So there you have it, you will now have csdef and cscfg files for different environments.

Windows #Azure–Pre Role Startup Tasks

Hi,

Imagine you need to boot up a web role in the cloud, but before the global.asax events kick in, or even lower, before the WebRoleEntryPoint events kick in, you need to do some installation of prerequisite software.

The best way to go about this is to register a task in the ServiceDefinition.csdef file. Let's imagine we need to run a batch file that performs some sort of installation, say a monitoring service that must be installed BEFORE IIS starts our web application so that it can get a hook point, say New Relic!

Below is a configuration example that will do this for you.

https://gist.github.com/1775222

You can also set elevated privileges, which are required if you are running PowerShell scripts etc.

<Task commandLine="Startup.cmd" executionContext="elevated" taskType="background">

You can read more about this here:

http://msdn.microsoft.com/en-us/library/windowsazure/hh124132.aspx

http://msdn.microsoft.com/en-us/library/windowsazure/gg456327.aspx

http://msdn.microsoft.com/en-us/library/windowsazure/gg432991.aspx

So, I hope you now have a cool way to bootstrap your prerequisite software before IIS kicks in.

Windows Azure–Diagnosing Role Start-ups

Hi Guys,

I want to walk through three issues you can have with Windows Azure Diagnostics and a worker role. I assume you want to access the Windows Azure trace logs, since you use the Trace command to write out exceptions and status messages in your OnStart code.

e.g. Trace.WriteLine("Starting my service.")

Also, you have the WAD trace listener enabled, which it is by default:

https://gist.github.com/1757147
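
For reference, the listener entry that the Azure project templates add to web.config/app.config is along these lines:

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics" />
    </listeners>
  </trace>
</system.diagnostics>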

Scenario – Role fails to start very early on

You might have a custom worker role that starts some sort of background service, and perhaps it fails immediately due to some sort of configuration error.

Symptoms

You notice the role keeps recycling and recovering.

On Demand Transfers or Scheduled Transfers of the diagnostics logs do not work at all, so you cannot get any trace information whatsoever.

Solution

Put this method in your WorkerEntryPoint.cs file and call it at the beginning of OnStart() and in any catch block:

https://gist.github.com/1757083
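
The essence of the gist (see the Summary below) is a delay that gives the WAD infrastructure time to transfer the logs before the role recycles; a minimal sketch, assuming a flat 30-second sleep is long enough for your transfer interval, with a hypothetical TraceException helper for the catch block:

private static void WaitForWindowsAzureDiagnosticsInfrastructureToCatchUp()
{
    // Give the Windows Azure Diagnostics agent time to pick up and
    // transfer the trace logs before the role recycles.
    // The 30-second delay is an assumption; tune it to your transfer interval.
    System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
}

private static void TraceException(Exception ex)
{
    // Hypothetical helper: writes the full exception to the trace log.
    Trace.TraceError(ex.ToString());
}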

e.g. OnStart:

public override bool OnStart()
{
    WaitForWindowsAzureDiagnosticsInfrastructureToCatchUp();
    try
    {
        ...

e.g. exception handling:

catch (Exception ex)
{
    TraceException(ex);
    Trace.Flush();
    WaitForWindowsAzureDiagnosticsInfrastructureToCatchUp();
    return true;
}

Scenario – Role fails to start a bit later

You are able to diagnose the problem, since On Demand Transfer/Scheduled Transfer works and you can get to the trace logs to see the error messages you have written to Trace. Recall that Windows Azure has a setting to automatically turn the trace listener on, redirecting trace output to the WAD table.

Scenario – Role fails to start a bit later and transfers hang

Symptoms

You notice the role keeps recycling and recovering, or even starts up but is unresponsive.

On Demand Transfer does not work – you try, but it just does not complete, or it hangs.

Below are screenshots of On Demand Transfers with the Cerebrata Diagnostics Manager.

[Image: On Demand Transfer screens in Cerebrata Diagnostics Manager]

Solution

If you cannot do an On Demand Transfer of the trace logs, perhaps the role keeps recycling and recovering too fast for an On Demand Transfer to occur. What you do then is temporarily configure a Scheduled Transfer of the trace logs.

If using Cerebrata Diagnostics Manager

Click Remote Diagnostics in the Subscriptions under your Data Sources

[Image: Remote Diagnostics, configuring a scheduled transfer]

Once you have configured the scheduled transfer, this tool will basically UPLOAD a configuration file into your blob container: wad-control-container.

Azure automatically detects changes in this container and applies them to the diagnostics monitor. Hence: configuration of Windows Azure Diagnostics on the fly.

[Image: wad-control-container contents in blob storage]

Now that we have the scheduled transfer in place, REBOOT the role that is causing the issue, wait for it to try to start up and fail, and then just go download the trace logs; they should be there.


Summary

So, ensure you have a silly sleep command in your worker role entry point's OnStart and in areas where you catch exceptions, in case your worker role crashes before Windows Azure Diagnostics can catch up!

Try an On Demand Transfer if there is an issue, and if that does not work, configure a scheduled transfer on the fly and then reboot the role to get the start-up logs.

WARNING!

Scheduled transfers will impact your Storage Services billing. MAKE SURE you turn them OFF when you have finished diagnosing the issue, or you will keep getting BILLED for them.

Notice in my screenshot I ALWAYS use a quota, so I never overuse diagnostics storage – and Windows Azure trace logs are stored in TABLE storage:

[Image: scheduled transfer configured with a storage quota]

Remember configuration of WAD is in Blob and the actual trace logs are in Tables.


Windows Azure, which affinity group or region is best for you?

A nice way to see which Azure affinity group or region to use, e.g. South America, North America or Asia, is to download this tool and run checks from where your clients will be based:

http://research.microsoft.com/en-us/downloads/5c8189b9-53aa-4d6a-a086-013d927e15a7/default.aspx

Once you have it installed, add your storage accounts and then get started.

[Image: the tool configured with storage accounts]

So above we will test from Sydney Australia to our UAT environment in America.

Let's click "run".

It will start executing the test; this is now a good time to plan a date, make a cup of coffee or write some JScript for your open source projects.

Results:

Sydney to North America

[Image: Sydney to North America latency and throughput results]

 

Sydney to South East Asia (Singapore)

[Image: Sydney to South East Asia latency and throughput results]

Conclusion

For us, South East Asia was far better (web site download speed matters more than upload), and the proof was in the pudding when we measured web site response times with the MVC MiniProfiler.

However, this is not the ultimate conclusion. I bet these response times will vary depending on the time of day; perhaps when Asia is awake and the US is asleep it could be the other way around. So test at different times of day and pick the affinity group or region that is best for you!

Windows Azure SDK 1.6–CSX folder output breaking change..again..

Background

Automated deployments with SDK 1.6 have been broken with TeamCity.

Location of CSX folder in build output changed.

Location of CSRUN.exe moved to emulator folder.

Details

When using the MSBuild targets, the location of the csx folder that CSRUN.exe needs for automated deployments has changed. What is worse, the old csx folder location is not cleaned up and is partially populated, so to the untrained eye it looks like it is still in use!

Also note, you will need to change the path of csrun.exe as this has been moved.

Old Location

CloudProject\bin\%BuildConfiguration%\CloudProject.csx

New Location

CloudProject\csx\%BuildConfiguration%

Why does the old location with partial files still exist? Not sure… and it led to this new error when I tried to deploy:

The compute emulator had an error: Can't locate service descriptions.

[Image: compute emulator error message]

Now the new location with all files

[Image: the new csx location with all files present]

TeamCity Artefacts fix

Notice the new relative path to the project is \csx\release and not bin\release\MyProject.csx.

[Image: TeamCity artifact path configuration]

Summary

This is not the first time we have had breaking changes. In 1.5 a lot of the MSBuild target names changed when they were fine the way they were; sometimes I just do not understand certain changes that did not really need to be made.

So, can anyone explain why the folder that used to be bin\MyProjectName.csx was moved a directory level up from bin and renamed to just csx? It seems to be a change we really could do without. Or is there some grand scheme that will make this change exciting in the future… who knows?

Automating Windows Azure Deployments leveraging TeamCity and PowerShell

Hi Guys,

Introduction

We will cover:

  • Overview to configure multiple build projects on TeamCity
  • Configure one of the build projects to deploy to the Azure Cloud
  • Automated deployments to the Azure Cloud for Web and Worker Roles
  • Leverage the msbuild target templates to automate generation of azure package files
  • Alternative solution of generating the azure package files
  • Using configuration transformations to manage settings e.g. UAT, Dev, Production

A colleague of mine, Tatham Oddie, and I currently use TeamCity to automatically deploy our Azure/MVC3/Neo4j based project to the Azure cloud. Let's see how this can be done with relative ease, fully automated. The focus here will be on PowerShell scripts using the Cerebrata cmdlets, which can be found here: http://www.cerebrata.com/Products/AzureManagementCmdlets/Default.aspx

The PowerShell script included here will automatically undeploy and redeploy your Azure service, and will even wait until all the services are in the ready state.

I will leave you to check those cmdlets out; they are worth every penny spent.

Now let's look at how we get the deployment working.

The basic idea is that you have a continuous integration build configured on the build server in TeamCity. You then configure the CI build to generate artifacts, which are the output from the build that can be consumed by other build projects, e.g. you can take the artifacts from the CI build and run functional or integration test builds that are totally separate from the CI build. This way, your functional and integration tests will NEVER interfere with the CI build and the unit tests, keeping CI builds fast and efficient.

Prerequisites on Build Server

  • TeamCity Professional Version 6.5.1
  • Cloud Subscription Certificate with Private key is imported into the User Certificate Store for the Team City service account
  • Cerebrata CMDLETS

TeamCity – Continuous Integration Build Project

OK, so let's take a quick look at my CI build that spits out the Azure packages.

[Image: CI build overview in TeamCity]

As we can see above, the CI build creates an Artifact called AzurePackage.

[Image: the AzurePackage artifact]

The way we generate these artifacts is very easy. In the settings for the CI build project we set up the artifacts path.

[Image: artifact paths setting in the CI build configuration]

e.g. MyProject\MyProject.Azure\bin\%BuildConfiguration%\Publish => AzurePackage

So, we will look at the build steps to configure.

[Image: build steps]

As we can see below, we just say where MSBuild is run from and where the unit test DLLs are.

[Image: MSBuild build step settings]

Cool, now we need to set up the artifacts and configuration.

We just mention we want a release build.


Ok, now we need to tell our Azure Deployment project to have a dependency on the CI project we configured above.

Team City – UAT Deployment Build Project

So let's now go check out the UAT deployment project.

[Image: UAT deployment build project]

This project has dependencies on the CI build, and we will configure all the build parameters so it can connect to your Azure storage and service for automatic deployments. Once we are done here, we will look at the PowerShell script we use to automatically deploy to the cloud; it supports un-deploying existing deployment slots before deploying a new one, with retry attempts.

OK, let's check the following for the UAT deployment project.

[Image: build step running the deployment command]

The above screenshot shows the command that executes the PowerShell script; the parameters (%whatever%) resolve from the build parameters configured on the project (shown further below).

Here is the command for copy/paste friendliness. Of course, if you are using some other database, you do not need the Neo4j parameters.

-AzureAccountName "%AzureAccountName%" -AzureServiceName "%AzureServiceName%" -AzureDeploymentSlot "%AzureDeploymentSlot%" -AzureAccountKey "%AzureAccountKey%" -AzureSubscriptionId "%AzureSubscriptionId%" -AzureCertificateThumbprint "%AzureCertificateThumbprint%" -PackageSource "%AzurePackageDependencyPath%\MyProject.Azure.cspkg" -ConfigSource "%AzurePackageDependencyPath%\%ConfigFileName%" -DeploymentName "%build.number%-%build.vcs.number%" -Neo4jBlobName "%Neo4jBlobName%" -Neo4jZippedBinaryFileHttpSource "%Neo4jZippedBinaryFileHttpSource%"

This is the input for a deploy-package.cmd file, which is in our source repository.


Now we also need to tell the deployment project to use the artifact from our CI project, so we set up an artifact dependency as shown below in the dependencies section. Also notice how we use a wildcard to get all files from AzurePackage (AzurePackage/**); these will be the cspkg files.

[Image: artifact and snapshot dependencies]

Notice above that I have a snapshot dependency; this forces the UAT deployment to USE the SAME source code that the CI build project used.

So, the parameters are as follows.

[Image: build parameters]

PowerShell Deployment Scripts

The deployment scripts consist of three files; remember, I assume you have installed the Cerebrata Azure Management Cmdlets.

OK, so let's look at the Deploy-Package.cmd file. I would like to express my gratitude to Jason Stangroome (http://blog.codeassassin.com) for this.

Jason wrote: “This tiny proxy script just writes a temporary PowerShell script containing all the arguments you’re trying to pass to let PowerShell interpret them and avoid getting them messed up by the Win32 native command line parser.”

@echo off
setlocal
set tempscript=%temp%\%~n0.%random%.ps1
echo $ErrorActionPreference="Stop" >"%tempscript%"
echo ^& "%~dpn0.ps1" %* >>"%tempscript%"
powershell.exe -command "& \"%tempscript%\""
set errlvl=%ERRORLEVEL%
del "%tempscript%"
exit /b %errlvl%

OK, and now here is the PowerShell code, Deploy-Package.ps1. I will leave you to read through what it does. In summary, it demonstrates:

  • Un-Deploying a service deployment slot
  • Deploying a service deployment slot
  • Using the certificate store to retrieve the certificate for service connections via a cert thumbprint – users cert store under the service account that TeamCity runs on.
  • Uploading Blobs
  • Downloading Blobs
  • Waiting until the new deployment is in a ready state
#requires -version 2.0
param (
	[parameter(Mandatory=$true)] [string]$AzureAccountName,
	[parameter(Mandatory=$true)] [string]$AzureServiceName,
	[parameter(Mandatory=$true)] [string]$AzureDeploymentSlot,
	[parameter(Mandatory=$true)] [string]$AzureAccountKey,
	[parameter(Mandatory=$true)] [string]$AzureSubscriptionId,
	[parameter(Mandatory=$true)] [string]$AzureCertificateThumbprint,
	[parameter(Mandatory=$true)] [string]$PackageSource,
	[parameter(Mandatory=$true)] [string]$ConfigSource,
	[parameter(Mandatory=$true)] [string]$DeploymentName,
	[parameter(Mandatory=$true)] [string]$Neo4jZippedBinaryFileHttpSource,
	[parameter(Mandatory=$true)] [string]$Neo4jBlobName
)

$ErrorActionPreference = "Stop"

if ((Get-PSSnapin -Registered -Name AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue) -eq $null)
{
	throw "AzureManagementCmdletsSnapIn missing. Install them from Https://www.cerebrata.com/Products/AzureManagementCmdlets/Download.aspx"
}

Add-PSSnapin AzureManagementCmdletsSnapIn -ErrorAction SilentlyContinue

function AddBlobContainerIfNotExists ($blobContainerName)
{
	Write-Verbose "Finding blob container $blobContainerName"
	$containers = Get-BlobContainer -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	$deploymentsContainer = $containers | Where-Object { $_.BlobContainerName -eq $blobContainerName }

	if ($deploymentsContainer -eq $null)
	{
		Write-Verbose  "Container $blobContainerName doesn't exist, creating it"
		New-BlobContainer $blobContainerName -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	}
	else
	{
		Write-Verbose  "Found blob container $blobContainerName"
	}
}

function UploadBlobIfNotExists{param ([string]$container, [string]$blobName, [string]$fileSource)

	Write-Verbose "Finding blob $container\$blobName"
	$blob = Get-Blob -BlobContainerName $container -BlobPrefix $blobName -AccountName $AzureAccountName -AccountKey $AzureAccountKey

	if ($blob -eq $null)
	{
		Write-Verbose "Uploading blob $blobName to $container/$blobName"
		Import-File -File $fileSource -BlobName $blobName -BlobContainerName $container -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	}
	else
	{
		Write-Verbose "Found blob $container\$blobName"
	}
}

function CheckIfDeploymentIsDeleted
{
	$triesElapsed = 0
	$maximumRetries = 10
	$waitInterval = [System.TimeSpan]::FromSeconds(30)
	Do
	{
		$triesElapsed+=1
		[System.Threading.Thread]::Sleep($waitInterval)
		Write-Verbose "Checking if deployment is deleted, current retry is $triesElapsed/$maximumRetries"
		$deploymentInstance = Get-Deployment `
			-ServiceName $AzureServiceName `
			-Slot $AzureDeploymentSlot `
			-SubscriptionId $AzureSubscriptionId `
			-Certificate $certificate `
			-ErrorAction SilentlyContinue

		if($deploymentInstance -eq $null)
		{
			Write-Verbose "Deployment is now deleted"
			break
		}

		if($triesElapsed -ge $maximumRetries)
		{
			throw "Checking if deployment deleted has been running longer than 5 minutes, it seems the delployment is not deleting, giving up this step."
		}
	}
	While($triesElapsed -le $maximumRetries)
}

function WaitUntilAllRoleInstancesAreReady
{
	$triesElapsed = 0
	$maximumRetries = 60
	$waitInterval = [System.TimeSpan]::FromSeconds(60)
	Do
	{
		$triesElapsed+=1
		[System.Threading.Thread]::Sleep($waitInterval)
		Write-Verbose "Checking if all role instances are ready, current retry is $triesElapsed/$maximumRetries"
		$roleInstances = Get-RoleInstanceStatus `
			-ServiceName $AzureServiceName `
			-Slot $AzureDeploymentSlot `
			-SubscriptionId $AzureSubscriptionId `
			-Certificate $certificate `
			-ErrorAction SilentlyContinue
		$roleInstancesThatAreNotReady = $roleInstances | Where-Object { $_.InstanceStatus -ne "Ready" }

		if ($roleInstances -ne $null -and
			$roleInstancesThatAreNotReady -eq $null)
		{
			Write-Verbose "All role instances are now ready"
			break
		}

		if ($triesElapsed -ge $maximumRetries)
		{
			throw "Checking if all roles instances are ready for more than one hour, giving up..."
		}
	}
	While($triesElapsed -le $maximumRetries)
}

function DownloadNeo4jBinaryZipFileAndUploadToBlobStorageIfNotExists{param ([string]$blobContainerName, [string]$blobName, [string]$HttpSourceFile)
	Write-Verbose "Finding blob $blobContainerName\$blobName"
	$blobs = Get-Blob -BlobContainerName $blobContainerName -ListAll -AccountName $AzureAccountName -AccountKey $AzureAccountKey
	$blob = $blobs | findstr $blobName

	if ($blob -eq $null)
	{
	    Write-Verbose "Neo4j binary does not exist in blob storage. "
	    Write-Verbose "Downloading file $HttpSourceFile..."
		$temporaryneo4jFile = [System.IO.Path]::GetTempFileName()
		$WebClient = New-Object -TypeName System.Net.WebClient
		$WebClient.DownloadFile($HttpSourceFile, $temporaryneo4jFile)
		UploadBlobIfNotExists $blobContainerName $blobName $temporaryneo4jFile
	}
}

Write-Verbose "Retrieving management certificate"
$certificate = Get-ChildItem -Path "cert:\CurrentUser\My\$AzureCertificateThumbprint" -ErrorAction SilentlyContinue
if ($certificate -eq $null)
{
	throw "Couldn't find the Azure management certificate in the store"
}
if (-not $certificate.HasPrivateKey)
{
	throw "The private key for the Azure management certificate is not available in the certificate store"
}

Write-Verbose "Deleting Deployment"
Remove-Deployment `
	-ServiceName $AzureServiceName `
	-Slot $AzureDeploymentSlot `
	-SubscriptionId $AzureSubscriptionId `
	-Certificate $certificate `
	-ErrorAction SilentlyContinue
Write-Verbose "Sent Delete Deployment Async, will check back later to see if it is deleted"

$deploymentsContainerName = "deployments"
$neo4jContainerName = "neo4j"

AddBlobContainerIfNotExists $deploymentsContainerName
AddBlobContainerIfNotExists $neo4jContainerName

$deploymentBlobName = "$DeploymentName.cspkg"

DownloadNeo4jBinaryZipFileAndUploadToBlobStorageIfNotExists $neo4jContainerName $Neo4jBlobName $Neo4jZippedBinaryFileHttpSource

Write-Verbose "Azure Service Information:"
Write-Verbose "Service Name: $AzureServiceName"
Write-Verbose "Slot: $AzureDeploymentSlot"
Write-Verbose "Package Location: $PackageSource"
Write-Verbose "Config File Location: $ConfigSource"
Write-Verbose "Label: $DeploymentName"
Write-Verbose "DeploymentName: $DeploymentName"
Write-Verbose "SubscriptionId: $AzureSubscriptionId"
Write-Verbose "Certificate: $certificate"

CheckIfDeploymentIsDeleted

Write-Verbose "Starting Deployment"
New-Deployment `
	-ServiceName $AzureServiceName `
	-Slot $AzureDeploymentSlot `
	-PackageLocation $PackageSource `
	-ConfigFileLocation $ConfigSource `
	-Label $DeploymentName `
	-DeploymentName $DeploymentName `
	-SubscriptionId $AzureSubscriptionId `
	-Certificate $certificate

WaitUntilAllRoleInstancesAreReady

Write-Verbose "Completed Deployment"

Automating Cloud Package File without using CSPack and CSRun explicitly

We will need to edit the cloud project file so that Visual Studio can create the cloud package files, as it will then automatically run cspack for you; the output can be consumed by the artifacts and hence by other build projects. This allows us to bake functionality into the MSBuild process to generate the package files without explicitly using cspack.exe and csrun.exe, resulting in fewer scripts; otherwise you would need a separate PowerShell script just to package the cloud project files.

Below are the changes for the .ccproj file of the cloud project. Notice the condition: we generate these package files ONLY if the build runs outside of Visual Studio, which keeps the build from always creating the packages and keeps the development build process short. For the condition below to work, you will need to build the project from the command line using MSBuild.

Here are the config entries for the project file.

  <PropertyGroup>
    <CloudExtensionsDir Condition=" '$(CloudExtensionsDir)' == '' ">$(MSBuildExtensionsPath)\Microsoft\Cloud Service\1.0\Visual Studio 10.0\</CloudExtensionsDir>
  </PropertyGroup>
  <Import Project="$(CloudExtensionsDir)Microsoft.CloudService.targets" />
  <Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" />
  <Target Name="AzureDeploy" AfterTargets="Build" DependsOnTargets="CorePublish" Condition="'$(BuildingInsideVisualStudio)'!='True'">
  </Target>

e.g.

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe MyProject.sln /p:Configuration=Release


Configuration Transformations

You can also leverage configuration transformations so that you can have configurations for each environment. This is discussed here:

http://blog.alexlambert.com/2010/05/using-visual-studio-configuration.html

However, in a nutshell, you can have something like the following in place, which means you can have separate deployment config files, e.g.

ServiceConfiguration.cscfg

ServiceConfiguration.uat.cscfg

ServiceConfiguration.prod.cscfg

Just use the following config in the .ccproj file.

 <Target Name="ValidateServiceFiles"
          Inputs="@(EnvironmentConfiguration);@(EnvironmentConfiguration->'%(BaseConfiguration)')"
          Outputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')">
    <Message Text="ValidateServiceFiles: Transforming %(EnvironmentConfiguration.BaseConfiguration) to %(EnvironmentConfiguration.Identity).tmp via %(EnvironmentConfiguration.Identity)" />
    <TransformXml Source="%(EnvironmentConfiguration.BaseConfiguration)" Transform="%(EnvironmentConfiguration.Identity)"
     Destination="%(EnvironmentConfiguration.Identity).tmp" />

    <Message Text="ValidateServiceFiles: Transformation complete; starting validation" />
    <ValidateServiceFiles ServiceDefinitionFile="@(ServiceDefinition)" ServiceConfigurationFile="%(EnvironmentConfiguration.Identity).tmp" />

    <Message Text="ValidateServiceFiles: Validation complete; renaming temporary file" />
    <Move SourceFiles="%(EnvironmentConfiguration.Identity).tmp" DestinationFiles="%(EnvironmentConfiguration.Identity).transformed.cscfg" />
  </Target>
  <Target Name="MoveTransformedEnvironmentConfigurationXml" AfterTargets="AfterPackageComputeService"
          Inputs="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')"
          Outputs="@(EnvironmentConfiguration->'$(OutDir)Publish\%(filename).cscfg')">
    <Move SourceFiles="@(EnvironmentConfiguration->'%(Identity).transformed.cscfg')" DestinationFiles="@(EnvironmentConfiguration->'$(OutDir)Publish\%(filename).cscfg')" />
  </Target>

Here is a sample ServiceConfiguration.uat.cscfg that leverages the transformations. Note the transforms for the web and worker role sections; our worker role is Neo4jServerHost and the web role is just called Web.

<?xml version="1.0"?>
<sc:ServiceConfiguration
    xmlns:sc="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sc:Role name="Neo4jServerHost" xdt:Locator="Match(name)">
    <sc:ConfigurationSettings>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Storage connection string" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Drive connection string" value="DefaultEndpointsProtocol=http;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Neo4j DBDrive override Path" value=""/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="UniqueIdSynchronizationStoreConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
    </sc:ConfigurationSettings>
  </sc:Role>
  <sc:Role name="Web" xdt:Locator="Match(name)">
    <sc:ConfigurationSettings>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
      <sc:Setting xdt:Transform="Replace" xdt:Locator="Match(name)" name="UniqueIdSynchronizationStoreConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myprojectname;AccountKey=myaccountkey"/>
    </sc:ConfigurationSettings>
  </sc:Role>
</sc:ServiceConfiguration>

Manually executing the script for testing

Prerequisites:

  • You will need to install the Cerebrata Azure Management CMDLETS from: https://www.cerebrata.com/Products/AzureManagementCmdlets/Download.aspx
  • If you are running the 64-bit version, you will need to follow the readme instructions included with the AzureManagementCmdlets, as it requires manual copying of files. If you followed the default install, this readme will be in C:\Program Files\Cerebrata\Azure Management Cmdlets\readme.pdf
  • You will need to install the Certificate and Private Key (Which must be marked as exportable) to your User Certificate Store. This file will have an extension of .pfx. Use the Certificate Management Snap-In, for User Account Store. The certificate should be installed in the personal folder.
  • Once the certificate is installed, you should note the certificate thumbprint, as this is used as one of the parameters when executing the PowerShell script. Ensure you remove all the spaces from the thumbprint when using it in the script!

1) First up, you'll need to make your own "MyProject.Azure.cspkg" file. To do this, run this:

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe MyProject.sln /p:Configuration=Release

(Adjust paths as required.)

You'll now find a package waiting for you at "C:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg".

2) Make sure you have the required management certificate installed on your machine (including the private key).

3) Now you’re ready to run the deployment script.

It requires quite a lot of parameters. The easiest way to find them is just to copy them from the last output log on TeamCity.

The script you will need to manually execute is Deploy-Package.ps1.

The Deploy-Package.ps1 file has input parameters that need to be supplied. Below is the list of parameters and description.

Note: These values can change in the future, so ensure you do not rely on this example below.

AzureAccountName: The Windows Azure Account Name e.g. MyProjectUAT

AzureServiceName: The Windows Azure Service Name e.g. MyProjectUAT

AzureDeploymentSlot: Production or Staging e.g. Production

AzureAccountKey: The Azure Account Key: e.g. youraccountkey==

AzureSubscriptionId: The Azure Subscription Id e.g. yourazuresubscriptionId

AzureCertificateThumbprint: The certificate thumbprint you note down when importing the pfx file e.g. YourCertificateThumbprintWithNoWhiteSpaces

PackageSource: Location of the .cspkg file e.g. C:\Code\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg

ConfigSource: Location of the Azure configuration files .cscfg e.g. C:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\ServiceConfiguration.uat.cscfg

DeploymentName: This can be a friendly name of the deployment e.g. local-uat-deploy-test

Neo4jBlobName: The name of the blob file containing the Neo4j binaries in zip format e.g. neo4j-community-1.4.M04-windows.zip

Neo4jZippedBinaryFileHttpSource: The http location of the Neo4j zipped binary files e.g. https://mydownloads.com/mydownloads/neo4j-community-1.4.M04-windows.zip?dl=1

-Verbose: You can use an additional parameter to get Verbose output which is useful when developing and testing the script, just append -Verbose to the end of the command.

Below is an example executed on my machine, this will be different on your machine, so use it as a guideline only:

.\Deploy-Package.ps1 -AzureAccountName MyProjectUAT `
-AzureServiceName MyProjectUAT `
-AzureDeploymentSlot Production `
-AzureAccountKey youraccountkey== `
-AzureSubscriptionId yoursubscriptionid `
-AzureCertificateThumbprint yourcertificatethumbprint `
-PackageSource "c:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\MyProject.Azure.cspkg" `
-ConfigSource "c:\Code\MyProject\MyProject\MyProject.Azure\bin\Release\Publish\ServiceConfiguration.uat.cscfg" `
-DeploymentName local-uat-deploy-test -Neo4jBlobName neo4j-community-1.4.1-windows.zip `
-Neo4jZippedBinaryFileHttpSource https://mydownloads.com/mydownloads/neo4j-community-1.4.1-windows.zip?dl=1 -Verbose

Note: when running the PowerShell command and the 64-bit version of the scripts, ensure you are running the PowerShell version that you fixed per the readme file from Cerebrata; do not rely on the default shortcut links in the start menu!

Summary

Well, I hope this helps you automate Azure deployments to the cloud. It is a great way to keep UAT happy, with agile deployments to meet the goals of every sprint.

If you do not like the way we generated the package files above, you can choose to use csrun and cspack explicitly; I have prepared this script already, and below is the code for you to use.

#requires -version 2.0
param (
	[parameter(Mandatory=$false)] [string]$ArtifactDownloadLocation
)

$ErrorActionPreference = "Stop"

$installPath= Join-Path $ArtifactDownloadLocation "..\AzurePackage"
$azureToolsPackageSDKPath="c:\Program Files\Windows Azure SDK\v1.4\bin\cspack.exe"
$azureToolsDeploySDKPath="c:\Program Files\Windows Azure SDK\v1.4\bin\csrun.exe"

$csDefinitionFile="..\..\Neo4j.Azure.Server\ServiceDefinition.csdef"
$csConfigurationFile="..\..\Neo4j.Azure.Server\ServiceConfiguration.cscfg"

$webRolePropertiesFile = ".\WebRoleProperties.txt"
$workerRolePropertiesFile=".\WorkerRoleProperties.txt"

$csOutputPackage="$installPath\Neo4j.Azure.Server.csx"
$serviceConfigurationFile = "$installPath\ServiceConfiguration.cscfg"

$webRoleName="Web"
$webRoleBinaryFolder="..\..\Web"

$workerRoleName="Neo4jServerHost"
$workerRoleBinaryFolder="..\..\Neo4jServerHost\bin\Debug"
$workerRoleEntryPointDLL="Neo4j.Azure.Server.dll"

function StartAzure{
	"Starting Azure development fabric"
	& $azureToolsDeploySDKPath /devFabric:start
	& $azureToolsDeploySDKPath /devStore:start
}

function StopAzure{
	"Shutting down development fabric"
	& $azureToolsDeploySDKPath /devFabric:shutdown
	& $azureToolsDeploySDKPath /devStore:shutdown
}

#Example: cspack Neo4j.Azure.Server\ServiceDefinition.csdef /out:.\Neo4j.Azure.Server.csx /role:$webRoleName;$webRoleName /sites:$webRoleName;$webRoleName;.\$webRoleName /role:Neo4jServerHost;Neo4jServerHost\bin\Debug;Neo4j.Azure.Server.dll /copyOnly /rolePropertiesFile:$webRoleName;WebRoleProperties.txt /rolePropertiesFile:$workerRoleName;WorkerRoleProperties.txt
function PackageAzure()
{
	"Packaging the azure Web and Worker role."
	& $azureToolsPackageSDKPath $csDefinitionFile /out:$csOutputPackage /role:$webRoleName";"$webRoleBinaryFolder /sites:$webRoleName";"$webRoleName";"$webRoleBinaryFolder /role:$workerRoleName";"$workerRoleBinaryFolder";"$workerRoleEntryPointDLL /copyOnly /rolePropertiesFile:$webRoleName";"$webRolePropertiesFile /rolePropertiesFile:$workerRoleName";"$workerRolePropertiesFile
	if (-not $?)
	{
		throw "The packaging process returned an error code."
	}
}

function CopyServiceConfigurationFile()
{
	"Copying service configuration file."
	copy $csConfigurationFile $serviceConfigurationFile
}

#Example: csrun /run:.\Neo4j.Azure.Server.csx;.\Neo4j.Azure.Server\ServiceConfiguration.cscfg /launchbrowser
function DeployAzure{param ([string] $azureCsxPath, [string] $azureConfigPath)
	"Deploying the package"
	& $azureToolsDeploySDKPath "/run:$azureCsxPath;$azureConfigPath"
	if (-not $?)
	{
		throw "The deployment process returned an error code."
	}
}

Write-Host "Beginning deploy and configuration at" (Get-Date)

PackageAzure
StopAzure
StartAzure
CopyServiceConfigurationFile
DeployAzure $csOutputPackage $serviceConfigurationFile

# Give it 60s to boot up neo4j
[System.Threading.Thread]::Sleep(60000)

# Hit the homepage to make sure it's warmed up
(New-Object System.Net.WebClient).DownloadString("http://localhost:8080") | Out-Null

Write-Host "Completed deploy and configuration at" (Get-Date)

Note: if using .NET 4.0, which I am sure you all are, you will need to provide the text files for the web role and worker role with these entries.

WorkerRoleProperties.txt
TargetFrameWorkVersion=v4.0
EntryPoint=Neo4j.Azure.Server.dll

WebRoleProperties.txt
TargetFrameWorkVersion=v4.0

Thanks to Tatham Oddie for contributing and coming up with such great ideas for our builds.
Cheers

Romiko

Neo4j and gremlin plugin install guide

29/08/2011: OBSOLETE – Now baked into the Core of Neo4j.

Hi,

I was having some difficulties getting the Gremlin query plugin working correctly with the Neo4j server, which we will host on a Windows Azure VM.

Below are the steps to get this working nicely.

Firstly you will of course need to have Neo4j running. Then all we need to do is install the following:

Java JDK – here is my version:
java version "1.6.0_26"
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

JDK is needed to compile the plugin.

Maven 2.2.1 (there are compilation errors with snapshot compiles under 3.0.3 at the time of writing)
http://www.apache.org/dyn/closer.cgi/maven/binaries/apache-maven-2.2.1-bin.zip

Also, we need mvn, as it is used to compile the Gremlin plugin:

Neo4j gremlin plugin
https://github.com/neo4j/gremlin-plugin

I like to set up environment variables for the Neo4j server folder, JAVA_HOME and the Maven location.


Once done, we can compile the plugin, copy it into the Neo4j plugins folder and restart the server:

mvn clean package
copy target\neo4j-gremlin-plugin-0.1-SNAPSHOT.jar $NEO4J_HOME\plugins
cd $NEO4J_HOME\bin
neo4j.bat restart

Compiled version of the plugin.


Here we can see the plugin in the folder.


Now, to ensure Neo4j has the plugin, we can execute a curl command to check the extension is installed:

C:\Users\Romiko>curl localhost:7474/db/data/
{
  "relationship_index" : "http://localhost:7474/db/data/index/relationship",
  "node" : "http://localhost:7474/db/data/node",
  "relationship_types" : "http://localhost:7474/db/data/relationship/types",
  "extensions_info" : "http://localhost:7474/db/data/ext",
  "node_index" : "http://localhost:7474/db/data/index/node",
  "reference_node" : "http://localhost:7474/db/data/node/0",
  "extensions" : {
    "GremlinPlugin" : {
      "execute_script" : "http://localhost:7474/db/data/ext/GremlinPlugin/graphd
b/execute_script"
    }
  }
}

As we can see above, the REST result from the server includes the GremlinPlugin extension. In fact, we can now do an HTTP POST Gremlin query to get nodes from the object graph in the database.

e.g.

I want to see if I have a node at the second level that has a relationship of type RELATED_TO in the Out direction.

g.v(1).outE('RELATED_TO')

Now we need to URL encode this.

+g.v(1).outE(%27RELATED_TO%27)

curl -d "script=+g.v(1).outE(%27RELATED_TO%27)" http://localhost:7474/db/data/ext/GremlinPlugin/graphdb/execute_script

As we can see, the output is:

C:\Users\Romiko>curl -d "script=+g.v(1).outE(%27RELATED_TO%27)" http://localhost:7474/db/data/ext/GremlinPlugin/graphdb/execute_script
[ {
  "start" : "http://localhost:7474/db/data/node/1",
  "data" : {
  },
  "self" : "http://localhost:7474/db/data/relationship/23",
  "property" : "http://localhost:7474/db/data/relationship/23/properties/{key}",
  "properties" : "http://localhost:7474/db/data/relationship/23/properties",
  "type" : "RELATED_TO",
  "extensions" : {
  },
  "end" : "http://localhost:7474/db/data/node/2"
} ]

We have relationship 23 connecting node 1 to node 2 (remember the graph starts at node 0, then node 1).

The above result can be confirmed in the Gremlin console, baked into Neo4j as of June 2011.

Note: the Gremlin extension that is now part of the Neo4j server lib folder is not for REST API queries; you still need this plugin!

  • gremlin> g.v(1).outE('RELATED_TO')
  • ==> e[23][1-RELATED_TO->2]
  • gremlin>

Here is a screenshot of the gremlin console now baked into Neo4j.

[Image: the Gremlin console in Neo4j]

Hope this gets you started with Neo4j and the Gremlin plugin query language.

I might look at building a custom IQueryable expression translation, so we can then use LINQ to query a Gremlin based API. Might be fun to do, but first I need to learn more about Gremlin and Neo4j.

There is a fluent API for gremlin queries you can leverage as a .Net client:

NuGetPackage:
Source Code at:
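
In the meantime, you can also call the REST endpoint directly from .NET; a minimal sketch using WebClient, equivalent to the curl command above:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class GremlinRestQuery
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // UploadValues form-encodes the script parameter for us,
            // just like the curl -d example above.
            var form = new NameValueCollection
            {
                { "script", "g.v(1).outE('RELATED_TO')" }
            };
            byte[] response = client.UploadValues(
                "http://localhost:7474/db/data/ext/GremlinPlugin/graphdb/execute_script", form);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}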

Cheers

Azure Blob Storage Helper Class and Shared Access Policy

Hi,

I have created a simple helper class that can be used to upload blobs to a public or private container. It also allows you to grant users temporary access at the blob level for 2 days, which is nice when you want to provide a download link that will expire.

Below is the class

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

namespace Common.Azure
{
    public interface IStorage
    {

        string ContainerThumbnails { get; }
        string ContainerPhotos { get; }

        CloudStorageAccount StorageAccountInfo { get; set; }
        CloudBlobClient BlobClient { get; set; }

        CloudBlob UploadBlob(string blobUri, Stream stream, string containerName, bool isPublic);
        string GetSharedAccessSignatureToDownloadBlob(string blobUri, string containerName, string userName);
    }
}

 

using System;
using System.Configuration;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

namespace Common.Azure
{
    public class Storage : IStorage
    {
        public string ContainerThumbnails
        {
            get { return "photothumbnails"; }
        }
        public string ContainerPhotos
        {
            get { return "photos"; }
        }


        public CloudStorageAccount StorageAccountInfo { get; set; }
        public CloudBlobClient BlobClient { get; set; }

        public Storage()
        {

            CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
            {
                if (RoleEnvironment.IsAvailable)
                    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
                else
                    configSetter(ConfigurationManager.AppSettings[configName]);
            });

            StorageAccountInfo = CloudStorageAccount.FromConfigurationSetting("StorageConnectionString");
            BlobClient = StorageAccountInfo.CreateCloudBlobClient();
        }



        public CloudBlob UploadBlob(string blobUri, Stream stream, string containerName, bool isPublic)
        {
            var container = BlobClient.GetContainerReference(containerName);
            container.CreateIfNotExist();

            if (isPublic)
            {
                var permissions = new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Container
                };
                container.SetPermissions(permissions);
            }
            else
            {
                var permissions = new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Off
                };
                container.SetPermissions(permissions);
            }


            var blob = container.GetBlockBlobReference(blobUri);
            blob.UploadFromStream(stream);
            return blob;
        }

        public string GetSharedAccessSignatureToDownloadBlob(string blobUri, string containerName, string userName)
        {
            var container = BlobClient.GetContainerReference(containerName);
            container.CreateIfNotExist();
            var blob = container.GetBlockBlobReference(blobUri);

            var containeraccess = new SharedAccessPolicy();
            containeraccess.Permissions = SharedAccessPermissions.Read;

            var blobaccess = new SharedAccessPolicy
                                 {
                                     SharedAccessExpiryTime = DateTime.UtcNow.AddDays(2)
                                 };

            var perm = new BlobContainerPermissions
                           {
                               PublicAccess = BlobContainerPublicAccessType.Off
                           };
            perm.SharedAccessPolicies.Clear();
            perm.SharedAccessPolicies.Add(userName, containeraccess);

            container.SetPermissions(perm, new BlobRequestOptions());

            return blob.GetSharedAccessSignature(blobaccess, userName);
        }
    }
}
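
A quick sketch of how the helper might be used; the file path, blob name and user name here are hypothetical:

IStorage storage = new Storage();

// Upload a photo into the private photos container (example names)
using (FileStream stream = File.OpenRead(@"C:\temp\sunset.jpg"))
{
    storage.UploadBlob("sunset.jpg", stream, storage.ContainerPhotos, false);
}

// Grant the user a read link to that blob, expiring in 2 days
string sas = storage.GetSharedAccessSignatureToDownloadBlob("sunset.jpg", storage.ContainerPhotos, "romiko");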

Now, in an MVC 3 controller, I can call the helper class when an order is submitted and removed from the shopping cart:

orderDetails.DownloadLink = storage.GetSharedAccessSignatureToDownloadBlob(photo.Photo_Url,
                                                                          "photos",
                                                                          cartItem.Username);

 

That's all there is to it. Then, for the download link, you just render the shared access URL:

        // GET: /Download/
        [HttpGet]
        public ActionResult Download(int photoId)
        {
            var userName = HttpContext.User.Identity.Name;
            // FirstOrDefault, so the null check below can handle a missing order
            OrderDetail order = _storeDb.OrderDetails.Include("Photo").Include("Order").FirstOrDefault(x => x.PhotoId == photoId && x.Order.Username == userName);

            if (order == null)
                return View("InvalidPhoto");

            order.DownloadCount += 1;
            _storeDb.SaveChanges();

            string path = order.Photo.Photo_Url + order.DownloadLink;
            return base.Redirect(path);

        }

e.g.

http://myblobs.com/containername/blobname.jpg?se=2011-02-22T01%3A07%3A20Z&sr=b&si=romiko&sig=PsfUXcJtWRoWBvIiz%2FvHoUJnYF2D70%2B3CdlBbn9SiOM%3D