Next generation VPN software – Wangle

I have been working in IT for over 20 years. One of the issues we often have is:

How do we provide encryption while also reducing data usage on mobile phone data bundle contracts?

How do we offer anonymity and improved security without compromising speed?

Finally, a solution will be released within the next few days, possibly before 15 June 2016.
The software is a mobile app called Wangle.

Watch the video showcasing how the software works.

Many mobile phone apps have huge flaws when communicating with their back-end servers. Wangle will provide another layer of security to ensure that traffic cannot be intercepted by hackers.

What is even more exciting is that Wangle will reduce your data consumption, thus saving on mobile phone bills. Exactly by how much remains unknown until the app is used by the masses. However, during beta testing, PDF download speeds were 15-20 times faster.

Sign up for early discounts at Wangle’s website, with a 50% discount on the first year’s annual subscription.

The Chip in Cisco Routers and Hubs
This is the cream on top of the cake. Wangle is developing a chip with a tech company in Israel that will be installed into networking routers, hubs and multiplexers, so that encryption, compression and security can be applied at Layers 3 and 4 of the OSI networking model, which is far superior to running them at the application layer. This means we are looking at the first commercial VPN that provides Layer 3/4 features at the hardware level. This would outclass current SSL accelerators, which focus only on SSL encryption speeds.

Wangle is looking to be the next disruptive technology, one that could be leveraged by huge data consumers like Netflix.

I encourage those in South Africa and Australia to download the app as soon as it is available on the Google Play store and Apple Store. In a future blog post, I will be posting my own performance testing results.

With the advent of cloud computing, we can now combine software- and hardware-level features that leverage Content Delivery Networks (CDNs) such as Amazon S3 and Amazon CloudFront.

Wangle is not magic; it is real. When a user first downloads a video via the Wangle VPN, it gets redistributed to all Wangle endpoints around the world.

When another user requests the same video, instead of serving it off the original location, Wangle will serve it to its VPN user from its own cloud distribution that is MUCH closer to the user.
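Wangle has not published its internal design, so the class and method names below are purely illustrative. This is a toy sketch of the general edge-caching pattern just described: the first request populates the cache, and later requests are served without touching the origin.

```csharp
using System;
using System.Collections.Generic;

// Toy model of the edge-caching pattern described above. EdgeCache and
// Fetch are illustrative names, not Wangle's real API.
public class EdgeCache
{
    private readonly Dictionary<string, byte[]> _cache = new Dictionary<string, byte[]>();

    // How many times we had to go back to the origin server.
    public int OriginFetches { get; private set; }

    public byte[] Fetch(string url, Func<string, byte[]> originFetch)
    {
        if (_cache.TryGetValue(url, out var cached))
            return cached; // fast path: served from the edge, close to the user

        OriginFetches++;               // slow path: origin server
        var content = originFetch(url);
        _cache[url] = content;         // redistribute to the edge
        return content;
    }
}
```

In the real system the cache would be distributed across endpoints and keyed by more than a URL, but the request flow is the same.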

Catch you all in a few weeks’ time, when I will post stats on the Wangle VPN and publish the performance reports. I will also do an in-depth analysis of the Wangle infrastructure via packet-level interception in Wireshark, using test encryption keys. This will let us see which cloud technologies and endpoints are being leveraged to deliver content to the user faster, more securely and compressed, with no compromise.

PACS Server IntelePACS 4-2-1-P394 – Medical Connections – Inaccurate Image Counts


When querying PACS at the Study level, it is possible to get incorrect image counts due to bugs in the IntelePACS software. I think it is caused by studies with mixed modalities.

I have written a .NET library to alleviate this issue, where the image counts can be correctly retrieved at the Series level.

We need to query at the Series level and just pass an empty string for the studyUid (to force it):

private void SetQueryResultsSeries()
{
    ImageCount = 0;
    var seriesCount = Data.Count;
    if (seriesCount <= 0) return;

    for (var i = 0; i < seriesCount; i++)
    {
        var imagesInSeriesCount = Data[i][Keyword.NumberOfSeriesRelatedInstances];
        if (imagesInSeriesCount.ExistsWithValue)
            ImageCount += int.Parse(imagesInSeriesCount.Value.ToString());
    }
}

Then to use my library, we just do this:

var query = new DicomQueryManager("AE_Romiko", "MYMasterPacsServer", "5000", "MyAccessionNumber", "").BuildMasterSeriesLevel();
// Notice the empty string above, which forces Series-level enumeration so I can get the actual series collections.
var imageCount = query.ImageCount;

Binding a Windows Form to the WorkingArea on a multiple-display setup

If you want to ensure a Windows Form cannot be dragged out of the viewable area of a multiple-monitor setup, and also want the option to dock it to the monitor it was active on, then this code might be helpful. It has a tolerance level of 50%, so up to 50% of the form can be out of the viewable area.

You might think you do not need to enumerate the screens, but you do if you want to dock the form, especially when some screens are portrait and others are landscape.

You can optimize the code by storing the leftmost and rightmost screens in a global static location.

private void DockFormIfOutOfViewableArea()
{
    var widthTolerance = Location.X + (Width / 2);
    var heightTolerance = Location.Y + (Height / 2);

    foreach (var screen in Screen.AllScreens.OrderBy(r => r.WorkingArea.X))
    {
        if (!IsOnThisScreen(screen)) continue;

        if (heightTolerance > screen.WorkingArea.Height)
            Location = new Point(screen.WorkingArea.X, screen.Bounds.Height - Height + screen.Bounds.Y);
        if (Location.Y < screen.WorkingArea.Y)
            Location = new Point(screen.WorkingArea.X, screen.WorkingArea.Y);
    }

    if (widthTolerance > SystemInformation.VirtualScreen.Right)
    {
        var closestScreen = Screen.AllScreens.OrderBy(r => r.WorkingArea.X).Last();
        Location = new Point(closestScreen.Bounds.Right - Width, closestScreen.Bounds.Height - Height + closestScreen.Bounds.Y);
    }

    if (widthTolerance < SystemInformation.VirtualScreen.Left)
    {
        var closestScreen = Screen.AllScreens.OrderBy(r => r.WorkingArea.X).First();
        Location = new Point(closestScreen.Bounds.Left, closestScreen.Bounds.Height - Height + closestScreen.Bounds.Y);
    }
}
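The screen-dependent parts above make the method hard to unit test, but the core 50% rule reduces to a pure function. `ClampToWorkArea` is my own hypothetical helper, not part of WinForms, and it simplifies the per-screen logic above to a single rectangle:

```csharp
using System.Drawing;

public static class DockHelper
{
    // Returns a corrected top-left location so that at least half of the
    // form stays inside the given working area (the 50% tolerance rule).
    public static Point ClampToWorkArea(Rectangle form, Rectangle workArea)
    {
        var x = form.X;
        var y = form.Y;

        // More than half the form hangs past the right or left edge: dock it.
        if (form.X + form.Width / 2 > workArea.Right) x = workArea.Right - form.Width;
        if (form.X + form.Width / 2 < workArea.Left) x = workArea.Left;

        // Same rule vertically.
        if (form.Y + form.Height / 2 > workArea.Bottom) y = workArea.Bottom - form.Height;
        if (form.Y < workArea.Top) y = workArea.Top;

        return new Point(x, y);
    }
}
```

Because it takes plain rectangles, you can test the docking maths against portrait and landscape working areas without a real monitor.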


Nancy REST Services – GZIP IT!

When you are dealing with large JSON result sets, say larger than 1 MB or so, in many situations it is worth zipping the data before sending it to your client application.

The first step is to add zipping to Nancy’s pipeline. We check that the content type returned in the response is JSON, and that the client accepts the GZIP encoding.

public static void AddGZip(IPipelines pipelines)
{
    pipelines.AfterRequest += ctx =>
    {
        if (!ctx.Response.ContentType.Contains("application/json")
            || !ctx.Request.Headers.AcceptEncoding.Any(x => x.Contains("gzip"))) return;

        // Buffer the response so we can check its size first.
        var jsonData = new MemoryStream();
        ctx.Response.Contents.Invoke(jsonData);
        jsonData.Position = 0;

        // Small payloads are not worth compressing.
        if (jsonData.Length < 4096)
        {
            ctx.Response.Contents = s => jsonData.CopyTo(s);
            return;
        }

        ctx.Response.Headers["Content-Encoding"] = "gzip";
        ctx.Response.Contents = s =>
        {
            using (var gzip = new GZipStream(s, CompressionMode.Compress, true))
                jsonData.CopyTo(gzip);
        };
    };
}

Perfect. Now, in the CLIENT application calling the REST service, we need to add a header to the request so the server knows the client supports GZIP:
Accept-Encoding: gzip

So, we add this code to the client.

Request sent by client.

protected WebRequest AddHeaders(WebRequest request)
{
    request.Headers.Add("Accept-Encoding", "gzip");
    return request;
}

Response processed by client.

if (((HttpWebResponse)response).ContentEncoding == "gzip"
    && response.ContentType.Contains("application/json"))
{
    var gzip = new GZipStream(response.GetResponseStream(), CompressionMode.Decompress, true);
    var readerUnzipped = new StreamReader(gzip);
    Response = Deserialize(readerUnzipped);
}
else
{
    var reader = new StreamReader(response.GetResponseStream());
    Response = Deserialize(reader);
}

Implement whatever deserializer you want, and then make sure you close the stream and the reader. 😉

Server response without GZIP:

Server response with GZIP compression:
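To sanity-check the savings end to end, it helps to compress and decompress a payload in isolation. The sketch below uses only System.IO.Compression and mirrors what the server and client halves above do; it is a standalone demo, not part of the Nancy pipeline.

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

public static class GzipRoundTrip
{
    // Server side: write the JSON through a GZipStream into a buffer.
    public static byte[] Compress(string json)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
            {
                var bytes = Encoding.UTF8.GetBytes(json);
                gzip.Write(bytes, 0, bytes.Length);
            }
            return output.ToArray();
        }
    }

    // Client side: decompress the buffer and read the JSON back.
    public static string Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var reader = new StreamReader(gzip, Encoding.UTF8))
        {
            return reader.ReadToEnd();
        }
    }
}
```

For repetitive JSON result sets, the compressed payload is typically a small fraction of the original size.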

NServiceBus-ServiceMatrix Saga To Saga Request/Response Pattern


This document explains how to set up a Saga-to-Saga Request/Response pattern. Bus.Reply is used, as ReplyToOriginator is not supported. We will simulate a service receiving an order and sending it to an Order service, which then uses a request/response pattern to process payment.

  • We will create three endpoints: OrderReceiver, OrderSaga and PaymentSaga
  • We will configure the OrderReceiver to send an initial Order to the OrderSaga
  • We will then configure the OrderSaga to send a Request/Response to the PaymentSaga
  • Note that I have set up message correlation as well. This is needed for ReplyToOriginator to work between Sagas from a timeout.

Saga-to-Saga Request/Response supports Bus.Reply; however, do not use it in timeout handlers, as it will try to reply to the timeout queue.

ReplyToOriginator also works when you need to call the originating Saga; however, there were issues reported on the bug tracker. You can get it working by doing two things:

  1. Ensure the calling Saga outlives the called Saga (create a long timeout that marks the calling Saga as complete, and a shorter timeout in the called Saga that calls MarkAsComplete)
  2. Add this code to the ProcessOrderHandlerConfigureHowToFindSaga.cs

You can download the source code at:


Just enable NuGet package restore. 🙂

TimeOuts and responding to the original calling (originator) saga

Never use Bus.Reply within a timeout handler in the Saga, as Bus.Reply always responds to the source of the LAST incoming message, which in that case is the timeout queue.

To get ReplyToOriginator working between Sagas, you need to:

  1. Ensure the calling Saga (ProcessOrder) lives LONGER than the called Saga (Payment), by using timeouts in both sagas
  2. Add a correlation



This is the message pattern with timeouts and a polling pattern, which you can run indefinitely if ever needed.


Create three endpoints

  1. Click New endpoint
  2. Create an OrderReceiver, as an NServiceBus Host. Do the same for OrderSaga and PaymentSaga

  3. Your canvas will look like this


Send a message from OrderReceiver to OrderSaga

So now we will simulate a service (OrderReceiver) that receives orders on a back-end system and sends them to our OrderSaga for long-running transaction processing.

  1. Click the OrderReceiver and click “Send Command”
  2. Set the Service name to Orders (Domain) and the command ProcessOrder

    Your canvas should look like this

  3. Click the undeployed component and select Deploy
  4. Select OrderSaga as the destination and click Done

    Your canvas should look like this, with a bit of interior design 🙂

  5. Edit the ProcessOrder message and add the following properties

  6. Open the ProcessOrderSender.cs file under the Orders folder; we will configure it to send 3 orders by implementing IWantToRunWhenBusStartsAndStops


Note that I am not in the Infrastructure folder, as that is generated code.

  1. Build the solution


Configure the OrderSaga as a Saga and Message Correlation

Great, so now we have the minimum needed to turn the OrderSaga endpoint into a real Saga, as a Saga MUST have a message to handle; in this case, ProcessOrder.

  1. Click the ProcessOrderHandler and click “Convert To Saga”
  2. This will open the ProcessOrderHandlerConfigureHowToFindSaga.cs file. Build the solution, so that partial classes are generated.
  3. We want to correlate order messages based on the orderId to the correct Saga instance. So here we will set the properties on how to find it. Add the following code:
  4. Open the file ProcessOrderHandlerSagaData.cs and add the OrderId, set the property to Unique, as this is how the Saga will correlate messages to the correct instance.

Excellent, so now we have correlation established between the OrderReceiver and the OrderSaga. If the Saga ever receives order updates for the same order, the infrastructure will know which instance to send the ProcessOrder command to.
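The effect of the unique OrderId property is easy to picture: the infrastructure keeps one saga instance per correlation value and routes each message to it. Here is a toy model of that idea (my own illustration, not NServiceBus code):

```csharp
using System.Collections.Generic;

// Toy illustration of saga correlation: one state instance per OrderId.
public class OrderSagaStore
{
    private readonly Dictionary<int, int> _messageCountByOrder = new Dictionary<int, int>();

    public int InstanceCount => _messageCountByOrder.Count;

    // Each incoming message is routed to the saga instance whose
    // correlation property (OrderId) matches; a new instance is
    // started when none exists yet.
    public void Handle(int orderId)
    {
        if (!_messageCountByOrder.ContainsKey(orderId))
            _messageCountByOrder[orderId] = 0;

        _messageCountByOrder[orderId]++;
    }

    public int MessagesFor(int orderId) => _messageCountByOrder[orderId];
}
```

Two updates for order 42 land on the same instance, while order 7 gets its own; that is exactly what the unique OrderId mapping buys you.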



Configure Saga To Saga Command

Here we will configure the OrderSaga to send a message to the PaymentSaga, then we will update the PaymentSaga to become a Saga.

  1. Click the ProcessOrderHandler and click SendCommand
  2. Name the command ProcessOrderPayment

  3. Click Copy to Clipboard. This will then open the ProcessOrderHandler.cs file. Paste the code.
  4. Open the Canvas, it should look like this
  5. Click the ProcessOrderPaymentHandler, and Click Deploy.
  6. Select the Payment Saga, as this will handle the ProcessOrderPayment Request.
  7. Your canvas will look like this. BUILD SOLUTION
  8. Let’s CONVERT PaymentSaga endpoint to a Saga, as we have the minimum needed to do this!
    WARNING: NEVER convert an endpoint to a saga unless it has at least one message handler, else it cannot implement IAmStartedByMessages interface. You would have to wire it up manually, since the Infrastructure code generator will not know how.
  9. Click ProcessOrderPaymentHandler and click Convert to Saga…
  10. This will open the ProcessOrderPaymentHandlerConfigureHowToFindSaga
  11. Build the solution, to auto generate the Saga partial classes and infrastructure
  12. We want the payment instance to correlate to the correct order Id, so add this:



    Build the solution! We added properties so ConfigureHowToFindSaga will compile 🙂


Configure Saga To Saga Response and Bus.Reply

  1. Open the ServiceMatrix canvas and confirm your canvas looks like this

    Notice the icon for sagas has a circle in it with a square.
  2. Click the ProcessOrderPaymentHandler in the payment Saga and click Reply with Message…
  3. Click Ok
  4. Copy the code to Clipboard
  5. Click Copy To Clipboard; note the mouse pointer will show as busy, however you can still click it.
  6. This will open the ProcessOrderPaymentHandler.cs, paste the code here. Put in a Thread.Sleep to simulate a credit card payment.

  7. Your canvas will look like this now
  8. Add the following code to ProcessOrderHandler.cs file
  9. Build the solution


Testing the solution

Follow the steps below in order, so the MSMQ queues are created in the correct order and race conditions are avoided the first time it starts.

  1. Start the SagaToSagaRequestResponse.PaymentSaga
  2. Start the SagaToSagaRequestResponse.OrderSaga
  3. Start the SagaToSagaRequestResponse.OrderReceiver
    You should see


In ServiceInsight we see:

Source Code

You can download the source code at:


Just enable NuGet package restore. 🙂

Kill/Terminate process for current logged on user

Below is code you can use to terminate processes that belong to the currently logged-on user. It uses WMI and will work for all authenticated users, even non-administrators.

You can use excludeMe to exclude a process, e.g. if you are running a program and want to guarantee a single instance on the machine without killing the current program:
KillProcesses(Process.GetCurrentProcess().ProcessName + ".exe", false, Process.GetCurrentProcess());
The above will kill all other processes with the same name, except the calling program.

Download Source Code

e.g. KillProcesses(“chrome.exe”, true, null);

public static void KillProcesses(string processName, bool currentUserOnly, Process excludeMe = null)
{
    var processes = new ManagementObjectSearcher(
        string.Format("SELECT * FROM Win32_Process WHERE Name='{0}'", processName)).Get();
    foreach (var o in processes)
    {
        var process = (ManagementObject)o;
        var processId = int.Parse(process["ProcessId"].ToString());
        if (process["ExecutablePath"] == null) continue;
        if (excludeMe != null && processId == excludeMe.Id) continue;

        var ownerInfo = new object[2];
        process.InvokeMethod("GetOwner", ownerInfo);
        var owner = (string)ownerInfo[0];

        if (currentUserOnly)
        {
            var windowsIdentity = WindowsIdentity.GetCurrent();
            if (windowsIdentity == null) return;
            var currentUser = windowsIdentity.Name;
            if (currentUser.Contains(owner))
                process.InvokeMethod("Terminate", null);
        }
        else
        {
            process.InvokeMethod("Terminate", null);
        }
    }
}

Download Source Code

Unit Testing PowerScribe 360 – RadWhereCOM SDK


Of course, no developer wants to write software that is not unit-testable, else you are going to spend one month building a project and two months debugging it! Why not get your unit tests started up front, write all your use cases as unit tests, and knock them off one by one.

PowerScribe 360 has a lot of new features for Radiologists, and we want to test when events get published or received, as well as fake all the internal dependencies of the COM object. For example, Terminate requires PowerScribe to be open, and when unit testing this is a no-no, as we want NO external dependencies.

So we will create a wrapper. Here is the code, and you can get it from here as well:
Source Code

Note that Powerscribe360 is a class I created to wrap the RadWhereCom into a public property, e.g.

public partial class Powerscribe360
{
    public IMyRadWhereCom radWhereCom;

    public Powerscribe360(IMyRadWhereCom myRadWhereCom)
    {
        radWhereCom = myRadWhereCom;
        WireUpEvents();
    }

    private void WireUpEvents()
    {
        radWhereCom.UserLoggedIn += UserLoggedIn;
        radWhereCom.UserLoggedOut += UserLoggedOut;
        radWhereCom.AudioTranscribed += AudioTranscribed;
        radWhereCom.AccessionNumbersChanged += AccessionNumbersChanged;
        radWhereCom.ReportFinished += ReportFinished;
        radWhereCom.ReportClosed += ReportClosed;
        radWhereCom.ReportOpened += ReportOpened;
        radWhereCom.ReportChanged += ReportChanged;
        radWhereCom.DictationStarted += DictationStarted;
        radWhereCom.DictationStopped += DictationStopped;
        radWhereCom.Terminated += Terminated;
    }

    public void UserLoggedIn(string userName) { /* publish the hub event */ }
    public void UserLoggedOut(string userName) { /* publish the hub event */ }
}

var ps = new Powerscribe360(radWhereCom);
ps.radWhereCom.UserLoggedOut += Raise.Event<RWC_UserLoggedOutHandler>("Foo");

Source Code

Notice below that if we did not mock RadWhereCOM, the call to radWhereCom.Terminate() in our event subscriber would fail.

The benefit is that for any “cause” happening in PS360, we can now fake the “effect”, e.g.

public void ShouldRaiseUserLoggedInEvent()
{
    // arrange
    var wasCalled = false;
    Hub.Subscribe(HubEvents.DictationSystem.LoginCompleted, () =>
    {
        wasCalled = true;
    }, "Foo");
    var radWhereCom = Substitute.For<IMyRadWhereCom>();
    var ps = new Powerscribe360(radWhereCom);

    // act
    ps.radWhereCom.UserLoggedIn += Raise.Event<RWC_UserLoggedInHandler>("Foo");

    // assert
    Assert.IsTrue(wasCalled);
}

public void ShouldRaiseUserLoggedOffEvent()
{
    // arrange
    var wasCalled = false;
    Hub.Subscribe(HubEvents.DictationSystem.PSInterop.LoggedOff, () =>
    {
        wasCalled = true;
    }, "Foo");

    var radWhereCom = Substitute.For<IMyRadWhereCom>();
    var ps = new Powerscribe360(radWhereCom);

    // act
    ps.radWhereCom.UserLoggedOut += Raise.Event<RWC_UserLoggedOutHandler>("Foo");

    // assert
    Assert.IsTrue(wasCalled);
}

Suppose this line is in the constructor of our Powerscribe360 class:

radWhereCom.UserLoggedIn += UserLoggedIn;

Then the test above will fail if that line is removed.
Because RadWhereCom is mocked, the code will never throw an exception when Terminate or any other PowerScribe 360 dependency is called:

Happy Coding!

Source Code

Medical Connections – DICOM and PACS (DICOMObjects)


After spending some time working with medical DICOM images, I have found that it can be rather complex to get image counts off a PACS server, as the method differs depending on whether you run a DICOM query at the STUDY or IMAGE level.

When CT scans are being uploaded into your RIS server, it is very important to know the number of images, especially for large studies with over 1000 images, so that you can ensure all images are received and Radiologists can start reporting immediately.

We leverage NServiceBus to handle all the study orders coming in, and the Saga needs to know when to end; part of that dependency is based on the number of images that arrived during the import phase, akin to an order processor knowing when a purchase order and all its items have been received.

Basically, when you query a MASTER PACS server, the query level must be set to Study, and you get your image counts from a dataset field (0x0020, 0x1208).

When you query a Modality PACS server (preferred), you use an Image-level query and can get the image count directly off the DicomDataSetCollection.
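The wrapper’s core routing decision can be written as a tiny pure function. The names below sketch my wrapper’s internals and are not part of the DicomObjects API:

```csharp
// Sketch of the wrapper's routing rule: Master PACS servers are queried at
// Study level, with the image count read from tag (0020,1208); Modality
// servers are queried at Image level, with the count taken directly from
// the size of the result collection.
public enum PacsServerKind { Master, Modality }

public static class DicomQueryRules
{
    public static string QueryLevelFor(PacsServerKind kind) =>
        kind == PacsServerKind.Master ? "STUDY" : "IMAGE";

    // True when the count comes from (0020,1208) NumberOfStudyRelatedInstances.
    public static bool CountComesFromTag(PacsServerKind kind) =>
        kind == PacsServerKind.Master;
}
```

Keeping this rule in one place is what lets BuildPreferred() and BuildMaster() hide the difference from callers.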

I have created a DICOM query wrapper that you can use, which supports backup servers to query in the event the current server is not available. This can happen if the PACS server is heavily used and is at its 50-connection limit.

Download Source Code

Below is how you would use the wrapper:

var preferredQuery =
    new DicomQueryManager(servers.DefaultPreferred, servers, workstation, port, o.AccessionNumber)
        .BuildPreferred();

var masterQuery =
    new DicomQueryManager(servers.DefaultMaster, servers, workstation, port, o.AccessionNumber)
        .BuildMaster();

if (preferredQuery.Found)
{
    // Do something
    var imageCount = preferredQuery.ImageCount;
}

Notice above that you execute BuildPreferred() or BuildMaster(); this encapsulates how the query will be built. If it is a Modality server, it sets the query level to Image; if it is a Master server, to Study. The logic to find the ImageCount is then done automatically for you.

You can get the source code for this here:

Download Source Code

NServiceBus – Some Benefits


I am not sure why, but in many organisations there is a lot of unnecessary complexity in source code: from programmers using try/catch blocks all over the show, to unnecessary synchronous service calls with multiple threads causing deadlocks on SQL Server.

I would like to give a simple example. Consider this code:

static ProcessOrder()
{
    const int workerCount = 4;
    WorkQueue = new ConcurrentQueue<Order>();
    Workers = new List<BackgroundWorker>();

    for (var index = 0; index < workerCount; index++)
    {
        var worker = new BackgroundWorker();
        worker.DoWork += Worker_DoWork;
        Workers.Add(worker);
    }
}

The above code is extremely dangerous. You have no control over the load it will place on back-end calls, especially when calling stored procedures. Secondly, it is not durable.

The above code can easily be replaced by an NServiceBus Saga or handler. Here a Saga is appropriate, as the message is an Aggregate Root: we have an order to process. Sagas provide an environment to preserve state, and all the threading and correlation is handled for you.

partial void HandleImplementation(ProcessOrder message)
{
    Logger.Info("ProcessOrder received, Saga Instance: {0} for OrderId: {1}", Data.Id, Data.OrderId);
    AlwaysRunForEveryMessage(message); // counts and extracts new order items

    if (Data.OrderCount == 1)
    {
        // First message for this order: kick off the workflow, e.g. request a timeout
    }

    if (Data.InitialDelayPassed) // expired workflow
    {
        // The initial delay has elapsed: finish processing and end the saga
    }
}

From the above, you can see state is preserved for each unique OrderId. This Saga processes multiple order items for an OrderId, which can be submitted at any time during the day. We do not have to worry about multiple threads, and correlation is dealt with automatically; we set the correlation ID to the OrderId, so incoming order items are matched to the correct Saga instance.

We can now get rid of the reliability issues of in-memory worker threads, and the uncontrolled pounding of the database.

By using an MSMQ infrastructure and leveraging NServiceBus, these sorts of issues you find in code can easily be mitigated.

Install NServiceBus as a service with Powershell


I have just completed a script that can be used to install a service bus host as a Windows service. It is handy when delegating installs to the support team; this is always done via NServiceBus.Host.exe.

It can be tricky passing arguments in PowerShell. The trick is to use the Start-Process command and isolate the arguments into their own variable.

Here are the two scripts.

Download the script

One installs the license, and the other installs your endpoints as a Windows service. All you need to do now is type in the password.

It relies on a directory structure where:

Root\ServiceNameA\Debug or Root\ServiceNameA\Release
Root\ServiceNameB\Debug or Root\ServiceNameB\Release

I hope this makes installing all your endpoints easy.

Download the script

Now, remember: if you install subscriber endpoints before publishers, you will get a race condition. So with this script, add the endpoints to the array argument in the correct dependency order: install all the publisher endpoints first, then the subscribers.