No, I have not forgotten to post the blog about web services part 2 and MSBuild part 2; I just need to find some time to do it. I hate to rush it, you know…
After toiling with inline pipelines to achieve low-latency objectives, I thought it might be a good idea to share what I have learnt about them.
I have two Send Pipelines.
One of them is used to prepare a dynamic Windows SharePoint Services send port.
The other is a very complicated pipeline that transforms flat files to a very specific XML schema. It has its own rule engine and built-in auditing components, as well as a host of other class libraries that it uses for flat file conversion to XML. You might ask me: why the Fu… did you write your own component when there is a BizTalk Mapper? You want me to be honest? I think the BizTalk Mapper is a load of S%$^ for enterprise applications. It is cool for flight-itinerary examples and very small transformations. Secondly, it is slow. Thirdly, it consumes huge amounts of memory.
So I decided to develop a flat file mapping tool that is called from within a pipeline. This allows me to use the streaming model within the pipeline and process one record at a time from a flat file, which I think is much more flexible. Secondly, I can use a custom rule engine to apply manipulation, and lastly, I can use XSLT 2.0! You heard me right: BizTalk does not support XSLT 2.0, so there go a lot of mapping features.

How it works, in a nutshell: I have a database that is used to dynamically detect file feeds and apply the appropriate transformation. The XSLT 2.0 templates are stored in a configuration database that the FileManagement web service interfaces with, and the pipeline component library uses handlers to these libraries for managing all the metadata. It is extremely fast. Also, the actual mapping of the data is stored in a serializable class, and a UI is used to de-serialize and serialize this mapping data for on-the-fly changes. This means that if the flat file schema changes or the mapping needs to change, I DO NOT need to REDEPLOY half of my bloody BizTalk assemblies. You know the score: the schemas need to be redeployed, then the mappings, and so the list goes on.
This is not what this blog is about, but I thought it was worth mentioning, since the orchestration I show you is a FileManagement orchestration and is responsible for this part. The orchestration will process a flat file, send it to SharePoint for archiving, then transform it to XML format and send it to a central database for workflow processing.
Inline Send Pipeline
Here is an overview of the orchestration. I kept it simple, so big warning! This orchestration is not yet optimised to reduce persistence points. However, using inline pipelines can improve latency if done correctly, so if you use this orchestration, please optimise it. For example, I have no Suspend shapes, and I should have them; I don't like to lose my data!
SharePoint Send Pipeline
The task of this pipeline is to promote some properties that the dynamic send port will use, e.g.
pInMsg.Context.Write("ConfigPropertiesXml", "http://schemas.microsoft.com/BizTalk/2006/WindowsSharePointServices-properties", configxml);
You can read my previous blog about this property at:
Another task it performs is generating custom metadata by interfacing with a Business Object Layer to detect all the metadata.
So in a nutshell this pipeline is more of a property manager.
Here is the Execute method of the WSSFilesEncoder component.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
//initialize the FileMetaData
So above, a flat file goes in and a flat file comes out with some new properties!
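The full method body is in the screenshot, but here is a minimal sketch of what a property-writing Execute can look like. IBaseMessage, IPipelineContext and the WSS context property are the real BizTalk interfaces and property; the FileMetaData helper and its methods are my assumptions based on the description above.

```csharp
// Sketch only: FileMetaData and its methods are assumed, not the author's code.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    // Initialize the FileMetaData from the Business Object Layer (assumed helper).
    FileMetaData meta = FileMetaData.Load(pInMsg);
    string configxml = meta.ToConfigPropertiesXml();

    // Write the WSS adapter property that the dynamic send port reads.
    pInMsg.Context.Write(
        "ConfigPropertiesXml",
        "http://schemas.microsoft.com/BizTalk/2006/WindowsSharePointServices-properties",
        configxml);

    // The message body is untouched: a flat file goes in, a flat file comes out.
    return pInMsg;
}
```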
For those SharePoint gurus: I developed a custom class that can be used to manage properties being sent to a document library.
So if you look at my document library, you can see that when I send a flat file the metadata is populated; this is all done with the class I show below.
It is a really cool class. If you are using a dynamic send port to SharePoint, you can use this class to generate the config properties in the message context at runtime; just customise it for yourself! Compare it to the document library columns I have above. Then, in a pipeline, you can instantiate this class and fill in the properties from an XML file or a database!
// Each document library column is backed by a value field (propertySourceNField)
// and a field holding the column name:
private string propertySource1Field = "";
private string FileFeedSourceField = "FileFeedSource";
private string propertySource2Field = "";
private string OriginalFileNameField = "OriginalFileName";
private string propertySource3Field = "";
private string BizTalkMessageIDField = "BizTalkMessageID";
private string propertySource4Field = "";
private string RegionField = "Region";
private string propertySource5Field = "";
private string CountryField = "Country";
private string propertySource6Field = "";
private string BrandField = "Brand";
private string propertySource7Field = "";
private string NameField = "Name";
private string propertySource8Field = "";

// Filling in a property at runtime, for example:
cp.SPSIDValue = SharePointID.ToString();
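To show how the class is meant to be used from a pipeline, here is a hedged sketch. The class name ConfigProperties, its *Value properties and the ToConfigPropertiesXml helper are my assumptions based on the fields above; only the context property name, namespace and the SPSIDValue line are from the post.

```csharp
// Sketch only: ConfigProperties and most member names are assumptions.
ConfigProperties cp = new ConfigProperties();
cp.FileFeedSourceValue   = "DailySalesFeed";       // hypothetical feed name
cp.OriginalFileNameValue = "sales_20080101.txt";   // hypothetical file name
cp.BizTalkMessageIDValue = pInMsg.MessageID.ToString();
cp.SPSIDValue            = SharePointID.ToString(); // as in the snippet above

// Serialize the class to the XML the WSS adapter expects and write it to the
// message context so the dynamic send port can pick it up.
string configxml = cp.ToConfigPropertiesXml();      // assumed helper
pInMsg.Context.Write(
    "ConfigPropertiesXml",
    "http://schemas.microsoft.com/BizTalk/2006/WindowsSharePointServices-properties",
    configxml);
```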
File Translator Pipeline
This pipeline actually modifies the message! A flat file goes in and XML comes out. I apologise for the name UFT; it means Universal File Translator, and I did not make up this name! It is rather funny.
Here is the Execute method of the FileTranslator pipeline.
public IBaseMessage Execute(IPipelineContext pc, IBaseMessage pInMsg)
UFTMappingAgent vFileConfig = new UFTMappingAgent(spp.FileFeedSourceValue, spp.SPSIDValue);
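Only the signature and the UFTMappingAgent lookup survive the screenshot, so here is a sketch of how such a streaming translator Execute might be shaped. The UFTMappingAgent constructor is from the snippet above; CreateTranslatingStream and the spp property wrapper are my assumptions, while ResourceTracker is the real pipeline-context facility for deferred disposal.

```csharp
// Sketch only: spp and CreateTranslatingStream are assumed names.
public IBaseMessage Execute(IPipelineContext pc, IBaseMessage pInMsg)
{
    // Look up the feed configuration (XSLT 2.0 template, rules, audit settings).
    UFTMappingAgent vFileConfig =
        new UFTMappingAgent(spp.FileFeedSourceValue, spp.SPSIDValue);

    // Wrap the inbound flat-file stream so records are translated one at a
    // time as the stream is pulled, keeping memory usage flat.
    Stream translated = vFileConfig.CreateTranslatingStream(
        pInMsg.BodyPart.GetOriginalDataStream());

    pInMsg.BodyPart.Data = translated;
    pc.ResourceTracker.AddResource(translated); // let BizTalk dispose it later
    return pInMsg;
}
```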
Ok, now the exciting bit for those of you who already have pipelines and just want to know how to execute them. Here is my orchestration.
I highlighted the shapes that are used for preparing Inline Pipelines.
In this shape I take the input message that came into the orchestration and add it to a pipeline message collection of type Microsoft.XLANGs.Pipeline.SendPipelineInputMessages.
Basically, what you need to realise is that when you call a send pipeline, it needs to know three things:
1. The type of pipeline to call
2. The SendPipelineInputMessages collection holding the input message(s)
3. The output variable to which the pipeline assigns its output message
So before we execute a pipeline, the first thing we do is prepare the SendPipelineInputMessages. In my case I am not batching; I just send one message to the collection. Remember, messages are immutable in a BizTalk orchestration, so when a pipeline spits out a message you MUST assign it to a NEW message. Ok, here are the variable properties for my collection.
Here is the code for the expression shape above:
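The screenshot is not reproduced here, but the expression shape only needs a single line. The variable names are mine; the Add method on SendPipelineInputMessages is the real API.

```csharp
// Add the orchestration's input message to the pipeline input collection.
// FlatFilePipelineInputMessages is a variable of type
// Microsoft.XLANGs.Pipeline.SendPipelineInputMessages (name assumed).
FlatFilePipelineInputMessages.Add(InputFlatFile);
```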
The next shape is specific to my business process and does not concern us much, but I put it here anyway: it is a detection mechanism that detects a file feed and grabs its configuration from a database.
I then have an If statement to check whether the file was successfully detected. If it was, we can execute the pipeline. Since we already added the input message to the collection, we're ready to go. But before we do so, I have some good news!! Whenever you add messages to a collection of type Microsoft.XLANGs.Pipeline.SendPipelineInputMessages, the CONTEXT properties associated with the message are preserved!
Realise that the call to the pipeline is ALWAYS inside a Construct Message shape, since we create a new message; in this case the new message will be PromotedFlatFile.
Here is the code for executing the pipeline. Notice the variables passed to the method!
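The call itself (the screenshot again) uses the XLANGPipelineManager class from Microsoft.XLANGs.Pipeline; the pipeline type and variable names below are my assumptions, the method and its signature are real.

```csharp
// Execute the send pipeline in-process; the output is assigned to the
// message being constructed (PromotedFlatFile).
// MyCompany.Pipelines.WSSFilesSendPipeline is an assumed pipeline type.
Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteSendPipeline(
    typeof(MyCompany.Pipelines.WSSFilesSendPipeline),
    FlatFilePipelineInputMessages,
    PromotedFlatFile);
```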
So, there you have it. I then take the PromotedFlatFile and send it to SharePoint.
I also have another pipeline further down; in that section, I do a lot of the work in one expression within the Message Construct. Let's check it out!
Above, I use a new collection for my message. (We could have reused the existing one, BUT NEVER set it to null; you need the constructor!) So the statement below will never work.
FlatFileUFTPipelineInputMessage = null;
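If you do want to reuse the variable, call the constructor instead of assigning null (variable name as above):

```csharp
// Wrong: leaves the variable null, so the next Add() throws.
// FlatFileUFTPipelineInputMessage = null;

// Right: re-instantiate the collection before reusing it.
FlatFileUFTPipelineInputMessage =
    new Microsoft.XLANGs.Pipeline.SendPipelineInputMessages();
```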
Here is the text representation of the call:
System.Diagnostics.Trace.WriteLine("Completed to send flat file to SharePoint");
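Only the trace line survived the formatting; the whole expression inside the Message Construct probably reads along these lines. The pipeline type and variable names are my assumptions; the XLANGPipelineManager call is the real API.

```csharp
System.Diagnostics.Trace.WriteLine("Completed to send flat file to SharePoint");

// Prepare the input collection for the second (UFT) pipeline.
FlatFileUFTPipelineInputMessage =
    new Microsoft.XLANGs.Pipeline.SendPipelineInputMessages();
FlatFileUFTPipelineInputMessage.Add(PromotedFlatFile);

// Execute the translator pipeline; SendUFTMessage is the constructed
// XML message. The pipeline type name is assumed.
Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteSendPipeline(
    typeof(MyCompany.Pipelines.UFTSendPipeline),
    FlatFileUFTPipelineInputMessage,
    SendUFTMessage);
```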
Now I just send the new SendUFTMessage, which is an XML file and not a flat file!
What I do here is call another orchestration to import the XML data.
Executing inline pipelines is pretty straightforward; however, you need to make sure you're working with the latest version of a message. In my case, the UFT pipeline was working with a message constructed by the previous pipeline!
Also, the above orchestration is not optimised, so choosing transaction scopes wisely and using Suspend shapes is a good idea, which I will do when I go back to work on Monday morning, sigh………