Thoughts on the Microsoft Integration Roadmap

Microsoft has finally released an integration roadmap. For the past several years there has been concern over the future of BizTalk and Microsoft’s integration strategy. For many BizTalk developers, the perception was that BizTalk was no longer a strategic product and that Microsoft was just putting in the minimum effort necessary to keep it running on the current version of Windows. At the same time, there was confusion over how BizTalk Server, Azure BizTalk Services, and now Logic Apps all fit together in Microsoft’s future integration plan. This roadmap finally gives us a clear view of where Microsoft is going with their integration products.

Continuing Support for BizTalk Server

From the roadmap, it looks like rumors of the death of BizTalk Server have been greatly exaggerated. In Q4 of 2016 Microsoft will be releasing a new version of BizTalk Server. Along with the usual platform alignment and bug fixes, Microsoft will be adding key new features to the product. The most important feature, in my opinion, is the coming support for SQL Server AlwaysOn Availability Groups. This will also enable BizTalk high-availability configurations running in Azure IaaS. This is a really big deal for anybody who has struggled to set up a high-availability solution for BizTalk. Currently, log shipping is the only supported disaster recovery scenario for BizTalk Server, and even that is not supported if you try to run BizTalk on Azure. It is not clear from the roadmap, but I am hoping this means BizTalk will be dropping its dependency on the Distributed Transaction Coordinator.

The other new BizTalk Server feature I am eager to learn more about is the Azure API connector support. I am very interested to see what this looks like and how it may tie into Azure Logic Apps. I am curious to see whether the API connectors will be included in some form of on-premises solution that runs inside BizTalk or whether this will be a special adapter that interfaces with Logic Apps. The roadmap does not provide any details on this yet, so we will have to wait for Microsoft to release more information later this year.

A Vision for Integration

The roadmap breaks Microsoft’s integration vision into two key pillars. The first pillar is Modern Integration, which encompasses Azure Logic Apps. The second is Enterprise Integration, which involves a mix of BizTalk Server and Azure Logic Apps. Modern Integration will target the “Citizen Integrator”, individuals who are not integration specialists. This will likely be a mix of web and mobile developers and even some power users. The driving force is to make building integration solutions simpler without requiring a developer with special skills. One of the biggest drawbacks to using BizTalk and other middleware platforms is the steep learning curve and the time it takes to become effective with the tools. Microsoft is directly addressing this problem in the Logic Apps platform. Initially, Logic Apps will target SaaS, web-centric, and some hybrid scenarios, but there are plans to make Logic Apps part of Azure Stack, which will make it available for on-premises development too.

The target audience for the Enterprise Integration pillar will be integration specialists. This pillar addresses more traditional integration problems like working with legacy systems and exchanging data between companies using EDI and HL7 standards. These are areas where specialized knowledge is required and it makes sense to have a specialist developer involved in building a solution. These types of solutions generally have more stringent requirements around performance and availability that are best addressed by an integration expert. In the short term, BizTalk Server will be the primary choice for building on-premises integrations between legacy systems and can be used for hybrid integrations as well. Long term, Microsoft will be adding more enterprise connectors to Logic Apps to give it feature parity with Azure BizTalk Services. Eventually these two pillars will converge into a single integration platform in the form of Logic Apps.

Conclusion

This roadmap has provided some much-needed clarity around Microsoft’s integration strategy. While it is going to take some time to get there, we now know where Microsoft is headed with their integration platforms and their eventual convergence into Logic Apps. I applaud Microsoft’s intentions to simplify the development of integration solutions and make them more agile. I am excited to see what they do in this area.

This is what I am taking away from this roadmap:

  • BizTalk Server is going to be around for a while. It will continue to be the first choice for on-premises integration projects. BizTalk is a mature product and it is going to take some time to implement equivalent functionality in Logic Apps. The new HA features in BizTalk Server 2016 will make it easier to sell to enterprise customers who are already using SQL Server AlwaysOn with their other applications.
  • The writing is on the wall for Azure BizTalk Services. While I would not be in a rush to move any existing applications off of BizTalk Services, I would not plan on starting any new projects with it either. Eventually the functionality of BizTalk Services will be made available in Logic Apps.
  • Logic Apps is the go-to tool for building cloud-based integration solutions. It already has a large library of connectors to the most popular SaaS and web applications and has the tools to make it easy to work with web APIs.
  • We are going to continue hearing more about the democratization of integration. Similar pushes are showing up in the BI and application development spaces with Power BI and PowerApps. This probably is not the best news for the developers and consultants who are heavily invested in BizTalk, but I am hoping this will increase the available talent pool for integration developers.

2015 Favorite Books

I cannot believe 2015 is over already. That means it is once again time to review my favorite books for 2015. I read a lot this year, but only a few of those were books. I was focused on learning more about distributed systems theory, and that meant I read a lot of research papers instead. This coming year I will be looking to apply my distributed systems knowledge and to learn about building applications and systems in the cloud. I also plan on putting a serious dent into my reading backlog.

Favorite Books 2015

  • Distributed Systems for Fun and Profit - This is a free ebook and it is a nice primer for learning about distributed systems. My favorite part is that each chapter links to a number of relevant papers on its topic. This was a primary source of the papers I read this year.
  • Building Microservices - This is the book to read if you’re interested in learning how to build microservices. It is full of practical advice covering everything from modeling services to deployment, testing, and monitoring.
  • Lean Enterprise - If you are interested in learning how to apply Lean and Agile principles to large enterprise projects, then this is the book for you. Another book full of practical advice and not just theory.
  • Akka in Action - While this book is still in early access at the time of this post, it makes for a solid introduction to learning the actor model and Akka. It has a few rough edges still but I expect they will be ironed out by the time the final version is published.

Azure Resource Manager Authentication In PowerShell

Introduction

The other day I was working on building a PowerShell script to provision some Azure resources using the Azure Resource Manager module. I had some difficulty authenticating my script with my Azure account. The documentation for the PowerShell module was not very clear on the authentication requirements, and I only figured it out when I started reading through the documentation for the Azure CLI tools.

Resource Manager Authentication

The Resource Manager API only supports authentication with organizational accounts. This, in a nutshell, was the source of my problem, and the documentation is not really clear on this point. On top of that, the API does not provide a clear error. When I attempted to authenticate in my script, I did not receive any errors after entering my credentials. The errors would only appear when I tried to do something with the Resource Manager API after the login step. This would have been easier to resolve if the authentication service had provided a clear error message when I was using my Microsoft Account credentials.

For better or worse, all of my Azure subscriptions are tied to my Microsoft Account, and while I have an organizational account, I really did not want to call Azure support to have them move my subscription. The workaround is to create a new account in the Azure AD default directory associated with the Microsoft Account. This account can then be granted co-administrator permissions, and it counts as an organizational account.

Create the Organizational User

  1. In the Azure portal navigate to your default directory
  2. Click on the Users tab at the top of the screen
  3. Click on the Add User button
  4. Fill out the new user form
    • Set the Type of User to New user in your organization
    • Set Role to User
    • Leave the Enable Multi-Factor Authentication box unchecked
  5. Finish creating the user by clicking the Create button
  6. Add the new user to a subscription as a co-administrator
  7. Login using the new account and change the account password

Use the New User in PowerShell

In PowerShell, try authenticating with the Resource Manager API using the Add-AzureRmAccount cmdlet. When the login form appears, use the credentials for the new organizational user. Now that you have logged in with an organizational account, you should be able to run the other AzureRm cmdlets without receiving cryptic errors about your credentials being expired or not being authorized to perform a particular action.
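
For reference, here is a minimal interactive sketch. The resource group listing is just a sanity check and assumes your subscription contains at least one resource group:

# Opens the interactive login form; sign in with the new organizational user
Add-AzureRmAccount

# Sanity check: this should now succeed instead of failing with a cryptic credential error
Get-AzureRmResourceGroup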

Along with the ability to access the Resource Manager API, the new organizational user credentials can also be used to create PSCredential objects. These objects can be passed into the Add-AzureRmAccount and Add-AzureAccount cmdlets to authenticate non-interactive scripts.
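
Here is a minimal non-interactive sketch. The user principal name and password are hypothetical, and in a real script the password should come from a secure store rather than being embedded in plain text:

# Build a PSCredential for the organizational user (hypothetical UPN and password)
$password = ConvertTo-SecureString 'P@ssw0rd!' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential('scripts@contoso.onmicrosoft.com', $password)

# Authenticate against the Resource Manager and Service Management APIs
Add-AzureRmAccount -Credential $credential
Add-AzureAccount -Credential $credential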

ESB Toolkit: Building a Custom Adapter Provider

Introduction

ESB Toolkit supports a number of adapters out-of-the-box, but the WCF-NetTcp and WCF-WebHttp adapters are noticeably absent from the list in the itinerary designer and from the BRE transport type vocabulary. According to the documentation, it’s possible to build your own adapter providers to add support for additional adapters to ESB Toolkit. Recently I had a client who wanted to use WCF-NetTcp to communicate with web services using ESB Toolkit, so I took some time to see how hard it is to build a custom adapter provider. It turns out to be a fairly simple process, but it is not very well documented.

Build the Adapter Provider

  1. Open Visual Studio and start a new class library project. Be sure to configure the project to sign the assembly with a strong-name key.
  2. Add a reference to the Microsoft.Practices.ESB.Adapter assembly in the ESB Toolkit installation folder.
  3. Inherit from the WCFBaseAdapterProvider base class. There is also a BaseAdapterProvider base class that can be used to implement support for non-WCF adapters, like the HTTP adapter, in BizTalk. Here is my sample code that extends the base class for the WCF-NetTcp adapter:
using Microsoft.Practices.ESB.Adapter;

namespace SD.ESB.Adapters
{
    public class WcfNetTcpAdapterProvider : WCFBaseAdapterProvider
    {
        // The adapter name must match the adapter's name in BizTalk and the
        // name used in the esb.config adapterProvider entry shown below.
        public override string AdapterName
        {
            get { return "WCF-NetTcp"; }
        }
    }
}
  4. Build the project and install the assembly in the GAC, as shown in the sketch below.
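
On a development machine, one way to install the assembly is with the gacutil tool from the Windows SDK. The assembly name below comes from the sample project and will differ in your solution:

gacutil /i SD.ESB.Adapters.dll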

Register the Adapter Provider

Add an entry for the new adapter provider to the esb.config file located in the ESB Toolkit installation folder. Here is a sample entry for the WCF-NetTcp adapter provider:

<adapterProvider name="WCF-NetTcp" type="SD.ESB.Adapters.WcfNetTcpAdapterProvider, SD.ESB.Adapters, Version=1.0.0.0, Culture=neutral, PublicKeyToken=f6c9c5befea65e79" moniker="nettcp" />

The moniker attribute is taken from the protocol section of the URL. For NetTcp this becomes nettcp.

Build the Manifest

The manifest files are located under the Visual Studio installation folder in the \Common7\IDE\Extensions\Microsoft.Practices.Services.Itinerary.DslPackage subfolder. Since we are adding a new WCF adapter, I made a copy of the WCF-CustomProviderManifest.xml file and renamed it to WCF-NetTcpProviderManifest.xml. If your particular adapter requires additional context properties, just add the entry for the appropriate assembly to the manifest file.

Final Steps

Once the above steps have been completed, restart Visual Studio and bounce the BizTalk host instances. At this point you should be able to see the new adapter in the itinerary designer. Please note that the new adapter will not exist in the transport type BRE vocabulary. If you are using the BRE resolver for routing you will need to either update the vocabulary or set the transport type using a static string.

Conclusion

As you can see, building a custom adapter provider is pretty straightforward once you know what needs to be done. I hope this helps anybody else who needs to add support for additional WCF adapters to ESB Toolkit. Since it was so simple to add support for the WCF-NetTcp and WCF-WebHttp adapters, it raises the question as to why Microsoft has not made them a standard part of ESB Toolkit. Hopefully we will see these two adapters included when BizTalk 2016 is released next year.

Simulate Batch Processing in BizTalk Using a Database and a Service Window

On a recent project I had a need to simulate a batch aggregation process using BizTalk. In this scenario, the source system would produce data files, with each file representing one business transaction. The destination system, however, was expecting to receive all of the transactions in a single file. There is more than one solution to this scenario. For example, I could have built a parallel convoy, but in this case that seemed like a lot of effort given the requirements and the volume of data that would be passing through this interface. Instead I was able to “simulate” a batching process using a simple table in SQL Server and a BizTalk service window.

The first step is to create a staging table in a SQL Server database. In my case the destination system was receiving a flat-file with one row representing one business transaction, so I modeled my database table using a similar structure.
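
As a rough sketch, assuming each transaction carries an identifier, an amount, and a timestamp (the column names here are hypothetical and should mirror the fields in your destination flat-file), the staging table might look something like this:

-- Staging table sketch; one row per business transaction
CREATE TABLE dbo.TransactionStaging
(
    StagingId     INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    TransactionId NVARCHAR(50)      NOT NULL,
    Amount        DECIMAL(18, 2)    NOT NULL,
    TransactionOn DATETIME          NOT NULL,
    -- Handy for confirming rows were committed before the service window opens
    InsertedOn    DATETIME          NOT NULL DEFAULT (GETDATE())
);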

The process consists of two message flows. The first process picks up the data files from the source system and inserts the data into the staging table. The second process then reads the rows from the staging table into a single message and then maps them to the flat-file for the destination system. The key item is to configure the SQL receive location with a service window that does not overlap with the source system’s schedule for creating files. The idea is to ensure that all of the source files have been processed and committed to the staging table before the second process queries the database. For example, if the source system is configured to create data files at 5:00pm, you may want to configure the SQL receive location service window to process files between 6:00pm and 6:10pm. If the source system is going to run on a continuous schedule you may want to configure a service window on the first process too. Just make sure that the two service windows do not overlap.

I found this to be an easier way to aggregate messages using BizTalk without introducing the complexity of a parallel convoy. Plus with a convoy there is always the danger that one or more messages may not arrive in a timely manner and cause zombie instances to show up in the suspended message list.

Mapper Property Promotion

Here is a little something that I did not realize was happening when BizTalk executes a transform. If you have a property schema associated with the destination schema in a map, BizTalk will automatically promote those properties when it executes the map. This does not help if you need to promote an adapter context property or another property schema that is not associated with the destination schema. For those you will still need to use another tactic like promoting properties using an initializing correlation set.

Generally I only promote properties for the adapters or for a generic envelope schema, which is why I only recently observed this behavior. If you are working on a project that uses promoted context properties to route messages, this may be an easy way to set those values.

ESB Toolkit: ESB Dispatcher Disassemble Pipeline Component

One of my favorite aspects of BizTalk is that there is always something new to learn. I have seen the ESB Dispatcher Disassemble component and I know it can be used to debatch an XML file and execute an itinerary. What I did not know is that it can be used to route a single message to multiple recipients. All you need to do is add a resolver for each destination endpoint to the routing service in the itinerary. The ESB Dispatcher Disassemble will generate one message for each configured resolver.

The one downside is that using this component can limit your options for selecting an itinerary. Since this is a disassemble stage component and it executes the itinerary, you need to select your itinerary in the decode stage. This means that none of the message context properties will have been promoted yet, so they will not be available to the context resolver or the BRE resolver. If you can live with the limitations, or are willing to build a custom decode stage component to assist with selecting the itinerary, the ESB Dispatcher Disassemble component is a relatively simple way to route a message to multiple destination endpoints using the ESB Toolkit.

2014 Favorite Books

I am a little late with my favorite books list for 2014, but better late than never. Right? Unfortunately I did not get to read as many books as I would have liked this past year. Last year I mentioned I wanted to get into Big Data, but that did not happen as planned. Instead I spent my reading time learning about APIs and distributed systems. As always, none of the links below are affiliate links. I link directly to the publisher for ebooks and to Amazon for paper books as that is where I make my own purchases.

Favorite Books 2014

  • RESTful Web APIs - This is the first book I have read on REST APIs that provided practical advice on building services using hypermedia.
  • Designing Evolvable Web APIs with ASP.NET - Similar to the previous title, this book provides lots of practical advice for building hypermedia APIs using the ASP.NET Web API framework.
  • Understanding Computation - I have not read anything on the theory of computation or language design since I was in school. I really enjoyed this book as it provides a nice introduction to and review of those two topics, and presents them in a way that does not require a computer science degree to understand.
  • Lean from the Trenches - This book was all about practical advice on using Lean principles to manage software projects. I really like the way the author uses retrospectives on a real project as the examples to show what works and what does not.

The list is a little short this year. I hope to get back into my normal reading groove for 2015.

Install ESB Portal Database on a Named Instance

On my current project I had a requirement to install the ESB Toolkit exception portal database on a named instance of SQL Server. This particular client was sharing a single physical server among multiple database applications for their development environment. Some of these applications, like BizTalk, change SQL Server configuration settings away from what is optimal for normal database operation. To isolate these applications, they created a named instance for each application that required special configuration settings.

There are plenty of guides for installing and configuring the ESB exception portal out there already, so I will not replicate those instructions here. If you need a guide, I am a fan of Configure BizTalk ESB Toolkit 2.2 Management Portal. Installing the portal on a named instance is not complicated, but I have not seen it documented; it was not obvious to me until I read through the batch and PowerShell installation scripts.

When you run the Management_Install.cmd batch script, add an argument to the end that specifies the name of the SQL Server instance you would like to use for the portal database. So if you have a server named devsql with an instance named bts, the command would look like this:

.\Management_Install.cmd devsql\bts

The user running the script will need to have permission to create new databases in the named instance. Assuming the prerequisites and security have been configured correctly, the script should install the portal database to the specified SQL Server instance.

If you would rather run the PowerShell script directly, the SQL instance name is the second argument to the script. The first argument to the PowerShell script is the IIS website ID for the ESB web services. Using the same values as before, the command would look like this:

.\Management_Install.ps1 1 devsql\bts

2013 Favorite Books

I enjoyed writing up a post about my favorite books at the end of last year so I thought it would be fun to do it again. Maybe I will turn this into an annual event. This year I managed to read 21 books and 14 research papers. Note these are not affiliate links and I will get absolutely nothing if you click on them.

Favorite Books 2013

  • SOA with REST - This book does a great job of mapping the REST architectural style to SOA patterns. It demonstrates how to build SOA solutions using REST for the implementation.
  • Building Hypermedia APIs with HTML5 and Node - Don’t let the title fool you. This book is full of lots of great information on building APIs with hypermedia. I don’t know (and don’t care) much about Node.js, and I didn’t have any trouble following the code samples.
  • SOA Patterns - This book is a nice collection of SOA patterns and anti-patterns. I like the fact that the patterns are described in a way that isn’t specific to a single technology stack.
  • Practical Project Initiation - While I am a technologist and not a project manager I still found some great tips on how to start a project off on the right foot.
  • Bad Data Handbook - This is a collection of essays on data problems and how to work around them. A couple of the scenarios presented were things I have experienced in my own projects.
  • Dune - This was my fun reading for the year. I read all six of Frank Herbert’s Dune novels. Overall I enjoyed the story though it gets a little preachy towards the end of the series.

For 2014 I hope to get into Big Data and some related subjects like functional programming. I have been taking advantage of the holiday sales to stock up on books to read in the coming year. Until then.