Implementing Auto-Save with React

Over the past few months I have been slowly learning about modern web development. The last time I worked on a web project, .NET 2.0 was the new hotness, so I have a lot of catching up to do. After checking out a few of the more popular frameworks, I settled on React for building web-based user interfaces because of its popularity and support, its tight focus on views, and its affinity for functional programming.

A common feature in modern applications is automatic saving. It shows up in desktop applications like Microsoft Office and in nearly every mobile application out there. I wanted to see if I could replicate that behavior inside a React application.

Challenge Accepted

The systems I have seen that implement this feature send a save event after every change. In a web application this would generate a lot of network activity, so I wanted to try implementing the feature using an idle timer. The application starts the timer after any user activity and tracks changes to the form state. The timer resets every time there is user activity, to prevent triggering a save while the user is still actively changing the form data. Once the timer expires, it fires an event where the application performs the save. In the case of my test application, it resets the tracking state and outputs a message stating that it saved successfully. The full application code can be found on GitHub in my react-auto-save repository.
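The idle-timer idea described above can be sketched in plain JavaScript. This is a minimal sketch under my own assumptions, not code from the react-auto-save repository; the AutoSaver name and the injectable scheduler are mine, added so the behavior can be tested without real timers.

```javascript
// Idle-timer auto-save sketch: every user action resets the timer, and the
// save callback only fires after `delay` ms with no further activity.
// `schedule`/`cancel` default to setTimeout/clearTimeout but are injectable
// for testing; in a browser you would just use the defaults.
class AutoSaver {
  constructor(onSave, delay = 2000, schedule = setTimeout, cancel = clearTimeout) {
    this.onSave = onSave;
    this.delay = delay;
    this.schedule = schedule;
    this.cancel = cancel;
    this.timer = null;
    this.dirty = false;
  }

  // Call this from onChange handlers: it marks the form dirty and
  // restarts the idle countdown, cancelling any pending save.
  notifyActivity() {
    this.dirty = true;
    if (this.timer !== null) this.cancel(this.timer);
    this.timer = this.schedule(() => {
      this.timer = null;
      if (this.dirty) {
        this.dirty = false; // reset tracking state before saving
        this.onSave();
      }
    }, this.delay);
  }
}
```

In a React component this would typically be wired into the form's change handlers, with the save callback doing the network call; the actual repository may structure things differently.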

2016 Favorite Books

Another year has come and gone. Time for another favorite books blog post. I may not be doing the best job of blogging, but at least I am consistent with my year-end entry. This past year I came closer to burning out than I would like to admit, so I spent more time reading fun books instead of technology books. I still managed to read some good technical books, so without further ado let us take a look at the list.

Note: None of the links below are affiliate links. I get nothing from linking to these books. I try to link to the author or publisher site when possible, and Amazon as the default.

Favorite Books 2016

  • The Phoenix Project - While not a true technical book, this was still an interesting read. It is a novelization of the problems encountered in a typical IT organization and how DevOps techniques were applied to solve them.
  • Concurrency in C# Cookbook - I have been getting into more concurrent and reactive programming lately. This book provides concrete examples of how to use the async/await and Task Parallel Library features to build concurrent .NET applications in C#.
  • Dependency Injection in .NET - This is the definitive book on properly applying dependency injection techniques and tools in .NET. I really like that the concepts are taught without the use of a DI framework. This makes the techniques applicable to multiple implementations.
  • Continuous Delivery with Windows and .NET - This is a free eBook from O’Reilly that gives a good overview of the tools and options available for implementing CD pipelines on the Windows and .NET platforms.
  • Hidden Empire - This is the first book in a science-fiction novel series that I really enjoyed. If you are into “space opera” like Star Wars or Babylon 5, I would suggest checking it out.
  • Storm Front - This is the first book in the Dresden Files, a fantasy series following a private detective wizard who investigates supernatural mysteries in the modern world. Think Harry Potter crossed with Philip Marlowe.

New Home

Moving Complete

If you are reading this, then you are looking at this blog at its new home. Just a reminder that links to the old domain will redirect here for a while, but you will want to update any bookmarks you have that point to the old domain.

Thanks again for reading!


A New Home

Sometime within the next two weeks I will be moving this blog to a new domain. “Rogue Technology” was my “doing business as” name when I decided to become an independent consultant many years ago. That venture never really went anywhere and I have decided that it is time to clean out the old and bring in the new.

Assuming I get everything configured correctly, links to the old domain will continue to work for a while. If your feed reader supports HTTP 301 redirects, everything should update itself automatically. If you have bookmarks, you will probably want to change them once the new domain is live.

Thanks again for reading.

Thoughts on the Microsoft Integration Roadmap

Microsoft has finally released an integration roadmap. For the past several years there has been concern over the future of BizTalk and Microsoft’s integration strategy. For many BizTalk developers, the perception was that BizTalk was no longer a strategic product and that Microsoft was putting in the minimum effort necessary to keep it running on the current version of Windows. At the same time, there was confusion over how BizTalk Server, Azure BizTalk Services, and now Logic Apps all fit together in Microsoft’s future integration plans. This roadmap finally gives us a clear view of where Microsoft is going with its integration products.

Continuing Support for BizTalk Server

From the roadmap, it looks like rumors of the death of BizTalk Server have been greatly exaggerated. In Q4 of 2016 Microsoft will be releasing a new version of BizTalk Server. Along with the usual platform alignment and bug fixes, Microsoft will be adding key new features to the product. The most important feature, in my opinion, is the coming support for SQL Server Always On Availability Groups. This will also enable supported BizTalk high-availability configurations running in Azure IaaS, which is a really big deal for anybody who has struggled to set up a high-availability solution for BizTalk. Currently, log shipping is the only supported disaster recovery scenario for BizTalk Server, and even that is not supported if you try to run BizTalk on Azure. It is not clear from the roadmap, but I am hoping this means BizTalk will be dropping its dependency on the Distributed Transaction Coordinator.

The other new BizTalk Server feature I am eager to learn more about is the Azure API connector support. I am very interested to see what this looks like and how it may tie into Azure Logic Apps. I am curious to see whether the API connectors will be included in some form of on-premises solution that runs inside BizTalk, or whether this will be a special adapter that interfaces with Logic Apps. The roadmap does not provide any details on this yet, so we will have to wait for Microsoft to release more information later this year.

A Vision for Integration

The roadmap breaks Microsoft’s integration vision into two key pillars. The first pillar is Modern Integration and this encompasses Azure Logic Apps. The second is Enterprise Integration and involves a mix of BizTalk Server and Azure Logic Apps. Modern Integration will target the “Citizen Integrator”, individuals who are not integration specialists. This will likely be a mix of web and mobile developers and even some power users. The driving force is to make building integration solutions simpler without requiring a developer with special skills. One of the biggest drawbacks to using BizTalk and other middleware platforms is the steep learning curve and the time it takes to become effective with the tools. Microsoft is directly addressing this problem in the Logic Apps platform. Initially Logic Apps will target SaaS and web-centric solutions, and some hybrid scenarios but there are plans to make Logic Apps part of the Azure Stack which will make it available for on-premises development too.

The target audience for the Enterprise Integration pillar will be integration specialists. This will look at more traditional integration problems like working with legacy systems and exchanging data between companies using EDI and HL7 standards. These are areas where specialized knowledge is required and it makes sense to have a specialist developer involved in building a solution. These types of solutions generally have more stringent requirements around performance and availability that are best addressed by an integration expert. In the short term, BizTalk Server will be the primary choice for building on-premises integrations between legacy systems and can be used for hybrid integrations as well. Long term Microsoft will be adding more enterprise connectors to Logic Apps to give it feature parity with Azure BizTalk Services. Eventually these two pillars will converge into a single integration platform in the form of Logic Apps.


This roadmap has provided some much-needed clarity around Microsoft’s integration strategy. While it is going to take some time to get there, we now know where Microsoft is headed with their integration platforms and their eventual convergence into Logic Apps. I applaud Microsoft’s intentions to simplify the development of integration solutions and make them more agile. I am excited to see what they do in this area.

This is what I am taking away from this roadmap:

  • BizTalk Server is going to be around for a while. It will continue to be the first choice for on-premises integration projects. BizTalk is a mature product and it is going to take some time to implement equivalent functionality in Logic Apps. The new HA features in BizTalk Server 2016 will make it easier to sell to enterprise customers who are already using SQL Server Always On with their other applications.
  • The writing is on the wall for Azure BizTalk Services. While I would not be in a rush to move any existing applications off of BizTalk Services, I would not plan on starting any new projects with it either. Eventually the functionality of BizTalk Services will be made available in Logic Apps.
  • Logic Apps is the go-to tool for building cloud-based integration solutions. It already has a large library of connectors to the most popular SaaS and web applications and has the tools to make it easy to work with web APIs.
  • We are going to continue hearing more about the democratization of integration. Similar pushes are showing up in the BI and application development space with PowerBI and PowerApps. This probably is not the best news for the developers and consultants who are heavily invested in BizTalk, but I am hoping this will increase the available talent pool for integration developers.

2015 Favorite Books

I cannot believe 2015 is over already. That means it is once again time to review my favorite books for 2015. I read a lot this year, but only a few books. I was focused on learning more about distributed systems theory and that meant I read a lot of research papers instead. This coming year I will be looking to apply my distributed systems knowledge and learning about building applications and systems in the cloud. I also plan on putting a serious dent into my reading backlog.

Favorite Books 2015

  • Distributed Systems for Fun and Profit - This is a free ebook and a nice primer for learning about distributed systems. My favorite part is that each chapter links to a number of relevant papers on its topic. This was a primary source of the papers I read this year.
  • Building Microservices - This is the book to read if you're interested in learning how to build microservices. It is full of practical advice.
  • Lean Enterprise - If you are interested in learning how to apply Lean and Agile principles to large enterprise projects, then this is the book for you. Another book full of practical advice, not just theory.
  • Akka in Action - While this book is still in early access at the time of this post, it makes for a solid introduction to learning the actor model and Akka. It has a few rough edges still but I expect they will be ironed out by the time the final version is published.

Azure Resource Manager Authentication In PowerShell


The other day I was working on building a PowerShell script to provision some Azure resources using the Azure Resource Manager module. I had some difficulty with authenticating my script with my Azure account. The documentation for the PowerShell module was not very clear on the authentication requirements and I only figured it out when I started reading through the documentation for the Azure CLI tools.

Resource Manager Authentication

The Resource Manager API only supports authentication with organizational accounts. This, in a nutshell, was the source of my problem, and the documentation is not really clear on this point. On top of that, the API does not provide a clear error. When I attempted to authenticate in my script, I did not receive any errors after entering my credentials. The errors only appeared when I tried to do something with the Resource Manager API after the login step. This would have been easier to resolve if the authentication service had provided a clear error message when I was using my Microsoft Account credentials.

For better or worse, all of my Azure subscriptions are tied to my Microsoft Account and while I have an organizational account I really did not want to call Azure support to have them move my subscription. The workaround is to create a new account in the Azure AD default directory associated with the Microsoft Account. This account can then be granted co-admin permissions and it also counts as an organizational account.

Create the Organizational User

  1. In the Azure portal navigate to your default directory
  2. Click on the Users tab at the top of the screen
  3. Click on the Add User button
  4. Fill out the new user form
    • Set the Type of User to New user in your organization
    • Set Role to User
    • Leave the Enable Multi-Factor Authentication box unchecked
  5. Finish creating the user by clicking the Create button
  6. Add the new user to a subscription as a co-administrator
  7. Login using the new account and change the account password

Use the New User in PowerShell

In PowerShell, try authenticating with the Resource Manager API using the Add-AzureRmAccount cmdlet. When the login form appears, use the credentials for the new organizational user. Now that you have logged in with an organizational account you should be able to run the other AzureRm cmdlets without receiving cryptic errors about your credentials being expired or not authorized to perform a particular action.

Along with the ability to access the Resource Manager API, the new organizational user credentials can also be used to create PSCredential objects. These objects can be passed into the Add-AzureRmAccount and Add-AzureAccount cmdlets to pass credentials in non-interactive scripts.
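As a sketch of that non-interactive pattern (the account name is a placeholder, and embedding a password inline is for illustration only; in a real script the password would come from a secure store):

```powershell
# Build a PSCredential for the organizational user (placeholder name).
$securePassword = ConvertTo-SecureString "P@ssw0rd!" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential `
    ("automation@contoso.onmicrosoft.com", $securePassword)

# Authenticate against the Resource Manager API without a login prompt.
Add-AzureRmAccount -Credential $credential
```

The same $credential object can be passed to Add-AzureAccount for the classic service management cmdlets.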

ESB Toolkit: Building a Custom Adapter Provider


ESB Toolkit supports a number of adapters out-of-the-box, but the WCF-NetTcp and WCF-WebHttp adapters are noticeably absent from the list in the itinerary designer and the transport type BRE vocabulary. According to the documentation, it is possible to build your own adapter providers to add support for additional adapters to ESB Toolkit. Recently I had a client who wanted to use WCF-NetTcp to communicate with web services through ESB Toolkit, so I took some time to see how hard it is to build a custom adapter provider. It turns out to be a fairly simple process, but it is not very well documented.

Build the Adapter Provider

  1. Open Visual Studio and start a new class library project. Be sure to configure the project to sign the assembly with a strong-name key.
  2. Add a reference to the Microsoft.Practices.ESB.Adapter assembly in the ESB Toolkit installation folder.
  3. Derive from the WCFBaseAdapterProvider base class. There is also a BaseAdapterProvider base class that can be used to implement support for non-WCF adapters in BizTalk, like the HTTP adapter. Here is my sample code for the WCF-NetTcp adapter:
using Microsoft.Practices.ESB.Adapter;

namespace SD.ESB.Adapters
{
    public class WcfNetTcpAdapterProvider : WCFBaseAdapterProvider
    {
        public override string AdapterName
        {
            get { return "WCF-NetTcp"; }
        }
    }
}
  4. Build the project and install the assembly in the GAC.

Register the Adapter Provider

Add an entry for the new adapter provider to the esb.config file located in the ESB Toolkit installation folder. Here is a sample entry for the WCF-NetTcp adapter provider:

<adapterProvider name="WCF-NetTcp" type="SD.ESB.Adapters.WcfNetTcpAdapterProvider, SD.ESB.Adapters, Version=, Culture=neutral, PublicKeyToken=f6c9c5befea65e79" moniker="nettcp" />

The moniker attribute is taken from the protocol section of the URL. For NetTcp this becomes nettcp.

Build the Manifest

The manifest files are located under the Visual Studio installation folder in the \Common7\IDE\Extensions\Microsoft.Practices.Services.Itinerary.DslPackage subfolder. Since we are adding a new WCF adapter, I made a copy of the WCF-CustomProviderManifest.xml file and renamed it to WCF-NetTcpProviderManifest.xml. If your particular adapter requires additional context properties, just add the entry for the appropriate assembly to the manifest file.

Final Steps

Once the above steps have been completed, restart Visual Studio and bounce the BizTalk host instances. At this point you should be able to see the new adapter in the itinerary designer. Please note that the new adapter will not exist in the transport type BRE vocabulary. If you are using the BRE resolver for routing you will need to either update the vocabulary or set the transport type using a static string.


As you can see, building a custom adapter provider is pretty straightforward once you know what needs to be done. I hope this helps anybody else who needs to add support for additional WCF adapters to ESB Toolkit. Since it was so simple to add support for the WCF-NetTcp and WCF-WebHttp adapters, it raises the question of why Microsoft has not made these a standard part of ESB Toolkit. Hopefully we will see these two adapters included when BizTalk 2016 is released next year.

Simulate Batch Processing in BizTalk Using a Database and a Service Window

On a recent project I needed to simulate a batch aggregation process using BizTalk. In this scenario, the source system produced data files, each representing one business transaction. The destination system, however, expected to receive all of the transactions in a single file. There is more than one solution to this scenario; for example, I could have built a parallel convoy, but that seemed like a lot of effort given the requirements and the volume of data passing through this interface. Instead I was able to “simulate” a batching process using a simple table in SQL Server and a BizTalk service window.

The first step is to create a staging table in a SQL Server database. In my case the destination system was receiving a flat-file with one row representing one business transaction so I modeled my database table using a similar structure.

The process consists of two message flows. The first process picks up the data files from the source system and inserts the data into the staging table. The second process then reads the rows from the staging table into a single message and then maps them to the flat-file for the destination system. The key item is to configure the SQL receive location with a service window that does not overlap with the source system’s schedule for creating files. The idea is to ensure that all of the source files have been processed and committed to the staging table before the second process queries the database. For example, if the source system is configured to create data files at 5:00pm, you may want to configure the SQL receive location service window to process files between 6:00pm and 6:10pm. If the source system is going to run on a continuous schedule you may want to configure a service window on the first process too. Just make sure that the two service windows do not overlap.

I found this to be an easier way to aggregate messages using BizTalk without introducing the complexity of a parallel convoy. Plus with a convoy there is always the danger that one or more messages may not arrive in a timely manner and cause zombie instances to show up in the suspended message list.

Mapper Property Promotion

Here is a little something that I did not realize was happening when BizTalk executes a transform. If you have a property schema associated with the destination schema in a map, BizTalk will automatically promote those properties when it executes the map. This does not help if you need to promote an adapter context property or another property schema that is not associated with the destination schema. For those you will still need to use another tactic like promoting properties using an initializing correlation set.

Generally I promote properties for the adapters or for a generic envelope schema, which is why I only recently observed this behavior. If you are working on a project that uses promoted context properties to route messages, this may be an easy way to set those values.