Live on Hexo


Hexo is Live

If you are reading this, then the transition to Hexo has completed successfully. Hopefully the redirects are all working properly and new articles are showing up in your favorite feed reader. This should remove a lot of my blogging friction and make it easier for me to write articles more often.

Thanks for reading, and I look forward to posting some new material soon.

Moving to Hexo

We're Moving

Just a heads-up for anybody who subscribes to this blog via RSS or ATOM: sometime within the next week I will be migrating this blog from Pelican to Hexo. Pelican has served me well these past several years, but it is starting to cause enough problems that I want to try another static site generator.

Pelican is a little awkward to run under Windows. As with most Python programs, I typically install it inside some form of virtual environment, such as virtualenv or an Anaconda environment. This means I have to remember to switch environments, which I usually forget to do until I get strange errors trying to work with Pelican. Managing plugins and themes is also somewhat cumbersome; I do not change either very often, and I end up having to re-read the documentation every time I want to add or update something. The final reason I am moving away from Pelican is that it is rather slow. It takes a good 30-40 seconds to regenerate the site every time I hit save in my editor, which is really painful when experimenting with the page layout for a new article.

After trying out a few different static site generators, I have settled on Hexo for the next version of my blog site. Hexo is a Node.js application and is a much smoother experience on Windows. During development, my site can regenerate in 3-5 seconds, which I really like. Plus, plugins and themes are all managed through npm, which makes it easy to keep things updated.

The one downside to Hexo is that it will not allow me to have both RSS and ATOM feeds. Going forward, the site will have an ATOM feed and a new JSON Feed. I will have automatic redirects configured that should route requests from the old feeds to the ATOM feed. Most feed readers that I know of understand both RSS and ATOM, so this should be seamless for most readers. I have a test post queued up for next week after I make the move, so if you are not seeing new content by then, I would suggest manually updating your subscription.

Hopefully this will be a relatively smooth transition.

Thanks for reading!

Raw Input with Azure Functions HTTP Triggers


This is a quick blog post to document my experience with building serverless
web APIs with Azure Functions. Recently, I was building an API where I
wanted to receive an XML message in the body of the HTTP trigger, but I did
not want the Functions framework to attempt to deserialize the data. I
wanted to receive the raw input so I could pass it as-is to another API.

HTTP Trigger Template

When you create a new Azure Function, the default template will use the
following statement to read the body of the HTTP request:

dynamic data = await req.Content.ReadAsAsync<object>();

This statement will attempt to deserialize the HTTP request body by using the
Content-Type header to choose a serializer component. By default it supports
the JSON and XML serializers, but you can define and register your own custom
serializer too. In order for the serializer to work properly, you need to
have a class decorated with the appropriate attributes so the serializer
knows how to map the data to the class properties.

In my use case, I was not going to do anything with the data other than pass
it on to another API, so I did not want to incur the overhead of
deserializing/serializing the data. Instead, you can change the above
statement to read the body as a string. When reading the data as a string,
the framework will not attempt to deserialize the data.

To read the body as a string, replace the above statement with this:

string data = await req.Content.ReadAsStringAsync();


It turns out getting the raw input is nothing more than a simple statement
change. All it takes is changing the statement from reading the data as an
object to reading it as a string. Now I have it documented for the next time
I need to do something similar in Azure Functions.

This article was originally posted at

Yeoman Generator for Morepath

I have been playing around a bit with the Morepath web microframework for Python lately. I noticed in the Morepath documentation that there is a Cookiecutter template for scaffolding new projects. On a lark, I wanted to see how hard it would be to build a Yeoman generator for Morepath. I have wanted to learn about building custom Yeoman generators to set up new projects with the proper structure and dependencies for my work projects. Since I did not see an existing Morepath generator on the NPM site, I figured I would try to build one myself.

In its current state, it just sets up a basic skeleton project, but as I learn more about Morepath I plan on adding more features. If anybody is interested in trying it out, it is available on NPM. Anybody interested in how I built it, and how bad my JavaScript code is, can check out the project repository on GitHub.

Implementing Auto-Save with React

Over the past few months I have been slowly learning about modern web development. The last time I worked on a web project, .NET 2.0 was the new hotness, so I have a lot of catching up to do. After checking out a few of the more popular frameworks out there, I chose React for building web-based user interfaces because of its popularity and support, its tight focus on views, and its affinity for functional programming.

A common feature in modern applications is support for automatically saving. This is seen in desktop applications like Microsoft Office, and nearly every mobile application out there. I wanted to see if I could replicate that behavior inside of a React application.

Challenge Accepted

The systems I have seen that implement this feature send a save event after every change. In a web application this would generate a lot of network activity, so I wanted to try implementing the feature using an idle timer instead. The application starts the timer after any user activity and tracks changes to the form state. The timer resets every time there is user activity, which prevents triggering a save while the user is still actively changing the form data. Once the timer expires, it fires an event in which the application performs the save. In the case of my test application, it resets the tracking state and outputs a message stating that it saved successfully. The full application code can be found on GitHub in my react-auto-save repository.
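Stripped of the React plumbing, the idle-timer logic is essentially a debounce. Here is a minimal sketch of the idea (the names, like createAutoSaver, are illustrative, not the actual code from my repository), with the timer functions injectable so the behavior is easy to test without real delays:

```javascript
// Minimal sketch of the idle-timer auto-save idea. Each user edit marks the
// form dirty and restarts the idle timer; the save callback fires only after
// `idleMs` of uninterrupted inactivity.
function createAutoSaver(save, idleMs, schedule = setTimeout, cancel = clearTimeout) {
  let timer = null;   // handle of the pending idle timer, if any
  let dirty = false;  // true when there are unsaved changes

  return {
    // Call this from every change handler (keystrokes, checkbox toggles, ...).
    touch() {
      dirty = true;
      if (timer !== null) cancel(timer); // activity: restart the idle timer
      timer = schedule(() => {
        timer = null;
        if (dirty) {
          dirty = false;
          save(); // user has been idle for idleMs: perform the save
        }
      }, idleMs);
    },
  };
}
```

In the React application, touch() is what the form's onChange handlers would invoke, and save() is where the tracking state gets reset and the network request would go.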

2016 Favorite Books

Another year has come and gone. Time for another favorite books blog post. I may not be doing the best job of blogging, but at least I am consistent with my year-end entry.
This past year I came closer to burning out than I would like to admit, so I spent more time reading fun books instead of technology books. I still managed to read some good
technical books, so without further ado let us take a look at the list.

Note: None of the links below are affiliate links. I get nothing from linking to these books. I try to link to the author or publisher site when possible, and Amazon as the default.

Favorite Books 2016

  • The Phoenix Project - While not a true technical book, this was still an interesting read. It is a novelization of the problems encountered in a typical IT organization and how DevOps techniques were applied to solve them.
  • Concurrency in C# Cookbook - I have been getting into more concurrent and reactive programming lately. This book provides concrete examples of how to use the async/await and Task Parallel Library features to build concurrent applications in .NET using C#.
  • Dependency Injection in .NET - This is the definitive book on properly applying dependency injection techniques and tools in .NET. I really like that the concepts are taught without the use of a DI framework. This makes the techniques applicable to multiple implementations.
  • Continuous Delivery with Windows and .NET - This is a free eBook from O’Reilly that gives a good overview of the tools and options available for implementing CD pipelines on the Windows and .NET platforms.
  • Hidden Empire - This is the first book in a science-fiction novel series that I really enjoyed. If you are into “space opera” like Star Wars or Babylon 5, I would suggest checking it out.
  • Storm Front - This is the first book in the Dresden Files, a fantasy series following a private detective wizard who investigates supernatural mysteries in the modern world. Think Harry Potter crossed with Philip Marlowe.

New Home

Moving Complete

If you are reading this, then you are looking at this blog at its new home. Just a reminder that links to the old domain will redirect here for a while, but you will want to update any bookmarks you have pointing to it.

Thanks again for reading!


A New Home

Sometime within the next two weeks I will be moving this blog to a new domain. “Rogue Technology” was my “doing business as” name when I decided to become an independent consultant many years ago. That venture never really went anywhere and I have decided that it is time to clean out the old and bring in the new.

Assuming I get everything configured correctly, links to the old domain will continue to work for a while after the move. If your feed reader supports HTTP 301 redirects, everything should update itself automatically. If you have bookmarks, you will probably want to update them once the new domain is live.

Thanks again for reading.

Thoughts on the Microsoft Integration Roadmap

Microsoft has finally released an integration roadmap. For the past several years there has been concern over the future of BizTalk and Microsoft’s integration strategy. For many BizTalk developers, the perception was that BizTalk was no longer a strategic product and that Microsoft was just putting in the minimum effort necessary to keep it running on the current version of Windows. At the same time, there was confusion over how BizTalk Server, Azure BizTalk Services and now Logic Apps all fit together in Microsoft’s future integration plan. This roadmap finally gives us a clear view of where Microsoft is going with their integration products.

Continuing Support for BizTalk Server

From the roadmap, it looks like rumors of the death of BizTalk Server have been greatly exaggerated. In Q4 of 2016 Microsoft will be releasing a new version of BizTalk Server. Along with the usual platform alignment and bug fixes, Microsoft will be adding key new features to the product. The most important feature, in my opinion, is the coming support for SQL Server AlwaysOn Availability Groups. This support will also enable BizTalk high-availability configurations running in Azure IaaS. This is a really big deal for anybody who has struggled to set up a high-availability solution for BizTalk. Currently, log shipping is the only supported disaster recovery scenario for BizTalk Server, and even that is not supported if you try to run BizTalk on Azure. It is not clear from the roadmap, but I am hoping this means BizTalk will be dropping its dependency on the Distributed Transaction Coordinator.

The other new BizTalk Server feature I am eager to learn more about is the Azure API connector support. I am very interested to see what this looks like and how it may tie into Azure Logic Apps. I am curious whether the API connectors will be included in some form of on-premises solution that runs inside BizTalk, or if this will be a special adapter that interfaces with Logic Apps. The roadmap does not provide any details on this yet, so we will have to wait for Microsoft to release more information later this year.

A Vision for Integration

The roadmap breaks Microsoft’s integration vision into two key pillars. The first pillar is Modern Integration, which encompasses Azure Logic Apps. The second is Enterprise Integration, which involves a mix of BizTalk Server and Azure Logic Apps. Modern Integration will target the “Citizen Integrator”: individuals who are not integration specialists, likely a mix of web and mobile developers and even some power users. The driving force is to make building integration solutions simpler without requiring a developer with special skills. One of the biggest drawbacks to using BizTalk and other middleware platforms is the steep learning curve and the time it takes to become effective with the tools. Microsoft is directly addressing this problem in the Logic Apps platform. Initially, Logic Apps will target SaaS and web-centric solutions and some hybrid scenarios, but there are plans to make Logic Apps part of the Azure Stack, which will make it available for on-premises development too.

The target audience for the Enterprise Integration pillar will be integration specialists. This will look at more traditional integration problems like working with legacy systems and exchanging data between companies using EDI and HL7 standards. These are areas where specialized knowledge is required and it makes sense to have a specialist developer involved in building a solution. These types of solutions generally have more stringent requirements around performance and availability that are best addressed by an integration expert. In the short term, BizTalk Server will be the primary choice for building on-premises integrations between legacy systems and can be used for hybrid integrations as well. Long term Microsoft will be adding more enterprise connectors to Logic Apps to give it feature parity with Azure BizTalk Services. Eventually these two pillars will converge into a single integration platform in the form of Logic Apps.


This roadmap has provided some much-needed clarity around Microsoft’s integration strategy. While it is going to take some time to get there, we now know where Microsoft is headed with their integration platforms and their eventual convergence into Logic Apps. I applaud Microsoft’s intention to simplify the development of integration solutions and make them more agile. I am excited to see what they do in this area.

This is what I am taking away from this roadmap:

  • BizTalk Server is going to be around for a while. It will continue to be the first choice for on-premises integration projects. BizTalk is a mature product, and it is going to take some time to implement equivalent functionality in Logic Apps. The new HA features in BizTalk Server 2016 will make it easier to sell to enterprise customers who are already using SQL Server AlwaysOn with their other applications.
  • The writing is on the wall for Azure BizTalk Services. While I would not be in a rush to move any existing applications off of BizTalk Services, I would not plan on starting any new projects with it either. Eventually the functionality of BizTalk Services will be made available in Logic Apps.
  • Logic Apps is the go-to tool for building cloud-based integration solutions. It already has a large library of connectors to the most popular SaaS and web applications and has the tools to make it easy to work with web APIs.
  • We are going to continue hearing more about the democratization of integration. Similar pushes are showing up in the BI and application development space with PowerBI and PowerApps. This probably is not the best news for the developers and consultants who are heavily invested in BizTalk, but I am hoping this will increase the available talent pool for integration developers.

2015 Favorite Books

I cannot believe 2015 is over already. That means it is once again time to review my favorite books for 2015. I read a lot this year, but only a few books. I was focused on learning more about distributed systems theory and that meant I read a lot of research papers instead. This coming year I will be looking to apply my distributed systems knowledge and learning about building applications and systems in the cloud. I also plan on putting a serious dent into my reading backlog.

Favorite Books 2015

  • Distributed Systems for Fun and Profit - This is a free ebook and it is a nice primer for learning about distributed systems. My favorite part is each chapter links to a number of relevant papers on the topic of the chapter. This was a primary source of papers I read this year.
  • Building Microservices - This is the book to read if you’re interested in learning how to build microservices. It is full of practical advice.
  • Lean Enterprise - If you are interested in learning how to apply Lean and Agile principles to large enterprise projects, then this is the book for you. Another book full of practical advice and not just theory.
  • Akka in Action - While this book is still in early access at the time of this post, it makes for a solid introduction to learning the actor model and Akka. It has a few rough edges still but I expect they will be ironed out by the time the final version is published.