Azure Resource Manager Authentication In PowerShell


The other day I was working on a PowerShell script to provision some Azure resources using the Azure Resource Manager module, and I had some difficulty authenticating the script with my Azure account. The documentation for the PowerShell module was not very clear about the authentication requirements, and I only figured it out when I started reading through the documentation for the Azure CLI tools.

Resource Manager Authentication

The Resource Manager API only supports authentication against organizational accounts. This, in a nutshell, was the source of my problem, and the documentation is not really clear on this point. On top of that, the API does not provide a clear error. When I attempted to authenticate in my script, I did not receive any errors after entering my credentials. The errors would only appear when I tried to do something with the Resource Manager API after the login step. This would have been easier to resolve if the authentication service had provided a clear error message when I was using my Microsoft Account credentials.

For better or worse, all of my Azure subscriptions are tied to my Microsoft Account, and while I have an organizational account I really did not want to call Azure support to have them move my subscription. The workaround is to create a new user in the Azure AD default directory associated with the Microsoft Account. This user can then be granted co-administrator permissions, and it also counts as an organizational account.

Create the Organizational User

  1. In the Azure portal navigate to your default directory
  2. Click on the Users tab at the top of the screen
  3. Click on the Add User button
  4. Fill out the new user form
    • Set the Type of User to New user in your organization
    • Set Role to User
    • Leave the Enable Multi-Factor Authentication box unchecked
  5. Finish creating the user by clicking the Create button
  6. Add the new user to a subscription as a co-administrator
  7. Log in using the new account and change the account password
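If you prefer scripting, the same user can be created with the MSOnline PowerShell module. This is only a sketch; the user principal name and display name below are placeholder values, and the co-administrator assignment still happens in the portal:

```powershell
# Requires the MSOnline module; sign in as an administrator of the directory.
Connect-MsolService

# Create the organizational user (placeholder values).
# The cmdlet outputs a temporary password for the new account.
New-MsolUser -UserPrincipalName "scripts@contoso.onmicrosoft.com" `
             -DisplayName "Scripting User"
```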

Use the New User in PowerShell

In PowerShell, try authenticating with the Resource Manager API using the Add-AzureRmAccount cmdlet. When the login form appears, use the credentials for the new organizational user. Now that you have logged in with an organizational account you should be able to run the other AzureRm cmdlets without receiving cryptic errors about your credentials being expired or not authorized to perform a particular action.

Along with providing access to the Resource Manager API, the new organizational user's credentials can also be used to create PSCredential objects. These objects can be passed to the Add-AzureRmAccount and Add-AzureAccount cmdlets to supply credentials in non-interactive scripts.
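As a sketch, a non-interactive login could look like the following. The user name and password are placeholders, and in a real script the password should come from a secured source rather than plain text:

```powershell
# Build a PSCredential for the organizational user (placeholder values).
$userName = "scripts@contoso.onmicrosoft.com"
$securePassword = ConvertTo-SecureString "P@ssw0rd!" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($userName, $securePassword)

# Authenticate against the Resource Manager API without an interactive prompt.
Add-AzureRmAccount -Credential $credential
```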

ESB Toolkit Building a Custom Adapter Provider


ESB Toolkit supports a number of adapters out-of-the-box, but the WCF-NetTcp and WCF-WebHttp adapters are noticeably absent from the list in the itinerary designer and the transport type BRE vocabulary. According to the documentation, it’s possible to build your own adapter providers to add support for additional adapters to ESB Toolkit. Recently I had a client who wanted to use WCF-NetTcp to communicate with web services through ESB Toolkit, so I took some time to see how hard it is to build a custom adapter provider. It turns out to be a fairly simple process, but it is not very well documented.

Build the Adapter Provider

  1. Open Visual Studio and start a new class library project. Be sure to configure the project to sign the assembly with a strong-name key.
  2. Add a reference to the Microsoft.Practices.ESB.Adapter assembly in the ESB Toolkit installation folder.
  3. Derive a class from the WCFBaseAdapterProvider base class. There is also a BaseAdapterProvider base class that can be used to implement support for non-WCF BizTalk adapters, like the HTTP adapter. Here is my sample code for the WCF-NetTcp adapter:
using Microsoft.Practices.ESB.Adapter;

namespace SD.ESB.Adapters
{
    public class WcfNetTcpAdapterProvider : WCFBaseAdapterProvider
    {
        // The name must match the adapter name registered in BizTalk.
        public override string AdapterName
        {
            get { return "WCF-NetTcp"; }
        }
    }
}
  4. Build the project and install the assembly in the GAC.
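The GAC installation in the last step can be done with gacutil. The SDK path below is an assumption based on a typical Windows 8.1 SDK install, so adjust it for your machine:

```powershell
# Install the signed assembly into the GAC (gacutil path varies by SDK version).
$gacutil = "C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\gacutil.exe"
& $gacutil /i ".\bin\Release\SD.ESB.Adapters.dll"
```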

Register the Adapter Provider

Add an entry for the new adapter provider to the esb.config file located in the ESB Toolkit installation folder. Here is a sample entry for the WCF-NetTcp adapter provider:

<adapterProvider name="WCF-NetTcp" type="SD.ESB.Adapters.WcfNetTcpAdapterProvider, SD.ESB.Adapters, Version=, Culture=neutral, PublicKeyToken=f6c9c5befea65e79" moniker="nettcp" />

The moniker attribute is taken from the scheme portion of the endpoint URI. For NetTcp this becomes nettcp.

Build the Manifest

The manifest files are located under the Visual Studio installation folder in the \Common7\IDE\Extensions\Microsoft.Practices.Services.Itinerary.DslPackage subfolder. Since we are adding a new WCF adapter, I made a copy of the WCF-CustomProviderManifest.xml file and renamed it to WCF-NetTcpProviderManifest.xml. If your particular adapter requires additional context properties, add an entry for the appropriate assembly to the manifest file.
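The copy can also be scripted. The path below assumes Visual Studio 2013 on a 64-bit OS, so adjust it for your version:

```powershell
# DSL package folder under the Visual Studio installation (adjust version as needed).
$dslPath = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft.Practices.Services.Itinerary.DslPackage"

# Clone the WCF-Custom manifest as the starting point for the new adapter.
Copy-Item -Path (Join-Path $dslPath "WCF-CustomProviderManifest.xml") `
          -Destination (Join-Path $dslPath "WCF-NetTcpProviderManifest.xml")
```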

Final Steps

Once the above steps have been completed, restart Visual Studio and bounce the BizTalk host instances. At this point you should be able to see the new adapter in the itinerary designer. Please note that the new adapter will not exist in the transport type BRE vocabulary. If you are using the BRE resolver for routing you will need to either update the vocabulary or set the transport type using a static string.


As you can see, building a custom adapter provider is pretty straightforward once you know what needs to be done. I hope this helps anybody else who needs to add support for additional WCF adapters to ESB Toolkit. Since it was so simple to add support for the WCF-NetTcp and WCF-WebHttp adapters, it raises the question of why Microsoft has not made them a standard part of ESB Toolkit. Hopefully we will see these two adapters included when BizTalk 2016 is released next year.

Simulate Batch Processing in BizTalk Using a Database and a Service Window

On a recent project I needed to simulate a batch aggregation process using BizTalk. In this scenario, the source system would produce data files, each representing one business transaction. The destination system, however, expected to receive all of the transactions in a single file. There is more than one solution to this scenario. For example, I could have built a parallel convoy, but that seemed like a lot of effort given the requirements and the volume of data passing through this interface. Instead I was able to “simulate” a batching process using a simple table in SQL Server and a BizTalk service window.

The first step is to create a staging table in a SQL Server database. In my case the destination system was receiving a flat-file with one row representing one business transaction so I modeled my database table using a similar structure.
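As a sketch, the staging table could be created like this. The server, database, and column names are invented for illustration; in practice you would model the columns on the destination flat-file:

```powershell
# Requires the SQLPS/SqlServer module for Invoke-Sqlcmd.
$createTable = @"
CREATE TABLE dbo.TransactionStaging (
    TransactionId   INT IDENTITY(1,1) PRIMARY KEY,
    TransactionData NVARCHAR(MAX) NOT NULL,
    ReceivedOn      DATETIME NOT NULL DEFAULT GETDATE()
);
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Staging" -Query $createTable
```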

The process consists of two message flows. The first process picks up the data files from the source system and inserts the data into the staging table. The second process then reads the rows from the staging table into a single message and then maps them to the flat-file for the destination system. The key item is to configure the SQL receive location with a service window that does not overlap with the source system’s schedule for creating files. The idea is to ensure that all of the source files have been processed and committed to the staging table before the second process queries the database. For example, if the source system is configured to create data files at 5:00pm, you may want to configure the SQL receive location service window to process files between 6:00pm and 6:10pm. If the source system is going to run on a continuous schedule you may want to configure a service window on the first process too. Just make sure that the two service windows do not overlap.

I found this to be an easier way to aggregate messages using BizTalk without introducing the complexity of a parallel convoy. Plus with a convoy there is always the danger that one or more messages may not arrive in a timely manner and cause zombie instances to show up in the suspended message list.

Mapper Property Promotion

Here is a little something that I did not realize was happening when BizTalk executes a transform. If you have a property schema associated with the destination schema in a map, BizTalk will automatically promote those properties when it executes the map. This does not help if you need to promote an adapter context property or another property schema that is not associated with the destination schema. For those you will still need to use another tactic like promoting properties using an initializing correlation set.

Generally I promote properties for the adapters or a generic envelope schema and I just recently observed this behavior. If you are working on a project that uses promoted context properties to route messages this may be an easy way to set those values.

ESB Toolkit: ESB Dispatcher Disassemble Pipeline Component

One of my favorite aspects of BizTalk is that there is always something new to learn. I have seen the ESB Dispatcher Disassemble component and I know it can be used to debatch an XML file and execute an itinerary. What I did not know is that it can be used to route a single message to multiple recipients. All you need to do is add a resolver for each destination endpoint to the routing service in the itinerary. The ESB Dispatcher Disassemble will generate one message for each configured resolver.

The one downside is that using this component can limit your options for selecting an itinerary. Since this is a disassemble stage component and it executes the itinerary, you need to select your itinerary in the decode stage. This means that none of the message context properties will have been promoted yet, so they will not be available to the context resolver or the BRE resolver. If you can live with these limitations, or are willing to build a custom decode stage component to assist with selecting the itinerary, the ESB Dispatcher Disassemble component is a relatively simple way to route a message to multiple destination endpoints using ESB Toolkit.

2014 Favorite Books

I am a little late with my favorite books list for 2014, but better late than never, right? Unfortunately I did not get to read as many books as I would have liked this past year. Last year I mentioned I wanted to get into Big Data, but that did not happen as planned. Instead I spent my reading time learning about APIs and distributed systems. As always, none of the links below are affiliate links. I link directly to the publisher for ebooks and to Amazon for paper books, as that is where I make my own purchases.

Favorite Books 2014

  • RESTful Web APIs - This is the first book I have read on REST APIs that provided practical advice on building services using hypermedia.
  • Designing Evolvable Web APIs with ASP.NET - Similar to the previous title, this book provides lots of practical advice for building hypermedia APIs using the ASP.NET Web API framework.
  • Understanding Computation - I have not read anything on the theory of computation or language design since I was in school. I really enjoyed this book as it provides a nice introduction to and review of those two topics, presented in a way that does not require a computer science degree to understand.
  • Lean from the Trenches - This book was all about practical advice on using Lean principles to manage software projects. I really like the way the author uses retrospectives on a real project as the examples to show what works and what does not.

The list is a little short this year. I hope to get back into my normal reading groove for 2015.

Install ESB Portal Database on a Named Instance

On my current project I had a requirement to install the ESB Toolkit exception portal database on a named instance of SQL Server. This particular client was sharing a single physical server among multiple database applications for their development environment. Some of these applications, like BizTalk, make changes to the SQL Server configuration settings that are optimal for normal database operation. To isolate these applications they created a named instance for each application that required special configuration settings.

There are plenty of guides for installing and configuring the ESB exception portal out there already so I will not replicate those instructions here. If you need a guide, I am a fan of Configure BizTalk ESB Toolkit 2.2 Management Portal. Installing the portal on a named instance is not complicated but I have not seen it documented. It was not obvious to me until I read through the batch and PowerShell installation scripts.

When you run the Management_Install.cmd batch script, add an argument to the end that specifies the name of the SQL Server instance you would like to use for the portal database. So if you have a server named devsql with an instance named bts, the command would look like this:

.\Management_Install.cmd devsql\bts

The user running the script will need to have permission to create new databases in the named instance. Assuming the prerequisites and security have been configured correctly the script should install the portal database to the specified SQL Server instance.

If you would rather run the PowerShell script directly, the SQL instance name is the second argument to the script. The first argument to the PowerShell script is the IIS website ID for the ESB web services. Using the same values as before, the command would look like this:

.\Management_Install.ps1 1 devsql\bts

2013 Favorite Books

I enjoyed writing up a post about my favorite books at the end of last year so I thought it would be fun to do it again. Maybe I will turn this into an annual event. This year I managed to read 21 books and 14 research papers. Note these are not affiliate links and I will get absolutely nothing if you click on them.

Favorite Books 2013

  • SOA with REST - This book does a great job of mapping the REST architectural style to SOA patterns. It demonstrates how to build SOA solutions using REST for the implementation.
  • Building Hypermedia APIs with HTML 5 and Node - Don’t let the title fool you. This book is full of lots of great information on building APIs with hypermedia. I don’t know (and don’t care) much about Node.js and I didn’t have any trouble following the code samples.
  • SOA Patterns - This book is a nice collection of SOA patterns and anti-patterns. I like the fact that the patterns are described in a way that isn’t specific to a single technology stack.
  • Practical Project Initiation - While I am a technologist and not a project manager I still found some great tips on how to start a project off on the right foot.
  • Bad Data Handbook - This is a collection of essays on data problems and how to work around them. A couple of the scenarios presented were things I have experienced on my own projects.
  • Dune - This was my fun reading for the year. I read all six of Frank Herbert’s Dune novels. Overall I enjoyed the story though it gets a little preachy towards the end of the series.

For 2014 I hope to get into Big Data and some related subjects like functional programming. I have been taking advantage of the holiday sales to stock up on books to read in the coming year. Until then.

BizTalk 2010 EDI: Configuring ISA11 as Repetition Separator

An interesting EDI issue came up recently with how BizTalk 2010 parses the value for ISA11. In older EDI versions, 4010 and below I believe, ISA11 was called the “standards identifier” and was always set to “U”. Newer versions of EDI added a new delimiter called the “repetition separator” and repurposed ISA11 to define what this character would be in an interchange.

In the event log we were seeing errors like this:

Error: 1 (Miscellaneous error)
    16: Invalid Control Standard Identifier

Error: 2 (Field level error)
    SegmentID: ISA
    Position in TS: 1
    Data Element ID: ISA11
    Position in Segment: 11
    Data Value: ^
    7: Invalid code value

BizTalk should automatically parse this correctly based on the EDI version defined in the schema, but in my case it was defaulting to the old behavior even though the version in the schema was 4060. We even tried setting the UseIsa11AsRepetitionSeparator property on the EDI pipeline to True, but BizTalk was still parsing it like a 4010 transaction.

We fixed this using the following steps:

  1. Open the party properties and ensure the Local BizTalk processes messages received by the party or supports sending messages from this party option is checked.
  2. Open the agreement properties and choose the outbound settings tab.
  3. Select the Envelopes option from the tree view on the left.
  4. Under ISA11 usage select the Repetition separator option.
  5. Go back to the party properties and uncheck the Local BizTalk processes messages received by the party or supports sending messages from this party option.
  6. Restart the host instances used to receive EDI transactions to apply the changes.

This is what my Envelopes configuration looks like:

[Screenshot: Envelope settings]

I do not understand why changing the outbound settings would have any impact on how BizTalk parses an inbound interchange but it works. At some point I will have to see if this is still an issue in BizTalk 2013.

Be Careful When Using the %SourceFileName% Macro

One of my team members ran into an interesting problem when using the %SourceFileName% macro in a send port. In this scenario the client was receiving EDI data in files, and they wanted a copy of the original data sent to an archive folder using the original file name. During testing we would occasionally see the following warning messages in the event log:

"The FILE send adapter cannot open file C:\Work\BizTalk\SourceFileNameTest\Out\EDI_X12_864 - Copy.edi for writing.
Details: The file exists."

What was happening is that sometimes a single EDI file (an interchange) contains multiple transactions. By default, BizTalk splits the interchange into individual transactions in the EDI pipeline. Since all of the transactions came from the same source file, each transaction carried the same value in the FILE.ReceivedFileName context property, which is the property the macro uses to name the output file. When BizTalk attempted to write the transactions to the archive folder, the first write would succeed and the remaining transactions would trigger the file exists warning.

To work around this behavior you can do one of the following:

  1. Add some sort of unique identifier to the file name. For example instead of using just %SourceFileName% you could combine it with %MessageID% to generate a unique file name for each transaction.

  2. Configure the file adapter to allow appending to an existing file.

  3. Configure the EDI party agreement to preserve the interchange. This would keep all the transactions in a single message. In this case you would need to build additional functionality into your BizTalk application to manually split the interchange.
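For the first option, the File name field on the send port could combine the two macros like this (a configuration value, not code; the exact pattern is up to you):

```
%MessageID%_%SourceFileName%
```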