Gobot and Arduino

Gobots

Go and Arduino

I have been trying to get back into working with my Arduino again, but I find myself wishing I could use a higher-level language like Python instead of C. While I have a decent grasp of C, I would much rather work with a language that has modern affordances like garbage collection. Luckily, I stumbled across a framework for Go that fits the bill. Go is definitely a lower-level language than Python, but it is higher level than C and, at least in my opinion, nicer to work with.

Enter Gobot

The framework I discovered is called Gobot. Gobot is a framework for building drones, robots, and IoT devices. It has support for numerous devices and platforms, including the Arduino and Raspberry Pi boards that I own. Just in case I need to set up another Arduino in the future, here is how I installed the Firmata library on the board.

Getting Started

Install Firmata

Gobot uses the Firmata protocol to communicate with the Arduino, so the first step is to load the Firmata library onto the board. The Firmata library ships with the latest version of the Arduino IDE, so I downloaded and installed the IDE package and followed these instructions for installing Firmata.

  1. Download the latest version of the Arduino IDE and install it
  2. Connect the Arduino board to the laptop
  3. Configure the IDE with the board type and port
    1. For the board type, go to the Tools | Board menu and select the proper board from the list
    2. For the port, go to the Tools | Port menu and select the port that has the board name next to it
  4. Upload the Firmata sketch to the Arduino board
    1. Go to the File | Examples | StandardFirmata menu to open the sketch
    2. Click the Upload button and wait for the sketch to build and transfer
    3. When the process is complete, you should see the message, “Done Uploading” in the status window

The board should be ready to work with Gobot now.

Install Gobot

For our purposes here, I am assuming you already have the Go language and environment installed.

  1. Install Gobot into your local environment:
    $ go get -d -u gobot.io/x/gobot/...
  2. Go grab a snack and drink while you wait for the framework to download and install

Execute a Sample Project

  1. Connect an LED to the Arduino using your favorite example circuit (I am using the directions that came with my Adafruit Arduino kit), and make a note of which pin it is connected to.
  2. Enter the following code sample from the Gobot Getting Started documentation and save it in a file called blink.go; if necessary, change the pin number to match the pin the LED is connected to.
    package main

    import (
        "time"

        "gobot.io/x/gobot"
        "gobot.io/x/gobot/drivers/gpio"
        "gobot.io/x/gobot/platforms/firmata"
    )

    func main() {
        firmataAdaptor := firmata.NewAdaptor("/dev/ttyACM0")
        led := gpio.NewLedDriver(firmataAdaptor, "13")

        work := func() {
            gobot.Every(1*time.Second, func() {
                led.Toggle()
            })
        }

        robot := gobot.NewRobot("bot",
            []gobot.Connection{firmataAdaptor},
            []gobot.Device{led},
            work,
        )

        robot.Start()
    }
  3. Run the sample program:
    $ go run blink.go

Once the program compiles and starts communicating with the board, you should see the LED blink. At this point everything should be ready for more advanced projects.

Next Steps

Now that I have my Arduino up and running with Gobot, my next challenge is to go back and redo the sample projects that came with my kit and see if I can get them all working in Go. Once I finish, I figure it will have given me enough practice to start working on something more elaborate.

Until next time, thanks for reading!

New Year, New Adventure

DevOps gears

New Job

In my last post, I mentioned that I was looking for a new job. Late last year I accepted an offer to become the DevOps Manager at VHT here in Akron, OH. I held off on announcing it until now to allow StoneDonut to communicate my exit to their customers and partners. I took a much-needed vacation during the first two weeks of this year, and I have spent the past two weeks getting settled into my new position in DevOps.

Devving the Ops

I am very excited to be joining the DevOps team at VHT. I really enjoy working on tough technical problems and building tools and platforms for other developers. I think the DevOps space will be a great fit for my interests and my career growth. The opportunity at VHT was especially attractive as it will give me the opportunity to grow as a technical manager, while still allowing me to perform hands-on work as well. I am looking forward to using my development and integration experience to build the next-generation CI/CD platform for VHT’s development teams.

The Only Constant is Change

As one might expect, this career move will bring changes to my personal projects, including this blog. For most of my career, I have primarily worked with .NET, BizTalk, Azure, and other parts of the Microsoft stack. However, VHT is primarily a Linux and Amazon Web Services shop. As a result, this blog will shift its focus more toward Continuous Delivery, AWS, and open source tools like Ansible and Jenkins. I plan on keeping somewhat current with C#, .NET Core, and parts of Azure, but I will not be writing blog posts about BizTalk and enterprise integration any time soon. (Though maybe I will write up some posts about integrating open source tools together as part of a CD pipeline!) I am also hoping to write some posts about the custom tools and platforms we build at VHT.

On one hand, I am sad to leave StoneDonut. After 10 years, I definitely consider my former partners Chuck and Michael as mentors and friends. I most certainly would not be here without their help. However, I am also really excited to see where this new road leads. Thanks again for reading.

2017 Favorite Books

Not sure how it happened so fast, but all of a sudden we are near the end of 2017. For the first time, I have more non-technical books than purely technical ones on this list. I am interested to see if this is an outlier or the beginning of a trend. Tune in to my favorite books post next year to find out!

Note: None of the links below are affiliate links. I get nothing from linking to these books. I try to link to the author or publisher site when possible, and Amazon as the default.

Favorite Books 2017

  • The Mythical Man Month - The classic book on software engineering and project management. Despite its age, there are still many relevant insights for modern software projects.
  • Data Science from Scratch - This book provides an interesting introduction to data science. Instead of starting with specific tools and libraries, it teaches data science fundamentals by having the reader implement them in Python.
  • Microservices in .NET Core - This book covers the fundamentals of building microservices and assembling them into complete systems. Examples are built using .NET Core, OWIN and the NancyFX web framework. Definitely worth a look if you are interested in building microservices using the next generation .NET framework.
  • Think Like a Data Scientist - This book presents an end-to-end process for applying data science to solve problems. It starts with defining goals and initial analysis, proceeds to the technical stages of development, and ends with final presentation to end users. Perfect for people like me who are comfortable with the technical aspects of data science, but have little experience in the realms of business analysis and presentation.
  • The Monuments Men - A history narrative about the men and women who worked to protect and secure the greatest art treasures in the world during World War II. If you liked the movie, the book goes into a lot more detail about the mission of the monuments division.
  • Console Wars - This book chronicles the history of the video game industry starting in the mid-1980s with the rise of Nintendo and Sega. It is full of behind-the-scenes information and is a really fun read for somebody like me who is a big fan of video games and grew up during this time period.
  • Deep Thinking: Where Machine Intelligence Ends and Human Creativity Begins - The epic story of the battle between former world chess champion Garry Kasparov and IBM’s supercomputer Deep Blue. Kasparov uses his personal experience to discuss the future of artificial intelligence and creativity.

Towards a New Adventure

Adventure for the Atari 2600

Time for a Change

10 years ago, I decided to hang out my shingle and start my own consulting business. Soon after, I became a partner at StoneDonut. Over these past years, I have had an amazing adventure as a consultant focusing on integration. I have been blessed to have two awesome partners in Chuck Loughry and Michael Schenck, and I have learned so much from working with them. I had the privilege of designing and implementing BizTalk middleware solutions for enterprise customers both large and small. It has been a wild ride and I am very proud of the work we have done.

However, I have become interested in a number of newer technology areas, like DevOps, cloud-native distributed systems development, and data science. These areas do not really fit into StoneDonut’s core business focus of integration, service orientation, and business process management. Add to that the feeling over the past few years that I have plateaued in my personal growth as a developer, and it became clear that something needed to change. After much deliberation, and with more than a little trepidation, I have decided it is time for me to move on and pursue a new direction in my career.

So, What is Next?

I do not have an answer to this question yet. Right now I am open to new opportunities, and I am talking to people in my network to get an idea of what is out there at the moment. I have already had a number of recruiters contact me with interesting positions, so I do not think it will take too long to find something that interests me. I am equally comfortable in Windows and Linux environments and I am looking forward to applying my development experience to new challenges. If you are reading this and think that my experience would be beneficial to your company in one of the three areas above, feel free to contact me on Twitter, LinkedIn or email.

As always, thanks for reading!

Mediatr Custom Behavior: Logging

Warning!

Before I begin, a brief word of warning. I have not yet decided if the technique described below is a good idea or a terrible idea. So far it seems to be working well, but I have not exercised it enough to confidently recommend that others use it. Now that the safety advisory is out of the way, on with the show.

ASP.NET Web API Logging

I have been working on a Web API project where I wanted the API to log some basic information on every web request. What I did not want was logging code splattered throughout my API controllers. I am using Mediatr to decouple my application logic from the Web API framework, but having logging code scattered across all of my message handlers did not feel like an improvement. After considering my available options, I decided to try building a custom Mediatr behavior and adding it to a Mediatr pipeline. To facilitate this, I created an interface to define the metadata that I wanted to be logged.

The interface simply references another class that contains all of my data elements:

public interface IAPIRequestContext
{
    APIRequestMetadata APIRequestMetadata { get; set; }
}

public class APIRequestMetadata
{
    public Guid RequestId { get; set; }
    public string CurrentUser { get; set; }
    public string Controller { get; set; }
    public string Method { get; set; }
    public Dictionary<string, object> Parameters { get; set; }
}

This interface is then added to my Mediatr message definitions:

public class Query : IRequest<Result>, IAPIRequestContext
{
    public string Status { get; set; }
    public APIRequestMetadata APIRequestMetadata { get; set; }
}

Finally, I have a custom Mediatr pipeline behavior that casts the Query object to IAPIRequestContext and logs (using Serilog) the data in the APIRequestMetadata object:

public class LoggingBehavior<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse>
{
    public async Task<TResponse> Handle(TRequest request, RequestHandlerDelegate<TResponse> next)
    {
        Log.Debug("Entering LoggingBehavior with request {Name}", typeof(TRequest).Name);

        var reqCtxt = request as IAPIRequestContext;
        if (reqCtxt != null)
        {
            if (reqCtxt.APIRequestMetadata != null)
            {
                var metadata = reqCtxt.APIRequestMetadata;
                Log.Information(
                    "Request Id: {RequestId}, Current User: {User}, Controller: {Controller}, Method: {Method}",
                    metadata.RequestId, metadata.CurrentUser, metadata.Controller, metadata.Method);

                if (metadata.Parameters != null)
                {
                    foreach (var param in metadata.Parameters)
                    {
                        Log.Debug(
                            "Request Id: {RequestId}, {ParameterName}: {ParameterValue}",
                            metadata.RequestId, param.Key, param.Value);
                    }
                }
            }
        }

        var response = await next();

        Log.Debug("Leaving LoggingBehavior with request {Name}", typeof(TRequest).Name);
        return response;
    }
}
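One thing the snippets above do not show is how the behavior gets wired into the Mediatr pipeline. As a minimal sketch only, assuming ASP.NET Core’s built-in container and the MediatR.Extensions.Microsoft.DependencyInjection package (other containers support an equivalent open-generic registration), the registration might look something like this:

// Sketch only: container and package choice are assumptions, not part of the original post
public void ConfigureServices(IServiceCollection services)
{
    // Register Mediatr handlers from the assembly containing Startup
    services.AddMediatR(typeof(Startup));

    // Register the logging behavior for every request/response pair in the pipeline
    services.AddTransient(typeof(IPipelineBehavior<,>), typeof(LoggingBehavior<,>));
}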

With this setup, every controller in my API project emits a standard logging event, without having logging code duplicated in every one of my Mediatr handler methods. My plan is to combine this with a metrics pipeline behavior so I can track which API methods get used the most, and see how well they perform.
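As a rough idea of what that metrics behavior might look like, here is a minimal sketch that simply times each request with a Stopwatch and logs the elapsed milliseconds; a real implementation would presumably push the numbers into a proper metrics store instead:

public class MetricsBehavior<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse>
{
    public async Task<TResponse> Handle(TRequest request, RequestHandlerDelegate<TResponse> next)
    {
        // Time the remainder of the pipeline, including the request handler itself
        // (Stopwatch comes from System.Diagnostics)
        var stopwatch = Stopwatch.StartNew();
        var response = await next();
        stopwatch.Stop();

        Log.Information("Request {Name} completed in {ElapsedMs} ms",
            typeof(TRequest).Name, stopwatch.ElapsedMilliseconds);

        return response;
    }
}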

So, do you think this is a pretty good idea for handling logging, or do you think it is the worst idea you have ever seen? Feel free to send me comments on Twitter or LinkedIn either way.

Version Gotcha When Using Enzyme with React

The Gotcha

I recently ran into a little gotcha with the Enzyme testing library and I want to document the fix in case I run into the same issue later on. I was working through the TypeScript React Starter tutorial and noticed some odd warnings in the console output when running my unit tests:

Warning: ReactTestUtils has been moved to react-dom/test-utils. Update references to remove this warning.

Warning: Shallow renderer has been moved to react-test-renderer/shallow. Update references to remove this warning.

My tests were executing successfully, but I wanted to figure out why these warnings were occurring. It turns out that additional modules are needed when using Enzyme with React and Jest, and those modules differ based on which version of React you are using.

The Solution

If you are using a React version >= 15.5, you will need to install the react-test-renderer module in addition to Enzyme:

npm install react-test-renderer --save-dev

If you are using a React version older than 15.5, you will need to install the react-addons-test-utils module:

npm install react-addons-test-utils --save-dev

The tutorial I was following had instructions to install the latter, which was probably correct at the time it was written, but React v15.6.1 was installed as part of the project setup. Once I uninstalled react-addons-test-utils and replaced it with react-test-renderer, my tests ran successfully without the additional warnings.

Running xUnit Tests with VSTS

Introduction

A few weeks ago I set up my first automated build pipeline in Visual Studio Team Services. For the most part it was fairly easy to set up and configure, but I ran into some issues getting my xUnit tests to run. The fix is simple, but I figure I will not be setting up these builds very often and I do not want to have to figure it out again in the future.

The Problem

Note: These instructions apply to the full .NET Framework, not .NET Core.

I followed the instructions in the xUnit documentation for configuring the test runner for VSTS. The documentation said to set the Test Assembly field to the following:

**\bin\$(BuildConfiguration)\*test*.dll;-:**\xunit.runner.visualstudio.testadapter.dll

However, when the test step of the pipeline would execute, it would raise a warning that it could not find any test assemblies that conformed to the above pattern. I tried fiddling with the other options, but the pipeline still could not locate the test assemblies.

The Solution

Thankfully, the solution is very simple. Instead of setting the Test Assembly field to a single one-line expression, break it into two lines and drop the semicolon separator:

**\bin\$(BuildConfiguration)\*test*.dll
-:**\xunit.runner.visualstudio.testadapter.dll

Once I made this change, the test step was able to find the test assembly and execute the tests. My best guess is that a VSTS update made changes to the Test Assembly field and the xUnit documentation has not been updated yet.

Hopefully this blog post will help others who run into this issue, as well as future me the next time I need to set up a VSTS build.

Hexo Global License Plugin

Hexo Global License

One of the Pelican features I really liked was the global license plugin. This plugin took a configurable string representing a license statement and placed it into the footer of every page on the site. In my case, this was pretty handy as I license all of my blog content as CC-BY-SA and this way I did not have to remember to include an explicit license statement on each post.

Hexo did not have this functionality, so I sat down and built a Hexo plugin to mimic the behavior of the Pelican plugin. Thus, hexo-global-license was born. Since I use a Creative Commons license for all of my content, I built in support for all of the latest Creative Commons licenses. Add the relevant settings to the _config.yml file and the plugin will add the appropriate license text, image, and link to the Creative Commons website. If you want to use a different license, you can set the license type to custom and include whatever text you want in the configuration.

The plugin is available on npm, and once I use it some more and shake out any major bugs, I will submit it for inclusion in the Hexo plugins registry. If you try it and encounter problems, or if you have ideas to improve it, please file an issue on the GitHub page.

Live on Hexo

Finished

Hexo is Live

If you are reading this, then the transition to Hexo has completed successfully. Hopefully the redirects are all working properly and new articles are showing up in your favorite feed reader. This should remove a lot of my blogging friction and make it easier for me to write articles more often.

Thanks for reading, and I look forward to posting some new material soon.

Moving to Hexo

We're Moving

Moving to Hexo

Just a heads-up for anybody who subscribes to this blog via RSS or ATOM. Sometime within the next week I will be migrating this blog from Pelican to Hexo. Pelican has served me well the past several years, but it is starting to cause enough problems that I want to try another static site generator. Pelican is a little awkward to run under Windows. For Python programs like Pelican, I typically install them inside some form of virtual environment like virtualenv or an Anaconda environment. This means I have to remember to switch environments, which I usually forget to do until I get strange errors trying to work with Pelican. Managing plugins and themes is also somewhat cumbersome: I do not change either very often, and I end up having to read the documentation every time I want to add or update something. The final reason I am moving away from Pelican is that it is rather slow. It takes a good 30-40 seconds to regenerate the site every time I hit save in my editor, which is really painful when experimenting with the page layout for a new article.

After trying out a few different static site generators, I have settled on Hexo for the next version of my blog site. Hexo is a Node.js application and is a much smoother experience on Windows. During development, my site can regenerate in 3-5 seconds, which I really like. Plus, plugins and themes are all managed through npm which makes it easy to keep things updated.

The one downside to Hexo is that it will not allow me to have both RSS and ATOM feeds. Going forward, the site will have an ATOM feed and a new JSON Feed. I am going to have automatic redirects configured that should route requests from the old feed URLs to the ATOM feed. Most feed readers that I know of understand both RSS and ATOM, so this should be seamless for most readers. I have a test post queued up for next week after I make the move, so if you are not seeing new content by then, I would suggest manually updating your subscription.

Hopefully this will be a relatively smooth transition.

Thanks for reading!