Developer Journal - taskbook

Getting Started

I really like small, highly functional command line tools. I am a long-time user of Vim, and I am always on the lookout for additional tools that I can use without removing my hands from the keyboard. I am not necessarily against GUI tools, but I find myself to be most productive from a developer standpoint when I limit their use.

This time around, I want to take a look at a CLI productivity tool for managing tasks called taskbook.

Taskbook

Taskbook is a command-line program for managing tasks and notes. It has a nice, compact command syntax and a number of features for managing and categorizing tasks. Along with the basics of adding, deleting, and marking tasks complete, it can set categories and priorities, and even tag important tasks with a star. It is easy to install and includes comprehensive help. It is also multi-platform and works well on Linux, Windows, and macOS.

Prerequisites

Taskbook is written in JavaScript, and requires a reasonably recent version of Node.js. (I have used it with v8.x and v10.x)
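If you are not sure which version of Node.js you have installed, you can check it from the terminal:

>node --version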

Installation

From your favorite terminal program, enter the following to install taskbook:

>npm install --global taskbook

Usage

Once the installation process completes, you can try taskbook with the command:

>tb

By default, it will display the current list of tasks and notes. To view the help, enter this command:

>tb --help

With the help, it should be easy to get started entering and managing tasks and notes. More comprehensive information is available on the project's GitHub page.
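To give a feel for the syntax, here are a few commands I find myself using most often (the @blog board and the item ids are just examples):

>tb --task @blog Draft the next journal entry
>tb --note @blog Ideas for future posts
>tb --check 1
>tb --star 2

The @blog token files an item under a board, --check toggles an item between pending and complete, and --star marks it as important.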

How I Use It

My primary task management tool is a Bullet Journal I keep in a paper notebook. However, I like to keep small task lists and notes for some of my personal programming projects, and taskbook fills that niche. I mostly use it for small experimental projects, to avoid cluttering my paper notebook with failed experiments. It gives me an easy way to group the tasks and notes for my test projects and keep track of where I left off.

Developer Journal - exa

Introduction

So, it looks like my goal of posting on a weekly basis was a little too ambitious. I still think I can maintain a more regular posting schedule than I have in the past, but in the end I will need to make allowances for times when I do not have the time or energy to write. Now on to the main point of this post.

I have been a fan of command-line development tools for many years. While a good portion of my career has been spent in the Microsoft ecosystem, I still prefer CLI tools over the graphical tools integrated into development environments like Visual Studio. I would like to share some of my favorite CLI tools in the hope that it will encourage others to try them out and see how they can improve the development experience.

exa

exa bills itself as a modern replacement for the ubiquitous Linux ls command. Its purpose is to create a better file-listing experience by offering features that ls lacks and shipping with a more sensible set of defaults. It is written in Rust and ships as a single binary for Linux, with no additional dependencies or runtime requirements. At the moment, exa is a Linux-only tool, but so far I have had zero issues using it with the Windows Subsystem for Linux on Windows 10.

Out of the box, exa provides a much nicer and more colorful file listing than the default ls experience. Along with the nicer color defaults, exa includes additional sorting options, including the ability to sort on any field in the output, and built-in git awareness that displays the current git status of files and removes files matched by .gitignore from the output.

While exa supports some of the same options as ls, it is not a complete replacement for ls. I would avoid aliasing it to ls as it would likely break scripts that are expecting the standard ls options and output.
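Instead of aliasing over ls, a safer approach is to give exa its own short aliases and leave ls alone. A minimal sketch for a bash or zsh profile, assuming exa is already on your PATH:

alias ll="exa -l --git"
alias lt="exa --tree"

Here -l produces a long listing, --git adds a column showing each file's git status, and --tree prints a recursive tree view.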

Wrapping Up

If you find yourself wishing ls had better output, I would encourage you to give exa a try. It takes a little practice to remember to type exa instead of ls, but in my opinion it is worth it.

Weekly Journal 3 - Tool Containers

Introduction

This week I have something a little different. At work, we have been using containers to run our automation tools. Containers make it easy to share our tooling in a consistent manner, without the classic "It works on my machine!" issues. I have found that containers also make it easy to keep multiple versions of a tool available, and they are an excellent way to run Linux tools on Windows. It has worked so well that I have been experimenting with putting tools into containers for some of my personal projects, and I figured it might be fun to share one of them here.

HTTPie

HTTPie is a CLI HTTP client written in Python. It is similar to curl but, in my opinion, has a much nicer interface. Because it is a Python program, putting it into a container lets me choose the version of Python specifically for this tool. For example, if I need to run a tool that only works with Python 2 but the system Python on my machine is Python 3, I can create a container based on Python 2 without disrupting the system Python installation.

Dockerfile

I used Docker Desktop for Windows to create the HTTPie container, but this should work equally well with Docker for Mac or Docker Community Edition on Linux. To build the image, you first need to create a Dockerfile that defines the contents of the container. For HTTPie, I used this Dockerfile from GitHub:

FROM alpine:latest

RUN apk add --no-cache python3 && \
    pip3 install --upgrade pip setuptools httpie && \
    rm -r /root/.cache

ENTRYPOINT [ "http" ]
CMD ["--help"]

Building the Container

To build the container, run the following command from the directory that contains the Dockerfile:

>docker build -t alpine-docker/httpie .

After a minute or so, the container should be built and in the Docker image cache.
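You can confirm the image made it into the cache by listing it:

>docker images alpine-docker/httpie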

Setting Up the Alias

To run the container, type the following command:

>docker run -t -i alpine-docker/httpie

If everything has worked properly, you should see the HTTPie help text. Typing that full command every time you want to use a tool is awkward at best, so we will set up an alias to make it easier.

For Windows PowerShell, add these two lines to your profile:

function httpie_container() { docker run -i -t alpine-docker/httpie @Args }
Set-Alias httpie httpie_container

For Linux, add this line to your shell profile:

alias httpie="docker run -ti alpine-docker/httpie"
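One tweak worth considering for either alias: adding --rm to the docker run command, so each container is removed after it exits and exited containers do not pile up in docker ps -a:

alias httpie="docker run --rm -ti alpine-docker/httpie"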

With the alias in place, you can type httpie to run the container and HTTPie:

>httpie -v https://duckduckgo.com

Wrapping Up

For the most part, setting up individual tools in containers is a straightforward process. The primary challenge is building the Dockerfile. In many cases, at least for popular utilities, you can likely find an existing Dockerfile on GitHub or Docker Hub, and either copy it wholesale or use it as an example to get started. The next step for me is to build more of these containers so I can have an entire library of tools available.

Weekly Journal 2 - rcm

Introduction

A short post this time around. I have been experimenting with ways to manage and back up my dotfiles. In the past I have copied the files into a git repository and stored them on GitHub, but that approach requires me to remember to refresh those copies whenever I make changes. I have been looking for a better option that requires less manual work to keep things properly backed up.

rcm

rcm is a tool for managing dotfiles. You create a separate directory to hold your dotfiles (I called mine .dotfiles) and create a git repository in it. Then you use the rcm command mkrc to add your configuration files to the dotfiles directory. mkrc moves each file into the directory and creates a symlink in its original location. Once all of the configuration files have been moved, you can add them to the git repository and push them to GitHub, or your favorite remote git hosting service.

The really neat feature is that you can easily move your dotfiles to a different computer by cloning the git repository and running the rcup command. rcup automatically replaces the dotfiles on the new computer with symlinks to the files in the repository. There are additional commands and advanced configuration options to play with; check out the GitHub site and the man pages for all the details.
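To make the workflow concrete, here is roughly what the initial setup looks like (the file names and commit message are just examples):

>mkdir ~/.dotfiles && cd ~/.dotfiles && git init
>mkrc ~/.vimrc ~/.zshrc
>git add . && git commit -m "Add initial dotfiles"

And on a new computer:

>git clone <your-repo-url> ~/.dotfiles
>rcup -v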

Limitations

Similar to direnv, rcm is made for Unix-like systems like Linux. It probably works with WSL in Windows, but I have not had a chance to try it out there yet.

Weekly Journal 1

Introduction

One of the challenges I have had as an operations manager is managing configuration settings for the many tools and applications we use on a daily basis. I have been looking for a sane way of managing configuration settings that does not involve abusing my .zshrc/.bashrc files or creating dozens of one-off aliases. I think I have found a pretty good solution in the form of direnv.

direnv

direnv is a tool for automatically setting and switching environment variables. It works by adding a special configuration file, .envrc, to each project directory, containing the environment variables for that project. When you enter that directory in a shell like bash or zsh, direnv automatically sets those values in the current session. It also unsets the environment variables when you leave the project directory, which is nice from a security/credentials standpoint: it makes it more difficult to accidentally use the wrong settings because you have forgotten what was last set.

It also supports cascading configuration files. This means you can set global settings in the project root directory and more specific settings in a sub-directory.
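As a sketch, a project's .envrc can be nothing more than a few exports (the variable names here are made up):

export AWS_PROFILE=personal
export API_KEY=not-a-real-key

Note that after creating or editing an .envrc, direnv requires a one-time approval before it will load the file; run direnv allow from the project directory.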

Git Configuration

Since I tend to add things like API keys and other secrets to my .envrc files, I have a global ignore rule in git to prevent them from being accidentally added to a GitHub repository. In my case, I have a .gitignore_global file in my home directory that contains .envrc as a pattern to ignore, and git is configured to use it as the global ignore file.

>git config --global core.excludesfile '~/.gitignore_global'
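To verify the rule is working, you can ask git which ignore pattern matches a given file from inside any repository:

>git check-ignore -v .envrc

If everything is configured correctly, the output will point at the .envrc pattern in ~/.gitignore_global.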

Limitations

As near as I can tell, direnv only supports Unix-like systems such as Linux and macOS; it does not support Windows PowerShell. It may work with the Windows Subsystem for Linux, but I have not had a chance to try that yet. I use it on Linux, and it works very well there.

If you primarily work in a Linux or Unix environment I highly recommend giving it a shot.

Weekly Journal 0

Introduction

Over the last several years, I have struggled to maintain consistency with this blog. To try to improve my posting regularity, I am going to try something a little different. This weekly journal will be a short entry that I can put together once a week or so, documenting something interesting I have learned. In the past I have had trouble getting motivated to work on longer articles, but I am hopeful this will help me establish a more frequent posting schedule. The idea is that if I can establish a regular posting cadence, maybe I can use that to motivate myself to write longer posts too.

Personal Productivity

For my first journal post, I am going to write about changes to my personal project and task tracking system. For the past several months I have been using a Bullet Journal for tracking projects, tasks, and my personal journal. It has been working surprisingly well, and I have decided to adopt it over the mix of apps I was using previously for this purpose.

Bullet Journal

Bullet journaling is a methodology that uses a small notebook and a system of simple templates to quickly capture and organize tasks and notes. The full methodology is documented in the book by the system's creator, Ryder Carroll, but there is a reference and getting-started guide available for free on the website.

The basic idea behind bullet journaling is to capture tasks, events, and notes in the notebook during the day, and then to organize and process those items at regular intervals, called reflection in Bullet Journal lingo. During reflection, you process items by marking them complete, rescheduling them for later, or crossing them off if they are no longer relevant. At the beginning of each month, you perform a larger reflection and recopy any open tasks to a new page in the journal. At first glance, the copying exercise may seem redundant, but it forces you to focus on the highest-priority tasks. The review process helps prevent accumulating a large list of low-priority tasks that will never get done.

What I really like about the system is the flexibility it offers. Since it is a paper notebook, it is easy to adjust the templates or create new ones as needed. It is also easy to add bits and pieces from other productivity methodologies if they make sense for your situation. For example, I like many of the ideas in Getting Things Done, and I incorporate a few of them into my system. My notebook is fairly spartan, but other people create some spectacular layouts in their journals. You can find examples by searching for Bullet Journal with your favorite search engine, or by checking something like Pinterest. I lack the artistic talent necessary for the more elaborate layouts, and I would much rather spend time on my projects and tasks than on my notebook.

Prior to getting hooked on bullet journaling, I used a combination of an app like Todoist or Microsoft Todo to capture my projects and tasks, and OneNote to record my notes. Now I do most everything in my paper notebook instead. I still copy some of my information into OneNote so I can easily reference it on multiple devices and so I can use the search tools. However, everything initially gets created and managed in my notebook.

If you are looking to make changes or improvements to your personal productivity system, I would encourage you to give bullet journaling a try. With that, I am going to close this entry, hopefully the first of many to come. Thanks again for reading.

Note: In the spirit of full disclosure, I receive no compensation for this blog post. The links are not affiliate links; I am just a vocal fan of the system.

A New Direction


A New Direction

One-Year Anniversary

At the end of 2017, I left my role as part-owner of an IT consulting firm to take an in-house engineering manager role at a local software company. I have been in my new role as a manager for almost one year now. My employer affords me many of the amenities and perks that I enjoyed when I was self-employed, so I do not feel like I lost too much. The biggest adjustment so far is that I have not spent as much time doing actual technical work. I spend around half of my time performing my management duties, and with the remaining time I get to do some DevOps engineering work.

In a way, I get to experience the best of both worlds. I still continue to work on my technical skills, both at work and on personal projects at home, but I do not get to spend as much time in the trenches as I used to. So far, I like being a manager and plan on sticking with it. It has its own set of challenges, and I look forward to learning and doing more in the coming year. I am still a developer at heart, so I would like to stay close to the engineering side of the business.

The Blog

Focus Shift

My overall blogging output has been way down since I stopped the “Distributed Weekly” list back in 2013. I think I have wasted more time feeling guilty about not blogging than I have actually spent writing articles. I am at a crossroads where I need to decide whether or not I wish to continue this blog. I have never had any sort of advertisements or other financial support for it. While hosting is not overly expensive, there is no sense in spending money on a blog I am not going to keep updated.

2019 will be the year I make the hard decision. If I cannot keep a reasonable posting schedule through the first half of the year, it will be time to retire the blog. I have an idea to create a weekly journal around a combination of interesting links I have seen and new things I have learned. I am hoping this will give me new material to blog about and an incentive to write the articles.

The Alternative

If I am unable to maintain a regular schedule, my plan is to archive the blog in my GitHub account. The sources of my blog articles are stored as Markdown files, so they should render as-is in the GitHub web UI. If I am feeling motivated, I will attempt to deploy the rendered output using GitHub Pages. Either way, the articles themselves will be preserved on the off chance somebody other than myself wants to view them.

The first of the new journal articles should appear within the next week or so. If not, then it should be easy to see which direction this blog will go. Thanks for reading.

2018 Favorite Books

So, not my most active year for blogging, but I figure I can at least put together a quick post with my favorite books for the year. My reading this year reflects my transition to a dual-role DevOps engineer and manager. I also have a new plan for my blog in 2019, but I will cover that in a separate post.

Note: None of the links below are affiliate links. I get nothing from linking to these books. I try to link to the author or publisher site when possible, and Amazon as the default.

Favorite Books 2018

  • The DevOps Handbook - The definitive guide to implementing DevOps practices. If you are interested in getting into or learning more about DevOps, this book should be high on your list.
  • WTF? What’s the Future and Why It’s Up to Us - An interesting read on how modern technology is shaping the economy. Lots of questions and discussion around using technology to augment human workers instead of using it to replace them.
  • Building Evolutionary Architectures - An interesting study in how to design systems that will live for a long time. It covers architectural concepts and ideas that enable systems to adapt and change over time.
  • Kubernetes: Up and Running - A solid introduction to the world of container development and running them on Kubernetes. It covers setting up and configuring a basic Kubernetes cluster all the way through rolling deployments and scaling.
  • The First 90 Days - This was assigned as required reading by my employer. It is all about leaders in transition, whether starting a new job or moving into a new role through a promotion. Lots of strategies for getting up to speed quickly so you can be an effective leader.
  • Practical Monitoring - A solid monitoring strategy is a cornerstone of DevOps. This book offers strategies around metrics, logging, dashboards and much more without focusing on specific tools.
  • The Lean Startup - The classic book on agile and lean practices to accelerate innovation in startups and in new ventures in established businesses.
  • Faith Alone: The Doctrine of Justification - I enjoy reading about theology and church history. This is a great walk through the history of the doctrine of justification, and a systematic defense of it from the Reformed point of view.

Gobot and Arduino


Go and Arduino

I have been trying to get back into working with my Arduino, but I find myself wishing I could use a higher-level language like Python instead of C. While I have a decent grasp of C, I would much rather work with a language that has modern affordances like garbage collection. Luckily, I stumbled onto a Go framework that fits the bill. While Go is definitely lower-level than Python, it is higher-level than C and, at least in my opinion, nicer to work with.

Enter Gobot

The framework I discovered is called Gobot. Gobot is a framework for building drones, robots, and IoT devices. It has support for numerous devices and platforms, including the Arduino and the Raspberry Pi boards that I own. Just in case I need to set up another Arduino board in the future, here is how I was able to install the Firmata library on the Arduino board.

Getting Started

Install Firmata

Gobot uses the Firmata protocol to communicate with the Arduino, so the first step is to load the Firmata library onto the board. The Firmata library ships with the latest version of the Arduino IDE, so I downloaded and installed the IDE package and followed these instructions for installing Firmata.

  1. Download the latest version of the Arduino IDE and install it
  2. Connect the Arduino board to the laptop
  3. Configure the IDE with the board type and port
    1. For the board type, go to the Tools | Board menu and select the proper board from the list
    2. For the port, go to the Tools | Port menu and select the port that has the board name next to it
  4. Upload the Firmata sketch to the Arduino board
    1. Go to the File | Examples | StandardFirmata menu to open the sketch
    2. Click the Upload button and wait for the sketch to build and transfer
    3. When the process is complete, you should see the message, “Done Uploading” in the status window

The board should be ready to work with Gobot now.

Install Gobot

For our purposes here, I am assuming you already have the Go language and environment installed.

  1. Install Gobot into your local environment
    $ go get -d -u gobot.io/x/gobot/...
  2. Go grab a snack and drink while you wait for the framework to download and install

Execute a Sample Project

  1. Connect an LED to the Arduino using your favorite example circuit (I am using the directions that came with my Adafruit Arduino kit), and make a note of which pin it is connected to.
  2. Enter the following code sample from the Gobot Getting Started documentation and save it in a file called blink.go; if necessary change the pin number to correspond to the pin that the LED is connected to
    package main

    import (
        "time"

        "gobot.io/x/gobot"
        "gobot.io/x/gobot/drivers/gpio"
        "gobot.io/x/gobot/platforms/firmata"
    )

    func main() {
        // Connect to the Arduino over the serial port; adjust the
        // device path and pin number to match your setup
        firmataAdaptor := firmata.NewAdaptor("/dev/ttyACM0")
        led := gpio.NewLedDriver(firmataAdaptor, "13")

        // Toggle the LED once per second
        work := func() {
            gobot.Every(1*time.Second, func() {
                led.Toggle()
            })
        }

        robot := gobot.NewRobot("bot",
            []gobot.Connection{firmataAdaptor},
            []gobot.Device{led},
            work,
        )

        robot.Start()
    }
  3. Run the sample program
    $ go run blink.go

After the program compiles and connects to the board, you should see the LED blink. At this point, everything should be ready for more advanced projects.

Next Steps

Now that I have my Arduino up and running with Gobot, my next challenge is to go back and redo the sample projects that came with my kit and see if I can get them all working in Go. Once I finish, I figure it will have given me enough practice to start working on something more elaborate.

Until next time, thanks for reading!

New Year, New Adventure


New Job

In my last post, I mentioned that I was looking for a new job. Late last year, I accepted an offer to become the DevOps Manager at VHT here in Akron, OH. I held off announcing it until now to allow StoneDonut to communicate my exit to their customers and partners. I took a much-needed vacation for the first two weeks of this year, and have spent the past two weeks getting settled into my new position in DevOps.

Devving the Ops

I am very excited to be joining the DevOps team at VHT. I really enjoy working on tough technical problems and building tools and platforms for other developers. I think the DevOps space will be a great fit for my interests and my career growth. The opportunity at VHT was especially attractive as it will give me the opportunity to grow as a technical manager, while still allowing me to perform hands-on work as well. I am looking forward to using my development and integration experience to build the next-generation CI/CD platform for VHT’s development teams.

The Only Constant is Change

As one might expect, this career move will bring changes to my personal projects, including this blog. For most of my career, I have primarily worked with .NET, BizTalk, Azure, and other parts of the Microsoft stack. However, VHT is primarily a Linux and Amazon Web Services shop. As a result, this blog will shift to focus more on Continuous Delivery, AWS, and open source tools like Ansible and Jenkins. I plan on keeping somewhat current with C#, .NET Core, and parts of Azure, but I will not be writing posts about BizTalk and enterprise integration any time soon. (Though maybe I will write some posts about integrating open source tools as part of a CD pipeline!) I am also hoping to write about the custom tools and platforms we build at VHT.

On one hand, I am sad to leave StoneDonut. After 10 years, I definitely consider my former partners Chuck and Michael as mentors and friends. I most certainly would not be here without their help. However, I am also really excited to see where this new road leads. Thanks again for reading.