Monday, March 30, 2015

Jenkins slaves in AWS

This is just a short post on a problem that I was having running Jenkins in the Amazon cloud (AWS).  Our setup consists of a master Jenkins server (running Amazon Linux) and a Jenkins slave (running Windows Server 2012 R2).  The slave was connected to the master through an ELB (Elastic Load Balancer) so that if/when the master had to change IP addresses I wouldn't have to mess with any DNS changes.  This turns out to be a bad idea.

You see, Jenkins wants to keep an open TCP connection between the master and all the slaves.  That way it can keep tabs on what's going on, schedule new jobs, etc.  The problem with using an ELB is that it likes to close idle connections, by default after 60 seconds.  The result was that my slave logs were showing a disconnect error just about every minute.

This is far from ideal, as when the slave disconnected it was causing any running jobs to fail.  :(  The solution was pretty simple: remove the ELB, create a Route53 record set pointing to the master, and use that for all the slave communications.  Ever since this simple change I've had few (if any) communication problems.  Yay!  :)
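For reference, the Route53 record set is nothing exotic; a change batch along these lines (the record name and address below are hypothetical), fed to aws route53 change-resource-record-sets, does the job:

```json
{
  "Comment": "Stable name for the Jenkins master",
  "Changes": [{
    "Action": "CREATE",
    "ResourceRecordSet": {
      "Name": "jenkins-master.example.internal.",
      "Type": "A",
      "TTL": 300,
      "ResourceRecords": [{ "Value": "10.0.0.10" }]
    }
  }]
}
```

Point the slaves at that name instead of the ELB's DNS name and the long-lived TCP connection is left alone.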

Thursday, November 14, 2013

(Ab)using Chef in AWS OpsWorks.

AWS OpsWorks

AWS OpsWorks is a wonderful tool that I'm growing more and more fond of every day.  However, not being familiar with Chef, I've been hamstrung by what I can do with it.  I wanted to be able to create a clean, simple layer but still retain the ability to deploy code automatically to it.  The "other" layer type was perfect, except it didn't include any deployment tools (or other goodies).  So, after banging my head against the wall for a couple of days, I have some tips to share.

Tip #1 - Chef-Solo is globally scoped

OpsWorks uses Chef-Solo, which is globally scoped.  This means that there is no Chef server and you don't have to do any funky namespace things in your custom Chef recipes.  I was trying to copy the OpsWorks Chef recipes into my custom cookbooks, and trying to use Git submodules, and all kinds of funky stuff.  The simple answer is that you don't have to do any of that; just use it like it's in your custom cookbook and it will work.
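For example, a custom recipe can reference the built-in OpsWorks cookbooks directly, no copying or submodules required.  A minimal sketch (the cookbook path is made up, and I'm assuming the instance attributes OpsWorks normally provides):

```ruby
## my-cookbooks/mystuff/recipes/default.rb  (hypothetical custom recipe)
# Chef-Solo loads every cookbook into a single flat namespace, so the
# built-in OpsWorks recipes and attributes are directly visible here:
include_recipe "opsworks_bundler"

log "running in #{node[:opsworks][:instance][:region]}"
```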

Tip #2 - Drop the metadata

Just a small tip here, Chef-Solo doesn't use the metadata.rb file so there is no need to include it in your cookbook. Unless you really want to, that is.

Tip #3 - Just do it!

You don't have to create your own custom cookbook if you don't want to.  If you find something in the opsworks-cookbooks that you want to use, say opsworks_bundler, then just include it in the Custom Chef Recipes section of the layer edit page:

OpsWorks option screenshot


As with most of my blog posts, this one is written with future me in mind.  If I don't write it down, I'll forget it.  However, it will hopefully help someone else that is in my position of only knowing enough Chef to be dangerous.  I really shouldn't have had to bang my head against this for so long.  :/

Friday, April 12, 2013

Setting your own ID using AWS::Record::HashModel

June 2014 Update: This no longer works with the newer versions of the AWS SDK.  This information is retained merely for historical reasons and you should not use the code below.  You have been warned.

In a recent project I needed to save things to DynamoDB and be able to use my own values for the ID. The problem is that by default AWS::Record::HashModel sets the ID field for you automatically and you can't change it. I asked about it on the official AWS forums and the answer was that it can't be done. At least not yet.

So I had to come up with a solution myself. It's not pretty, but monkey patching does come in handy sometimes. So, for any others out there who are trying to solve the same problem, here's what I did.
  1. Create a monkey_patch.rb file somewhere in your project where it will be loaded after the main AWS Gem.
  2. Add the following code to the monkey_patch.rb file:

    module AWS
      module Record
        module AbstractBase
          module InstanceMethods
            # Allow the user to override the ID
            def populate_id
              @_id = @_data['id'] || UUIDTools::UUID.random_create.to_s.downcase
            end
          end
        end
      end
    end

  3. Add an "id" field to your AWS::Record::HashModel object.

    class MyModel < AWS::Record::HashModel
      string_attr :id
    end

  4. Now, whenever you call the create method on MyModel, if you provide an ID in the hash it will be used as the hash_key value.

    MyModel.create(:id => "some_value_that_I_make_up")

Two things to be aware of, though:
  1. You have to add some other attributes to the model before it will be persisted properly. Just having an "id" won't cause it to save properly.
  2. It's up to you to ensure that you don't create multiple models with the same ID.
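The fallback logic the patch relies on is easy to see in isolation; here's a standalone sketch (using the stdlib's SecureRandom in place of UUIDTools):

```ruby
require 'securerandom'

# Mirrors the patched populate_id: a caller-supplied 'id' wins,
# otherwise fall back to a random lowercase UUID.
def resolve_id(data)
  data['id'] || SecureRandom.uuid.downcase
end

puts resolve_id('id' => 'some_value_that_I_make_up')  # => some_value_that_I_make_up
puts resolve_id({})  # => a random UUID
```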

Monday, April 01, 2013

Rake task to deploy to AWS OpsWorks


I've spent the day building a deploy Rake task for my AWS OpsWorks environment and I thought I would share it with you.

My project is a pretty simple Sinatra application that is being hosted in Amazon's cloud environment (AWS) with the help of the OpsWorks tools.  One of the things I wanted to do is to make it easy to deploy the application from the command line.  I guess I've gotten lazy with Heroku's "deploy from a Git push" and I wanted something similar in my new environment.

After working out a couple of kinks, it wasn't too difficult really.  Here's how to set it up in your environment.  Basically, you just need to create a couple of files and fill in some default values from your OpsWorks environment.

The opsworks.yml file defines all the layer/app ID's in all the regions you want to deploy to.
# config/opsworks.yml
us-east-1:
  layer_id: "your-layer-id"
  app_id: "your-app-id"

us-west-2:
  layer_id: "your-layer-id"
  app_id: "your-app-id"
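Once loaded, that file is just a hash keyed by region name (the region names and IDs here are made-up examples), which is what the Rakefile's regions.each iterates over:

```ruby
require 'yaml'

# Same shape as config/opsworks.yml, inlined for illustration:
regions = YAML.load(<<-YML)
us-east-1:
  layer_id: "layer-east"
  app_id: "app-east"
us-west-2:
  layer_id: "layer-west"
  app_id: "app-west"
YML

regions.each do |region, options|
  puts "#{region.upcase}: app #{options['app_id']}, layer #{options['layer_id']}"
end
```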

The Rakefile defines the rake tasks needed to deploy your software. Just make sure that you have the 'aws-sdk' Gem installed and valid AWS credentials. I'm using a config/aws.yml file but you can do whatever you like.
# Rakefile
require 'rubygems'
require 'bundler'
require 'yaml'
require 'socket'
Bundler.require if defined?(Bundler)

# Authenticate to AWS (credentials are read from config/aws.yml)
AWS.config(YAML.load(File.read('config/aws.yml')))
client = AWS::OpsWorks.new.client

desc "Deploy the app to the LIVE environment"
task :deploy do
  regions = YAML.load_file('config/opsworks.yml')

  deploy_options = {}
  deploy_options[:command] = {name:'deploy'}

  # Loop through each region
  regions.each do |region, options|
    deploy_options[:instance_ids] = []
    deploy_options[:app_id]   = options['app_id']
    deploy_options[:comment]  = "rake deploy from '#{Socket.gethostname}'"
    instances = client.describe_instances({layer_id: options['layer_id']})

    next if instances.nil? || instances.empty?

    # Capture the details for each 'online' instance
    instances[:instances].each do |instance|
      if 'online' == instance[:status]
        deploy_options[:instance_ids] << instance[:instance_id]
        deploy_options[:stack_id] = instance[:stack_id]
      end
    end

    puts "Deploying to #{deploy_options[:instance_ids].count} instances in the #{region.upcase} region..."
    client.create_deployment deploy_options
  end
end

Just run rake deploy to deploy your code to any "online" instances that match the regions/apps/layers in your config file.

Thursday, March 28, 2013

Integrating ELB into an OpsWorks stack

UPDATE: Amazon has now built ELB capability into OpsWorks, so you no longer need to use this workaround.

As you might have guessed by my recent posts I'm experimenting with the AWS OpsWorks platform.  I have been generally very pleased with everything except its lack of integration with existing AWS tools, specifically Elastic Load Balancers.  OpsWorks does include an HAProxy stack that you can use to load balance your system, but it lacks several features that ELB has and I find valuable.

To start, ELB is just easier to set up and manage.  Basically, you don't have to do anything at all with it; just create an ELB instance through the GUI and attach your instances to it.  It also provides multiple SSL termination points and automatically scales based on load.  To top it off, it's cheaper than running your own HAProxy stack through OpsWorks.  :)

The problem is that OpsWorks doesn't provide a way to integrate with ELB, at least not yet.  So you have to wire it up yourself.  This is not too big of a deal if you're already familiar with the AWS command line tools and Chef, which is what Amazon uses to manage their instances.  Unfortunately, I'm not super familiar with these things so I had to blindly work my way through the setup.  Here's how I did it.

1) Create an ELB in the normal way and give it a name, e.g. "my-project-lb".

2) Create a custom Chef cookbook and check it into Github or store it in an S3 bucket or something.  It should look something like this:
$ mkdir -p cookbooks/aws/recipes
$ touch cookbooks/aws/recipes/default.rb
$ touch cookbooks/aws/recipes/register.rb
$ touch cookbooks/aws/recipes/deregister.rb

3) Edit the recipes in your cookbook to look something like this, replacing code as necessary:
## cookbooks/aws/recipes/default.rb
package "aws-cli" do
  action :install
end

## cookbooks/aws/recipes/register.rb
include_recipe "aws"

execute "register" do
  command "aws --region #{node[:opsworks][:instance][:region]} elb register-instances-with-load-balancer --load-balancer-name my-project-lb --instances '{\"instance_id\":\"#{node[:opsworks][:instance][:aws_instance_id]}\"}'"
  user "deploy"
end

## cookbooks/aws/recipes/deregister.rb
include_recipe "aws"

execute "deregister" do
  command "aws --region #{node[:opsworks][:instance][:region]} elb deregister-instances-from-load-balancer --load-balancer-name my-project-lb --instances '{\"instance_id\":\"#{node[:opsworks][:instance][:aws_instance_id]}\"}'"
  user "deploy"
end

4) Add your new cookbook to the custom Chef cookbooks setting of your Application stack.

5) Finally, add your new recipes to the proper stack lifecycle events.  I chose Configure to run the "aws::register" recipe and Shutdown to run the "aws::deregister" recipe.

Now just sit back and enjoy your new, cheaper, easier-to-use auto-balancing OpsWorks system.  :)

Friday, March 22, 2013

Rack Applications under AWS OpsWorks

AWS OpsWorks

First a bit about AWS OpsWorks, it's Amazon's answer to operating your own datacenter.  With OpsWorks you can do things like easily autoscale your systems by time or load, deploy your applications from Github (or S3), monitor your stack with Ganglia, and a whole raft of other things.


The OpsWorks architecture is designed around what they call "stacks".  These stacks are just groups of components that work together to do whatever task you want them to do (i.e. provide a web service of some kind).  The stack is broken up into layers, with each layer having a unique job within the stack (load balancer, app server, DB server, etc.).

Rails (Rack) Layer

The layer I'm concerned with in this post is the application server layer.  There are several different types of apps that are supported out of the box: PHP, Rails, Node, and static HTML.  I wanted to run a Ruby Sinatra application, which is close to a Rails app, but not quite.  It turns out that the "Rails" layer is actually a "Rack" application layer and natively supports anything that runs under Rack, such as Sinatra.  :)

All you have to do to get your Rack application running in OpsWorks is create these three files in your project and deploy your application just like normal.  :)

# Gemfile
source ''

gem 'sinatra'
gem 'unicorn'


# app.rb
require 'rubygems'
require 'sinatra'

get '/' do
  "hello world!"
end

# config.ru
require 'sinatra'
set     :environment, :production
disable :run
require './app.rb'
run Sinatra::Application
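Under the hood, the object handed to run just has to honor the Rack contract: respond to call(env) and return a [status, headers, body] triple.  A gem-free sketch of that contract:

```ruby
# A minimal Rack-style app: any object with a call(env) method returning
# [status, headers, body] will do -- Sinatra::Application is just a
# (much) fancier version of this lambda.
hello = lambda do |env|
  [200, { 'Content-Type' => 'text/plain' }, ['hello world!']]
end

status, headers, body = hello.call({})
puts status       # => 200
puts body.join    # => hello world!
```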

Monday, March 18, 2013

Programming Choices


I frequently ask myself what languages are on top in my mind.  In the past I've loved almost everything from PowerBuilder to Perl to Java.  There are a couple of languages I've really disliked (looking at you, Cobol) but I don't have to work with them much so it's not much of a problem.  Anyway, here's what I would use to build something today.

Things I would use


Ruby is very solid in my mind and is my go-to language for almost any job.  It's been criticized in the past as slow and awkward, but those days are gone and today's Ruby is a joy to work with.  I love the syntax of the language and how everything seems to be named properly.  I rarely have to look up documentation as my first guess is usually correct.  Its REPL (irb / Pry) is beyond compare and almost removes the need to debug.  I once thought that nothing could rival the Java JAR ecosystem, but I think that Ruby Gems is more extensive, more modern, and beats it in almost every way that counts.


Go has really piqued my interest in the last 6 months or so.  There are more and more people talking it up and several companies releasing major projects that are written in Go.  The upsides are plenty, from a fresh, clean syntax, to the out-of-the-box tools (gofmt, gofix, etc.), to the performance.  The community is very active and it's backed by a major computing powerhouse (Google).  Still, it's a fairly new and immature language and I think you would be hard pressed to get your company to start building mission critical apps in it.  Well, unless your company is forward thinking and willing to take some risks to reap major rewards.


Java was in my toolbox for a long time, but these days I just really don't like using it.  Sure, it's fast, can run almost anywhere, and has lots of libraries, but it just leaves me empty.  I think my feelings turned when Sun was sold to Oracle and I saw the writing on the wall.  Since then it just seems like the world has abandoned the language, along with Oracle.  The fact that most Java code is "enterprise"-ready doesn't really help matters.  Most of the Java code I see in the wild is ultimately flexible and ready to be used for almost any purpose.  It's also nested so deeply that it's extremely hard to follow the code logic and I'm afraid to touch it.  That said, it's not really the language's fault that people write it that way.

Don't even look at me


What can I say about PHP that hasn't been said before?  I think it has uses in quick and dirty projects, but using it for anything of production value is just a fool's errand.  Worse than that, I firmly believe that writing PHP is bad for your programmer brain.  It teaches the wrong things and guides you down the wrong paths.  Hell, they put a "goto" operator into the language in 2009!  Jeff Atwood commented that the way to make PHP go away is to provide better alternatives, and I believe that Ruby is finally becoming that alternative.  It's easier to write, faster, and easier to deploy (Bundler).  Hosting solutions (Heroku) and cloud providers (AWS OpsWorks) are building one-button deploy systems that match or exceed PHP's.


Look, my point is that there is no one solution for every job.  Next time you start a project, take a long hard look at the problem you are solving and pick the best tool.  Ruby is great for small utilities, miscellaneous web-based work, and (with Rails) the perfect solution for CRUD websites.  It even works for game programming and writing iOS applications.  I find Go a perfect fit for heavy-duty server systems (image processing, number crunching, etc.).  Do what fits your job and makes you happy.