Dynamic Inventory


EC2 Build Workflow

Using EC2 in your build workflow means provisioning a lot of instances over the course of a day, perhaps dozens or hundreds.

Unless you have a crystal ball, you don’t know the IPs or DNS names of these instances ahead of time. This creates a chicken-and-egg problem around using and managing them: how can you log in to an instance that was created moments ago if you don’t yet know how to address that host?

Inventory

If you use Ansible, you’re already familiar with the concept of inventory. You can easily manage hundreds or thousands of systems from a single playbook as long as you have listed them in an inventory file. This is essentially a hosts file that identifies, by name or IP address, all the systems - the inventory - that you want to manage. Ansible reads that inventory file at playbook run time, and the playbook iterates over the host list to run whatever tasks are specified.
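For example, a minimal static inventory file is just an INI-style list of names and addresses, optionally grouped - every host and group name below is illustrative:

# hosts - a static inventory file; all names and addresses are examples
[webservers]
web1.example.com
web2.example.com

[databases]
10.0.1.50
10.0.1.51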

So how do you handle dynamically generated inventory? These host names and IP addresses don’t yet exist at the moment your build process begins.

The answer: use a variable_host variable in your Ansible hosts definition.

Dynamic Inventory

In your ansible playbook, define your hosts like this:

- hosts: "{{ variable_host | default('localhost') }}"
  remote_user: ec2-user
  roles:
    - aws

This definition holds the configuration that’s critical to using EC2 with Ansible:

  • sets hosts to a variable - variable_host - that is used whenever the variable is set
  • falls back to the default value of localhost if variable_host isn’t set
  • sets remote_user to ec2-user for ssh logins to EC2 instances (the default user on Amazon Linux AMIs)
  • assigns the aws role, where all your tasks are defined

The variable_host variable can be set to match any host(s) that share some criterion, such as all hosts carrying the tag Name=build_123_spec - a tag that an Ansible task can apply immediately after provisioning the new instances.
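As a rough sketch, that provisioning task inside the aws role might look like the following, using the classic ec2 module - the key pair, AMI, and region are placeholders, and instance_tag and count arrive as extra-vars, as shown in the workflow below:

- name: provision EC2 instances for this build
  ec2:
    key_name: build-key             # placeholder key pair name
    instance_type: t2.micro
    image: ami-0123456789abcdef0    # placeholder AMI ID
    region: us-east-1               # placeholder region
    count: "{{ count }}"
    instance_tags:
      Name: "{{ instance_tag }}"
    wait: yes                       # block until the instances are running
  tags:
    - provision-ec2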

The default value of localhost is required for tasks that operate solely against the EC2 API - that is, tasks that manage instances from outside a login session, such as provisioning new instances and then controlling their state. Localhost in this case means the Ansible control host that is running the commands. variable_host, by contrast, is used for every task that requires a session, or login, on the instance: it establishes the host name or IP address for the ssh login that Ansible sets up for that task.
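By contrast, a task that needs a login on the new instances - say, a hypothetical clone step - runs over ssh as ec2-user on whichever hosts variable_host resolves to:

- name: clone the repository onto the build instance
  git:
    repo: https://example.com/project.git    # placeholder repository URL
    dest: /home/ec2-user/project
  tags:
    - clone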

Bring it All Together

Your workflow would look like this:

ansible-playbook aws.yml --tags "provision-ec2" --extra-vars "instance_tag=build_123_spec count=5"
ansible-playbook aws.yml --tags "clone,build,deploy" --extra-vars "variable_host=tag_Name_build_123_spec"

The above two commands, run in this order, will do the following:

  • provision five new EC2 instances and tag each with the key/value pair Name=build_123_spec
  • run the clone, build and deploy tasks on any host in the inventory that has the tag we just applied
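One assumption worth noting: the group name tag_Name_build_123_spec follows the naming convention of Ansible’s EC2 dynamic inventory script, ec2.py, so that script must be supplied as the inventory source for the second command to resolve the group - either via -i on the command line or in ansible.cfg. For example:

ansible-playbook -i ec2.py aws.yml --tags "clone,build,deploy" --extra-vars "variable_host=tag_Name_build_123_spec"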
