DevOps for the database is as glorious as bringing ice to a party. You absolutely need to have ice but no one is going to say “Wow! Great ice at this party!” Of course, if you forget it, you will absolutely hear complaints about warm drinks. The same is true with DevOps: If you forget the database, you’re going to have a bad release process.

Research from Dr. Nicole Forsgren’s 2018 State of DevOps report backs this up:

“Teams that do well at continuous delivery store database changes as scripts in version control and manage these changes in the same way as production application changes.” — 2018 State of DevOps Report

Titan goes a long way in solving the problem of database delivery and speeding development workflows. It’s as simple as cloning an existing database locally, making changes, and persisting those changes for others to use. We’ve had the process for a while with application code development. It’s about damn time we have that for the data.

Evolving the Database Structure

Of course, the next challenge is how do we get those structural changes (i.e., DDL for new tables, columns, etc.) to our later environments. We can’t just ship a Titan commit hash over to the SRE team and say “Use this one!”. We must evolve our database structure so that we do not lose production data. By using Liquibase and Titan, not only are we able to speed development efforts, but we can also accelerate our software delivery to test, production, and all points in between.

You need to decrease the time spent setting up databases for development (Titan), and you need to store your database changes in source code control (Liquibase).

Using Liquibase and Titan

First, read Eric Shrock’s post and download Titan. For this example we’ll assume you’re just running the data found in the hello-world postgres repository:

titan install
titan clone s3://titan-data-demo/hello-world/postgres hello-world
titan ls

This shows that Titan is up and running and that your Titan-created Postgres database is ready to go.

A few housekeeping items need to be handled before we get into it:

- Create a directory for the Postgres JDBC jar. I used /home/r2/jdbc. Make sure it is world readable, as you’re going to mount it on the Liquibase docker container.
- Create a directory for the Change Log. I used /home/r2/changelog and made sure it was world readable so the Liquibase container can write to it.
- Determine the IP address your Titan Postgres container is running on. Just issue docker network inspect bridge and find the IP address of your titan container. Mine was 172.17.0.3.
- Create an alias to avoid lengthy arguments. Remember to set ${JAR} first.

alias liquibase='docker run -v /home/r2/jdbc:/liquibase/jdbc -v /home/r2/changelog:/liquibase/changelog liquibase/liquibase --driver=org.postgresql.Driver --classpath=/liquibase/jdbc/${JAR} --url="jdbc:postgresql://172.17.0.3:5432/postgres" --changeLogFile=/liquibase/changelog/changelog.xml --username=postgres --password=postgres'
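If you’d rather not read the IP out of the `docker network inspect` dump by hand, you can ask Docker for it directly. A small sketch, assuming the Titan container is named hello-world on the default bridge network (the container name is an assumption here; check `docker ps` for the actual name Titan used):

```shell
# Hypothetical helper: print a container's bridge-network IP address.
# "hello-world" below is an assumed container name; adjust to match yours.
titan_ip() {
  docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "$1"
}

# Usage (expects the Titan container to be running):
#   TITAN_IP=$(titan_ip hello-world)
#   echo "jdbc:postgresql://${TITAN_IP}:5432/postgres"
```

You can then splice that value into the `--url` argument instead of hard-coding 172.17.0.3.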

2. Next, let’s get a Change Log for our Titan-provided Postgres database:

liquibase generateChangeLog

This will create a changelog.xml file that is the baseline of your database. You can view it on your local system in your “changelog” folder.

Pro tip: Liquibase supports XML, JSON, YAML, and annotated SQL. Simply change the extension to get a different format (e.g., changelog.xml becomes changelog.yaml). SQL would have to be changelog.postgres.sql to tell Liquibase you want a Postgres compatible annotated SQL file.
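For example, the newtable changeset added later in this walkthrough would look roughly like this in Liquibase’s YAML changelog format (a sketch for illustration, not generated output):

```yaml
databaseChangeLog:
  - changeSet:
      id: "2"
      author: r2
      changes:
        - createTable:
            tableName: newtable
            columns:
              - column:
                  name: colname
                  type: VARCHAR(255)
```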

3. Next, you need to record that the changesets in your newly created changelog have already been applied to the schema. Of course, by definition they have. So, we’re going to synchronize the file with the schema. Liquibase does this by creating a DATABASECHANGELOG table and marking each changeset as already run.

liquibase changeLogSync

Examine the table with the following command:

psql postgres://postgres:postgres@localhost/postgres -t -c 'SELECT * FROM DATABASECHANGELOG;'


4. Finally, persist the updated database to the titan server with the following:

titan commit -m "baselined with Liquibase" hello-world

Then check the changelog.xml file into the same source code repository as your application.
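Checking it in is ordinary git. A self-contained sketch, using a throwaway repo under a temp directory to stand in for your real application repository (the file contents below are a placeholder, not the generated changelog):

```shell
# Illustrative only: a temp directory stands in for your application repo.
APP_REPO=$(mktemp -d)
git -C "$APP_REPO" init -q
mkdir -p "$APP_REPO/db"
# Stand-in for copying /home/r2/changelog/changelog.xml into the repo.
printf '<databaseChangeLog/>\n' > "$APP_REPO/db/changelog.xml"
git -C "$APP_REPO" add db/changelog.xml
git -C "$APP_REPO" -c user.email=you@example.com -c user.name=you \
    commit -q -m "Baseline database changelog (Liquibase)"
git -C "$APP_REPO" log --oneline
```

The point is simply that the changelog lives and versions alongside the application code, per the State of DevOps guidance quoted above.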

5. Now let’s make a change. But instead of just going to the database and monkeying around with SQL, let’s add the following new changeset to the XML.

<changeSet author="r2" id="2">
    <createTable tableName="newtable">
        <column name="colname" type="VARCHAR(255)"/>
    </createTable>
</changeSet>

Save the file and run the following to make sure you have updated the file correctly:

liquibase validate

Now issue this to see the SQL that will be run on the server:

liquibase updateSQL

Finally, run the following to persist the new table:

liquibase update

You can also verify that the table was created:

psql postgres://postgres:postgres@localhost/postgres -t -c '\d newtable;'

Once it all looks good, commit your data changes to Titan and your updated Change Log to source code control.

titan commit -m "Created table newtable with Liquibase." hello-world

That’s it! You’ve done it. Now you can use Liquibase to apply database schema changes in non-development environments. After all, your software isn’t done until it’s in production. That’s the DevOps way.
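Promoting to those later environments can reuse the same docker invocation with a different URL. A hypothetical wrapper, assuming the same jar path and mount points as the alias above; the host name and credentials are placeholders that would come from your CI system’s secret store in practice:

```shell
# Hypothetical helper for running the same changelog against a later
# environment. Mount points and jar path match the development alias; the
# URL, username, and password are placeholders supplied by the caller.
liquibase_update_env() {
  local url="$1" user="$2" pass="$3"
  docker run \
    -v /home/r2/jdbc:/liquibase/jdbc \
    -v /home/r2/changelog:/liquibase/changelog \
    liquibase/liquibase \
    --driver=org.postgresql.Driver \
    --classpath=/liquibase/jdbc/${JAR} \
    --url="$url" \
    --changeLogFile=/liquibase/changelog/changelog.xml \
    --username="$user" --password="$pass" \
    update
}

# Example (placeholder host and credentials):
#   liquibase_update_env "jdbc:postgresql://test-db.example.com:5432/postgres" "$DB_USER" "$DB_PASS"
```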

Summing it up

Titan and Liquibase can certainly help you, the developer. But, you can also help your friends in Test and Operations. And that’s also the DevOps way. Don’t forget to bring the database/ice to the DevOps party. Your friends in Test and Operations may not rave about the great ice, but you’ll never hear them complain about warm drinks.

As we’ve learned from every State of DevOps report, communication matters. When teams work together and have visibility into the database change process, database changes don’t slow them down or cause problems when they perform code deployments.