Gift list dev diary: parser tests

All development for this project is shared on GitHub at https://github.com/codonnell/mygiftlist-blog. On this blog I'll focus on the most relevant snippets of code rather than walk through all of it; interested readers can survey the minor details in the repo.

In this post we'll write a couple of unit tests for our resolvers. The core business logic of our backend application lives in the resolvers, and they change rapidly, which makes them the most important part of our application to test. We're using Docker to host our test database and a Makefile for a bit of convenience, but I won't cover how that works here. The git commit has the full changeset.

I like to run my tests against a database separate from my development one. This matters to me because I often need to clean up database state between test runs, and I would prefer not to truncate my development database by accidentally running a test against it. In fact, while working on the code for this post, it took me all of five minutes to do exactly that for the first time.

After struggling to get mount to run tests against a separate system of components, I decided to switch to integrant. With integrant, system components are passed to functions as values instead of being accessed as global vars, as mount does. This means we can create an entirely new system value to pass around in test code, which eliminates the possibility of truncating the development database by accident.
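For readers new to integrant, the core idea can be sketched like this. The component keys and values below are hypothetical, not the project's actual ones; they only illustrate that a system is plain data that `ig/init` turns into a plain value:

```clojure
(ns example.system
  (:require [integrant.core :as ig]))

;; A system config is just data: each key names a component, and
;; ig/ref declares a dependency on another component.
(def config
  {::db-pool {:jdbc-url "jdbc:postgresql://localhost:15433/postgres"}
   ::parser  {:pool (ig/ref ::db-pool)}})

;; ig/init-key tells integrant how to build each component. In a real
;; app ::db-pool would open a connection pool here.
(defmethod ig/init-key ::db-pool [_ opts]
  (assoc opts :pool ::opened))

(defmethod ig/init-key ::parser [_ {:keys [pool]}]
  {:parser ::built :pool pool})

;; ig/init returns the running system as a value we can pass around
;; (or rebind in tests), rather than a set of global vars as in mount.
(def system (ig/init config))
```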

Since we're using deps.edn to manage our dependencies, we'll use the Cognitect test runner. As the project grows and we accumulate more tests, we can consider adopting a more feature-rich test runner.
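Wiring the Cognitect test runner into deps.edn looks roughly like this. This is an illustrative config fragment; the alias name and git coordinates shown here are the ones from the test runner's README, not necessarily what this project pins:

```clojure
;; deps.edn (sketch)
{:aliases
 {:test {:extra-paths ["test"]
         :extra-deps {io.github.cognitect-labs/test-runner
                      {:git/tag "v0.5.1" :git/sha "dfb30dd"}}
         :main-opts ["-m" "cognitect.test-runner"]}}}
```

With this in place, `clojure -M:test` discovers and runs the test namespaces.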

Fixtures

In order to test our resolvers, we need an isolated environment for each test. In our case, that means connecting to a test database and truncating all the tables between tests. We'll do this with a couple of fixtures.

```clojure
(ns rocks.mygiftlist.test-helper
  (:require [rocks.mygiftlist.config :as config]
            [rocks.mygiftlist.db :as db]
            [next.jdbc :as jdbc]
            [integrant.core :as ig]
            [clojure.java.io :as io]
            [clojure.string :as str]))

(def system nil)

(defn use-system
  "Test fixture that initializes system components and sets it as the
  value of the `system` var, runs the test, then halts system
  components and resets `system` to nil. If no system components are
  passed in, initializes and halts the full system."
  [& component-keys]
  (fn [test-fn]
    (alter-var-root #'system
      (fn [_]
        (let [ig-config (merge
                          (ig/read-string (slurp (io/resource "system.edn")))
                          (ig/read-string (slurp (io/resource "test.edn"))))]
          (if (seq component-keys)
            (ig/init ig-config component-keys)
            (ig/init ig-config)))))
    (test-fn)
    (ig/halt! system)
    (alter-var-root #'system (constantly nil))))
```

The first fixture is use-system, which takes an optional set of component keys. We use it to spin system components up and down for our tests. Note that it merges in the test.edn system map, so the :test config profile is used. A quick peek at our configuration map shows that this has the effect of connecting us to a different (test) database.

```clojure
{:database-spec {:username #or [#env POSTGRES_USER "postgres"]
                 :password #or [#env POSTGRES_PASSWORD "password"]
                 :server-name #or [#env POSTGRES_HOSTNAME "localhost"]
                 :port-number #long #profile {:dev 15432
                                              :test 15433
                                              :prod #env POSTGRES_PORT}
                 :database-name #or [#env POSTGRES_DB "postgres"]
                 :sslmode #or [#env POSTGRES_SSLMODE "disable"]}
 :port #long #profile {:dev 3000
                       :test 3001
                       :prod #env PORT}}
```

This is great! We can run our tests from the repl without accidentally wiping out the database we’re using for development.
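A REPL session running these tests might look something like the following sketch. `clojure.test/run-tests` is standard; the test namespace is the one we define below:

```clojure
(require '[clojure.test :as t]
         '[rocks.mygiftlist.model.user-test])

;; Runs every deftest in the namespace; the fixtures start the test
;; system, truncate the tables after each test, and halt the system.
(t/run-tests 'rocks.mygiftlist.model.user-test)
```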

The second fixture truncates a list of tables after running each test, so every test starts with a fresh database.

```clojure
(def ^:private tables [:user])

(defn- double-quote [s]
  (format "\"%s\"" s))

(def ^:private truncate-all-tables
  "SQL vector that truncates all tables"
  [(str "TRUNCATE " (str/join ", " (mapv (comp double-quote name) tables)))])

(defn truncate-after
  "Test fixture that truncates all database tables after running the
  test. Assumes the `use-system` fixture has started the database
  connection pool."
  [test-fn]
  (test-fn)
  (jdbc/execute-one! (::db/pool system) truncate-all-tables))
```

With only the user table, truncate-all-tables evaluates to `["TRUNCATE \"user\""]`; the name is double-quoted because user is a reserved word in postgres.

Parser Tests

With these fixtures in place, we're ready to write tests for the user-by-id and insert-user resolvers we wrote in the last post. First, let's look at a test for insert-user.

```clojure
(ns rocks.mygiftlist.model.user-test
  (:require [rocks.mygiftlist.db :as db]
            [rocks.mygiftlist.parser :as parser]
            [rocks.mygiftlist.type.user :as user]
            [rocks.mygiftlist.model.user :as m.user]
            [rocks.mygiftlist.test-helper :as test-helper :refer [system]]
            [com.fulcrologic.fulcro.algorithms.tempid :as tempid]
            [clojure.test :refer [use-fixtures deftest is]]
            [honeysql.core :as sql]))

(use-fixtures :once (test-helper/use-system ::parser/parser ::db/pool))
(use-fixtures :each test-helper/truncate-after)

(defn example-user
  ([] (example-user {}))
  ([m]
   (merge #::user{:id #uuid "e58efe42-8a06-45b0-a11e-3f609580932d"
                  :email "me@example.com"
                  :auth0-id "auth0|abc123"}
          m)))

(deftest test-insert-user
  (let [tempid (tempid/tempid)
        {::parser/keys [parser] ::db/keys [pool]} system
        u (example-user {::user/id tempid})
        ret (parser {} `[(m.user/insert-user ~u)])]
    (is (= {:count 1}
           (db/execute-one! pool {:select [(sql/call :count :*)]
                                  :from [:user]}))
        "There is one user in the database after insert")
    (is (= (select-keys u [::user/email ::user/auth0-id])
           (db/execute-one! pool {:select [:email :auth0_id]
                                  :from [:user]}))
        "The user in the database's email and auth0 id matches what was inserted")
    (let [user-id (::user/id (db/execute-one! pool {:select [:id]
                                                    :from [:user]}))]
      (is (= {::user/id user-id
              :tempids {tempid user-id}}
             (get ret `m.user/insert-user))
          "The parser return value has the user id and tempids mapping"))))
```

We need the parser and connection pool for these tests, so that’s what we pass to our test-helper/use-system fixture. We generate an example user, insert it, and verify that there is one user in the database with attributes matching what we passed in. Next we verify that the return value of the mutation has the user id and the correct tempid mapping. This is important to ensure that optimistic updates will work properly in the client.

The test for user-by-id is similar, but I'll make one philosophical note. The created-at timestamp is generated in the database, so we don't have access to its exact expected value. We could write a test which verifies that the timestamp is within some range of the current instant, but that kind of test can be brittle. Since the way the created-at value is set is unlikely to change, and the value our query returns is almost certainly right as long as it is not nil, we only check that the created-at value is a java.time.Instant. To me, this hits a sweet spot: a test that is robust and still gives confidence that our code is working properly.

```clojure
(deftest test-user-by-id
  (let [{::user/keys [id] :as u} (example-user)
        {::parser/keys [parser] ::db/keys [pool]} system
        _ (db/execute-one! pool {:insert-into :user
                                 :values [u]})
        ret (parser {} [{[::user/id id]
                         [::user/id ::user/email ::user/auth0-id ::user/created-at]}])]
    (is (= u (select-keys (get ret [::user/id id])
                          [::user/id ::user/auth0-id ::user/email]))
        "The query result attributes match what was inserted")
    (is (instance? java.time.Instant
                   (get-in ret [[::user/id id] ::user/created-at]))
        "The query result's created-at value is an instant")))
```

And that’s it! We have a couple of tests written for our parser and the framework to write more as they’re needed.