Overview

For the first two major versions of API, MongoDB was the only supported database engine. With the release of 3.0, we wanted to offer developers the flexibility to use the database engine that most suited their projects, regardless of size and scalability needs.

In line with the introduction of support for multiple template engines in Web 3.0, we moved all interactions with MongoDB from the app core into a plugin system — which we call data connectors — allowing API to work with virtually any database engine.

We’ve been hard at work creating data connectors for various vendors, like CouchDB, RethinkDB and even a flat file JSON filestore. In this tutorial, we’re going to show you how to build your own.

By the end of this tutorial we’ll have a partially-complete API Data Connector for MySQL, with a subset of the database operations implemented. Everything else can be finished for homework!

A Data Connector Template

During the development of API 3.0, the DADI engineering team put together a sample repository to make it easier to build your own connector. It even contains a test suite which should help you craft a robust connector for your chosen database engine.

The repository can be found here: https://github.com/dadi/api-connector-template. You can fork this repository and follow along.

Beginning the Data Connector

Fork the dadi/api-connector-template repository into your own profile or organization. Rename it to something nicer — let’s call it api-mysql.

Clone the repository to your local machine.

$ git clone https://github.com/johndoe/api-mysql.git
Cloning into 'api-mysql'...

Change into the repository directory and install the dependencies.

$ cd api-mysql/
$ npm install

Now install any dependencies needed to interact with your chosen database engine. For our MySQL connector, we’ll need the Node.js MySQL driver.

$ npm install mysql2 --save

To get a checklist of the things your connector must be able to do, run the test suite.

$ npm test

Create a MySQL database if one doesn’t already exist:

CREATE DATABASE `my_database`;

Create a MySQL user if one doesn’t already exist:

CREATE USER 'mysqluser'@'localhost' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'mysqluser'@'localhost';

Configuration file

Each data connector normally requires its own configuration file containing things such as the database host, database name and connection credentials. The configuration files follow the naming convention <connector>.<environment>.json. For example, the MongoDB data connector uses a file called mongodb.development.json when running in development mode. These configuration files are placed in the config directory of your connector module and also in the config directory of your API application.

For the MySQL connector we’ll need a file called mysql.development.json. Because we're starting with the unit tests, let's create the test mode configuration first.
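As a quick illustration of the naming convention, here is how a connector might resolve its configuration path from NODE_ENV. The helper name is our own invention, not part of the template:

```javascript
// Illustrative helper (not part of the connector template): build the
// config file path for a connector from the current environment
function configFileFor (connector, env) {
  return `./config/${connector}.${env || 'development'}.json`
}

console.log(configFileFor('mysql', process.env.NODE_ENV))
```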

Rename config/apiConnector.test.json to config/mysql.test.json.

Add the following configuration, changing values to match your setup:

{
  "database": {
    "host": "localhost",
    "port": 3306,
    "database": "my_database",
    "user": "mysqluser",
    "password": "password"
  }
}

Edit config.js so the line that loads the configuration file is correct:

// conf.loadFile('./config/apiConnector.' + env + '.json')
conf.loadFile('./config/mysql.' + env + '.json')

Database operations

This tutorial will only focus on the bare minimum required for a data connector and as such will only implement three database operations: connect, insert and find.

To keep it simple, we’ll do all our work in the existing file lib/index.js.

Before adding code for the database operations, require the mysql2 dependency at the top of this file:

lib/index.js

const config = require('../config')
const debug = require('debug')('api:mysql')
const EventEmitter = require('events').EventEmitter
const util = require('util')
const uuid = require('uuid')

// require MySQL module
const mysql = require('mysql2')

Connect

When API calls the connector’s connect() method it expects the connector to create a connection to the database and assign it to the property this.database before resolving the promise.

To connect to MySQL we call createConnection and supply the connection options from the configuration file:

/**
 * Connect to the database
 *
 * @param {ConnectionOptions} options
 */
DataStore.prototype.connect = function (options) {
  debug('connect %o', options)

  return new Promise((resolve, reject) => {
    // read configuration options from config/mysql.<environment>.json
    const dbConfig = config.get('database')

    // connect!
    this.database = mysql.createConnection(dbConfig)

    // everything is ok, emit 'DB_CONNECTED' event
    this.readyState = STATE_CONNECTED
    this.emit('DB_CONNECTED', this.database)

    // problem connecting? emit 'DB_ERROR' event
    // this.emit('DB_ERROR', err)
    // return reject(err)

    return resolve()
  })
}

Insert

All API data connectors should support inserting an array of documents. For this simple version of the MySQL connector we’re only going to handle inserting single documents.
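If you tackle array support later, the mysql driver accepts a nested array of row values for bulk `INSERT INTO ... VALUES ?` queries. A hedged sketch of mapping API documents to that shape (toBulkValues is our own illustrative helper, not part of the template or the driver):

```javascript
// Illustrative helper: map an array of API documents to the column list
// and nested row arrays that the mysql driver accepts for bulk inserts
function toBulkValues (docs) {
  const columns = Object.keys(docs[0])
  const rows = docs.map(doc => columns.map(col => doc[col]))
  return { columns, rows }
}

const bulk = toBulkValues([
  { _id: '1', name: 'Eduardo' },
  { _id: '2', name: 'James' }
])

console.log(bulk.columns) // [ '_id', 'name' ]
console.log(bulk.rows)    // [ [ '1', 'Eduardo' ], [ '2', 'James' ] ]
```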

In the sample connector repository, the insert() method returns an empty array:

return new Promise((resolve, reject) => {
  let results = []
  return resolve(results)
})

Replace the code above with the following:

return new Promise((resolve, reject) => {
  const insertQuery = `INSERT INTO ${collection} SET ?`
  const findQuery = `SELECT * FROM ${collection} WHERE _id = ?`

  return this.database.query(insertQuery, data[0], (err, results, fields) => {
    if (err) {
      return reject(err)
    }

    // query the database for the new document(s)
    return this.database.query(findQuery, data[0]._id, (err, results, fields) => {
      return resolve(results)
    })
  })
})

There are two things you may notice in our new insert() method. The first is that we don't have to build a full INSERT statement to pass to the database. The Node.js MySQL module we're using converts objects passed to the query method into SQL statements, saving us quite a lot of manual transformation. The following snippet is from the documentation for the mysql module, on which the mysql2 module is based.

const post = {id: 1, title: 'Hello MySQL'}
let query = connection.query('INSERT INTO posts SET ?', post, (error, results, fields) => {
  if (error) throw error
})

console.log(query.sql)
// INSERT INTO posts SET `id` = 1, `title` = 'Hello MySQL'

The second is that we’re making a second call to the database to get the results of the insert. This is because API expects a result set back from insert() that contains the inserted documents, and MySQL only returns the number of affected rows.
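To demystify what the driver is doing with that object, here is a rough, illustrative re-implementation of the object-to-SET expansion. The real driver's escaping is far more thorough; never use a hand-rolled helper like this against a live database:

```javascript
// Rough illustration only: expand an object into a SET clause the way the
// mysql driver does for 'INSERT INTO posts SET ?'. Not safe for production.
function toSetClause (doc) {
  return Object.keys(doc)
    .map(key => {
      const value = doc[key]
      // naive escaping: quote strings and double any embedded quotes
      const escaped = typeof value === 'string'
        ? `'${value.replace(/'/g, "''")}'`
        : value
      return `\`${key}\` = ${escaped}`
    })
    .join(', ')
}

console.log(toSetClause({ id: 1, title: 'Hello MySQL' }))
// `id` = 1, `title` = 'Hello MySQL'
```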

Before we can run any tests for the insert() method, we have to remove the "skip" instruction in the test file. In test/index.js , change the following:

// describe.skip('insert', function () {

describe('insert', function () {

If you run npm test now the first thing you should notice (if the database has connected correctly) is an error saying that a table doesn't exist.

Uncaught Error: Table 'my_database.users' doesn't exist

Automatic table creation

We really need data connectors to automatically create tables that don’t exist, so that new collections can be added to API without having to perform any maintenance on the underlying database.

Let’s add a createTable() method that can be called from each of the database operations to ensure the requested table exists. This method needs to read the schema for the requested collection and generate a CREATE TABLE query that includes the names of the columns and their data types. See the DADI API documentation for Collections for detailed information.

/**
 * Create a new MySQL table if the specified one doesn't exist
 *
 * Queries the MySQL information_schema database, returning a count of
 * tables that exist for the specified database + table_name parameters.
 *
 * If the table doesn't exist, builds a CREATE TABLE statement to execute
 * against the database.
 *
 * @param {String} table - the name of the table to check or create
 * @param {Object} schema - the API collection schema fields
 * @returns {Promise.<undefined, Error>} A promise that resolves when the
 *   table exists, or rejects with an Error if the operation fails
 */
DataStore.prototype.createTable = function (table, schema) {
  return new Promise((resolve, reject) => {
    // does the table exist?
    let tableQuery = `SELECT COUNT(*)
      FROM information_schema.TABLES
      WHERE TABLE_SCHEMA = '${config.get('database.database')}'
      AND TABLE_NAME = '${table}'`

    return this.database.query(tableQuery, (err, result) => {
      if (err) {
        return reject(err)
      }

      if (result[0]['COUNT(*)'] === 0) {
        let createTableQuery = 'CREATE TABLE ' + table + ' ('

        // add an _id column as a default
        createTableQuery += '_id VARCHAR(100),'

        // add columns based on the collection schema
        Object.keys(schema).forEach(key => {
          createTableQuery += `${key} ${schema[key].type === 'String' ? 'VARCHAR(255)' : 'INT'},`
        })

        createTableQuery += 'PRIMARY KEY(_id))'

        // return here so we don't resolve before the table is created
        return this.database.query(createTableQuery, (err, result) => {
          if (err) {
            return reject(err)
          }

          return resolve()
        })
      }

      // the table already exists, return
      return resolve()
    })
  })
}

Now modify the insert() method to include a call to createTable() :

return new Promise((resolve, reject) => {
  const insertQuery = `INSERT INTO ${collection} SET ?`
  const findQuery = `SELECT * FROM ${collection} WHERE _id = ?`

  // first check the table exists
  return this.createTable(collection, schema).then(() => {
    return this.database.query(insertQuery, data[0], (err, results, fields) => {
      if (err) {
        return reject(err)
      }

      // query the database for the new document(s)
      return this.database.query(findQuery, data[0]._id, (err, results, fields) => {
        return resolve(results)
      })
    })
  })
})

Test again!

Running npm test now, you should see some of our insert tests passing:

insert

✓ should insert a single document into the database

- should insert an array of documents into the database

✓ should add _id property if one isn't specified

✓ should use specified _id property if one is specified

Find

Having already implemented a find within the insert() method, it should be simple to make our connector's find() method work. Add the following to lib/index.js in the existing find() method:

const findQuery = `SELECT * FROM ${collection} WHERE _id = ?`

// query the database
return this.database.query(findQuery, query, (err, results, fields) => {
  return resolve(results)
})

Before we can run any tests for the find() method, we have to remove the "skip" instruction in the test file. In test/index.js , change the following:

// describe.skip('find', function () {

describe('find', function () {

When you run npm test the first find test should fail because it's expecting the database to contain only a couple of records. We should really be clearing the database before each test run. For this we'll use the convenient dropDatabase() method in the sample connector.

Find the beforeEach() method in test/index.js and modify it as follows:

beforeEach(function (done) {
  var apiConnector = new ApiConnector()

  // clear the 'users' table
  apiConnector.connect({ database: 'content', collection: 'users' }).then(() => {
    apiConnector.dropDatabase('users').then(() => {
      done()
    })
  })
})

Find the dropDatabase() method in lib/index.js and modify it as follows:

DataStore.prototype.dropDatabase = function (collectionName) {
  if (this.readyState !== STATE_CONNECTED) {
    return Promise.reject(new Error('DB_DISCONNECTED'))
  }

  debug('dropDatabase %s', collectionName || '')

  return new Promise((resolve, reject) => {
    const deleteQuery = `DELETE FROM ${collectionName}`

    // delete all rows from the table
    return this.database.query(deleteQuery, (err, results, fields) => {
      return resolve()
    })
  })
}

If we run the tests now, we should have one passing test (and six pending):

find

✓ should find a single document in the database

- should return the number of records requested when using `limit`

- should sort records in ascending order by the `createdAt` property when no query or sort are provided

- should sort records in ascending order by the query property when no sort is provided

- should sort records in ascending order by the specified property

- should sort records in descending order by the specified property

- should return only the fields specified by the `fields` property

Tasks to complete

Before this connector can be finished and used with API, there are a number of problems to resolve. Obviously we’ve left the update() and delete() methods for you to implement, but the following information about "options" also needs to be considered.

Passing options to find()

API calls the find() method with an options object which contains values that instruct the connector how to handle things such as sorting and limiting the records returned. See the API documentation and the JSON filestore connector for ideas on implementation.

We’ve left these unit tests pending in the sample connector repository. Simply remove the .skip for each of these when you're ready to test your implementation.
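As a starting point, here is a hedged sketch of translating an options object into SQL clauses. The option names (limit, sort, fields) follow the pending tests in the sample repository; the helper itself is our own invention and, for brevity, ignores escaping:

```javascript
// Illustrative sketch: translate an API find() options object into SQL
// clause fragments. No escaping; not production code.
function buildFindClauses (options) {
  const parts = []

  if (options.fields) {
    // e.g. { name: 1, _id: 0 }: include only fields flagged with 1
    const cols = Object.keys(options.fields).filter(f => options.fields[f] === 1)
    parts.push('SELECT ' + (cols.length ? cols.join(', ') : '*'))
  } else {
    parts.push('SELECT *')
  }

  if (options.sort) {
    // e.g. { name: 1 } for ascending, { name: -1 } for descending
    const [field] = Object.keys(options.sort)
    parts.push(`ORDER BY ${field} ${options.sort[field] === 1 ? 'ASC' : 'DESC'}`)
  }

  if (options.limit) parts.push(`LIMIT ${options.limit}`)

  return parts.join(' ')
}

console.log(buildFindClauses({ sort: { name: 1 }, limit: 2 }))
// SELECT * ORDER BY name ASC LIMIT 2
```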

Returning results & metadata

While we haven’t added this to the sample connector, API actually expects a result from the data connector’s find() method that contains both the result set and a "metadata" block that contains the total count of matching records and the number of pages. You can see how the MongoDB connector does it. This uses the NPM package @dadi/metadata which you can add to your project's dependencies.
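The expected shape is roughly as follows. This hand-rolled sketch is illustrative only; the field names are assumptions based on the MongoDB connector, and a real connector should use the @dadi/metadata package instead:

```javascript
// Illustrative sketch of the { results, metadata } response shape API
// expects from find(). Field names are assumptions; use @dadi/metadata
// in a real connector.
function withMetadata (results, totalCount, options) {
  const limit = options.limit || 20
  const page = options.page || 1

  return {
    results,
    metadata: {
      page,
      limit,
      totalCount,
      totalPages: Math.ceil(totalCount / limit)
    }
  }
}

const response = withMetadata([{ _id: '1' }], 41, { limit: 20 })
console.log(response.metadata.totalPages) // 3
```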

Publishing the connector to NPM

Edit the package.json. We need to rename the package (it’s still called @dadi/api-connector-template). Change the name property to be api-mysql:

{

"name": "api-mysql"

}

You should also change the repository property to reflect your own GitHub repository:

"repository": {

"type": "git",

"url": "https://github.com/johndoe/api-mysql.git"

}

When you’re ready to publish the connector, execute the following command:

$ npm publish --access public

Testing with API

So far we’ve been testing the connector in isolation. Once all the tests are passing, it’s a good idea to use npm link to test the connector with your API application. This can be done before you publish to NPM, to ensure everything is working before you make it publicly available. See this excellent article regarding the use of npm link.

Usage with DADI CLI

The easiest way to install API is with DADI CLI, a command-line tool that allows developers to spin up an API instance in just a few seconds using a single command.

Now that your data connector is available on NPM, it can be pulled into CLI for new API installations. The following command will install the latest version of API in a directory called my-new-api using our fresh MySQL data connector, taking care of installing all the dependencies.

$ dadi api new my-new-api --database=api-mysql

When users don’t supply a --database parameter, CLI will present them with a list of data connector modules straight from NPM.

$ dadi api new todo-api

✔ Checking the available versions of DADI API

✔ Pulling the list of available database connectors from NPM

? Which database engine would you like to install? (Use arrow keys)

❯ @dadi/api-mongodb — A MongoDB adapter for DADI API

@dadi/api-filestore — A JSON datastore adapter for DADI API

To ensure an optimal and safe experience, we filter this list to modules that are built or trusted by DADI. If you think your data connector should be on this list, get in touch — we’ll be delighted to review your work and, if necessary, help you smooth any rough edges before adding it to the list.

That’s it for today!

The code for this article, forked from the sample data connector, can be found here.

For more information about the API Data Connectors, or help building your own, ask us in the usual channels. You can connect with the engineering team on Discord, via email (email addresses in our profiles) or send a tweet to @dadi.