My mum’s car battery died today. Good thing that this time we weren’t in the middle of nowhere. Apparently, it’s becoming a tradition to talk about my next project’s boilerplate the same day I have problems with a car. See what happened last year! This time I couldn’t solve it by myself but, hey, at least I learnt something and I didn’t have to pay for a new one! Let’s start building our boilerplate for a Koa, Redux, React application including Webpack, Mocha and SASS. In the readme you can read about how to start it up.

This post intends to be a very detailed roadmap of how to build a boilerplate using some of the latest technologies in web application development from the ground up, going from very basic concepts to some relatively more advanced topics. Basically what I would have liked to find when getting started, I hope it serves you well!

This boilerplate was especially created for the development of listlogs.com.

Setting up the repository

We will start by creating a GitHub repository for the new project at github.com. I am assuming you have git installed.

Once you have created your new github repository, clone it into your local folder, in my case:

git clone https://github.com/mezod/boilerplate-koa-redux-react.git

The first thing we should do inside our new empty repository is to define which files and folders we don’t want git to keep track of. To do so, let’s just create a .gitignore file at the root of the project with the following content:

node_modules/
build/
npm-debug.log
.DS_Store

We want to ignore them because

node_modules/ will contain all those js files for vendors and libraries our project depends on, such as koa, redux or react, together with the js files for tools we will use during the development of the project, such as Webpack, Mocha or ESLint.

build/ will contain the bundled files of our project for the development environment.

npm-debug.log is a file npm creates every time an error occurs.

.DS_Store is something OS X specific. It contains system specific custom attributes related to the directory. As a result it can be safely ignored.

Defining the project directory structure

We will have both the backend and the frontend together in the same folder structure that will look like

/api
  /src
    /methods
    /models
    /routes
  /tests
/app
  /build*
  /dist*
  /fonts
  /images
  /src
    /actions
    /components
    /constants
    /containers
    /reducers
    /stores
    /styles*
  /tests

*build/ will contain the bundled version of our code to work while in development.

*dist/ will contain our production ready code.

*styles/ might be used to contain generic style definitions, such as variables, font-face or others, used throughout the whole application.

Adding .gitkeep to empty folders

For git to keep track of our empty folders we will just add a .gitkeep file to each of them. We should remove it once each folder starts getting real code files.

Installing Node and npm

We will need to install node.js, a JavaScript runtime environment, and npm, node’s package manager, which will help us manage our project dependencies.

As I already had them on my computer, I just updated node and npm to the latest versions (node 4.1.0 and npm 3.3.3 as I’m writing this) to avoid compatibility issues with the libs we will be using.

To install node just go to the downloads page and get an installer. Before proceeding, make sure it was installed properly by running

node -v

in the command line. It would be great if node is at least version 4.0.0 as we might be using some ES6 syntax that node is already able to interpret. I will be using 4.1.0.

Npm is installed together with Node. However, make sure to update it to the latest version (3.3.3 for now) using

sudo npm install -g npm

Yep, a package manager that takes care of updating itself! Again, make sure it was properly updated with

npm -v

Creating the project

Inside the project folder just

npm init

to create a new npm project. Not a big deal, as the official docs say “This will ask you a bunch of questions, and then write a package.json for you.” If you want to go through it and get an empty package.json you can just press enter at each question. I wasn’t that lazy and my package.json looks like:

{
  "name": "boilerplate-koa-redux-react",
  "version": "1.0.0",
  "description": "A boilerplate for a Koa Redux React application with Webpack, Mocha and SASS",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/mezod/boilerplate-koa-redux-react.git"
  },
  "keywords": [
    "boilerplate",
    "koa",
    "redux",
    "react",
    "webpack",
    "mocha",
    "sass"
  ],
  "author": "Joan Boixadós",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/mezod/boilerplate-koa-redux-react/issues"
  },
  "homepage": "https://github.com/mezod/boilerplate-koa-redux-react#readme"
}

This package.json will become very useful in the future. We will add an entry for every single dependency as we install it, so that they can all be easily installed at once after cloning the repository on another computer. In the same way, you can now install all of this boilerplate’s dependencies at once with “npm install”.

Before we start, you might want to know that to install packages via npm and save the entry to package.json you can do things like

npm install package --save

to install package and add it to the list of production dependencies. To uninstall simply

npm uninstall package --save

Alternatively, using those commands with the --save-dev flag would add package to the list of development dependencies.

By using

npm install

npm will download and install all the packages defined in package.json while using

npm install --production

will only install those packages defined for the production environment.
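The resulting split in package.json looks something like this; the package names and version numbers here are just illustrative:

```json
{
  "dependencies": {
    "koa": "^1.0.0",
    "react": "^0.13.3"
  },
  "devDependencies": {
    "mocha": "^2.3.3",
    "webpack-dev-server": "^1.11.0"
  }
}
```

Running npm install --production would then skip everything under devDependencies.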

The backend

This boilerplate is intended for projects that will have an API. The API, which will be implemented with Koa.js, will be connected to a MySQL database. Tools like Bookshelf, an ORM, will help us in the process.

The API will initially be consumed by the client also included in this boilerplate, but it is designed with the idea that it could also be consumed by a mobile application created with react-native in the future.

Other relevant tools we will be using are Sequel Pro to manage the MySQL database and the Postman desktop app to manage and test the API endpoints.

Installing Koa.js

We will be using Koa.js to develop our API. Koa is a web framework especially designed for this purpose; it builds on Express.js, probably the most popular framework used to create APIs, by using ES6 generators among other features.

Note that we need at least Node 0.12 for Node to support ES6 generators, which Koa uses extensively. Don’t get fooled by the weird versioning Node has suffered in the last months, when it went from 0.12 to 4.1.0 after merging with io.js.
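If generators are new to you, here is a minimal sketch of how they behave, independent of Koa itself. A generator function pauses at each yield and resumes when next() is called, which is what lets Koa middleware run code both "downstream" and "upstream" of a request:

```javascript
// A generator function can pause at each `yield` and be resumed later
// with `next()`. Koa exploits this to run middleware code both before
// and after the rest of the request handling.
function* steps() {
  yield 'downstream';
  yield 'upstream';
}

var it = steps();
console.log(it.next().value); // 'downstream'
console.log(it.next().value); // 'upstream'
console.log(it.next().done);  // true -- the generator is exhausted
```
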

To install Koa just

npm install koa --save

We should now have a node_modules/ folder in our root folder containing the koa package.

To make sure everything is set up properly, we can create a very basic web server in a server.js file inside api/src/ where we:

1. Require the koa module.
2. Instantiate a koa object, creating our app.
3. Create a default response.
4. Tell koa which port to listen on.

The file should look like

var koa = require('koa');
var app = koa();

app.use(function *(){
  this.body = 'Hello from koajs';
});

app.listen(3000);

To check that everything is alright let’s start our web server with

node api/src/server.js

Now, if you go to http://localhost:3000 in your web browser, you should get the response: Hello from koajs!

Before we proceed, let's just add a script to our package.json (you can remove the default test script)

"scripts": {
  "webserver": "node api/src/server.js"
}

Now, we can also run the webserver like so

npm run webserver

Complementing Koa

Koa.js is a very minimal library that builds up using other complementary packages. One we will need for sure is koa-route, a very simple middleware that will help us handle the routes.

To install it we just

npm install --save koa-route

We want to create an API so we will also need some CORS support. We can make good use of the koa-cors package, which basically provides a Koa middleware to enable Cross Origin Resource Sharing (CORS).

Let’s just add it to the project

npm install --save koa-cors
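To give an idea of where these two packages will fit, here is a sketch of how server.js might wire them up later on. The /todos route and its empty response are made up for illustration; they are not part of the boilerplate yet:

```javascript
// Sketch only: wiring koa-route and koa-cors into our Koa server.
var koa = require('koa');
var route = require('koa-route');
var cors = require('koa-cors');

var app = koa();

// Enable Cross Origin Resource Sharing for all routes.
app.use(cors());

// A hypothetical GET endpoint handled by koa-route.
app.use(route.get('/todos', function *() {
  this.body = [];  // here we would return the actual list
}));

app.listen(3000);
```
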

Installing Bookshelf.js

For our project, the Koa.js API will retrieve its data from a MySQL database. To connect the API with our database we can make good use of several packages.

Firstly, we will install Bookshelf.js, a JavaScript ORM that will come in very handy to deal with the models of our API. If you are familiar with Backbone.js models and collections, you might be happy to hear it follows their patterns and naming conventions. Bookshelf, like any other ORM, aims to provide a simple library for the common tasks of querying databases from JavaScript and forming relations between objects.

In any case, we will install it as usual

npm install --save bookshelf

Bookshelf is built on the Knex SQL query builder and thus we will need to install it too:

npm install --save knex

Knex works with several different databases; we will be using it with MySQL. To be able to do so, we need to install the Node.js driver for MySQL too:

npm install --save mysql

Setting up Bookshelf falls out of the scope of this post but it is relatively easy to achieve following the documentation.
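Just as a taste, the setup from the documentation boils down to something like the following. The host, credentials, database name and the Todo model are placeholders, not part of this boilerplate:

```javascript
// Sketch of a Bookshelf + Knex setup, assuming a local MySQL database.
// All connection values below are placeholders.
var knex = require('knex')({
  client: 'mysql',
  connection: {
    host: '127.0.0.1',
    user: 'root',
    password: 'secret',
    database: 'boilerplate',
    charset: 'utf8'
  }
});

var bookshelf = require('bookshelf')(knex);

// A minimal model, following Backbone-style conventions.
var Todo = bookshelf.Model.extend({
  tableName: 'todos'
});

module.exports = { bookshelf: bookshelf, Todo: Todo };
```
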

The Frontend

The first thing we will need in the frontend is the HTML index page for the app, which we can put in app/build/index.html:

<!DOCTYPE html>
<html>
  <body>
    <div id="app"></div>
    <script src="bundle.js"></script>
  </body>
</html>

In this document we basically define two things. First, we create the <div> with id app where we will render our app. And second, we define the .js file that will contain all the js of our app, minified: bundle.js. This file will be automatically generated by webpack as explained later on under the webpack configuration.

For now, let’s just create the JavaScript application’s entry point file with some dummy code under app/src/:

console.log('Wololo');

Babel

Before we start adding all the libraries to improve the workflow and the other frameworks we will use for this project, let’s have a few words about Babel and why we will be using it. Similarly to what happens in the backend, we will be writing part of our frontend code using ES6 syntax. Some ES6 features are already supported by Node.js 4.1.0 but some others aren’t. Therefore, we will install Babel, which will transpile some of the ES6 features into ES5.

Note: While ES6 provides syntax for import/export (ES6 Modules), it currently does nothing, anywhere, because the loader spec is not finished. ES6 Modules are not yet a thing; they do not yet exist. Babel simply transpiles import/export to require .

To install Babel we will simply:

npm install --save babel-core

Be careful not to install the babel package, because that would only install the Babel CLI and not the Babel API.

Webpack

Webpack is a module bundler. It helps us to move from several modules with dependencies between them to a set of static bundled assets. Webpack approaches the problem of module bundling through configuration. In other words, it takes care of creating our production-ready asset bundles. However, Webpack offers several other functionalities that have a considerable impact on the development workflow: JavaScript and CSS minification, JSX transformation, SASS compilation, hot reload, automatic testing, lazy loading, etcetera. It basically makes our lives easier.

Thus, we will be using it to ease our front-end development workflow together with its development server (webpack-dev-server). To add both libraries to the project we simply:

npm install --save webpack
npm install --save-dev webpack-dev-server

Note that we added webpack with --save and webpack-dev-server with --save-dev. This is because the package.json dependencies will only contain the vendors we need to create the bundled version of our code. Every other lib we use during development, such as libraries related to hot reload, linting or testing, will go in the devDependencies section of the package.json.

Additionally, it might be interesting to install these packages globally so that we can launch them from the command line. To do so, just use the global flag -g:

npm install -g webpack webpack-dev-server

Configuring webpack

It’s time to start configuring Webpack. To do so, we need to create the file webpack.config.js at the root of the project. This way Webpack and its development server will be able to find it by convention.

Make sure that the paths fit our previously chosen folder structure. Notice also, that webpack expects absolute paths.

module.exports = {
  entry: [
    './app/src/index.js'
  ],
  output: {
    path: __dirname + '/app/build',
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: './app/build'
  }
};

We have basically defined that index.js is our entrypoint and that we want our app to be built into the app/build/bundle.js bundle. The file name must match the name we gave it in the script tag when we defined our index.html. We are also defining that the build directory should be used as the base of the development server.

We can now run webpack to produce our bundle.js:

webpack

A bundle.js should now be inside the /app/build/ folder with the code of our index.js at the very bottom.

Additionally, we can now run the dev server

webpack-dev-server

To make sure it’s working just go to localhost:8080 in your browser and make sure that our logging statement from the index.js appears in the console. Wololo!

It is important to note that we can use the last two commands, webpack and webpack-dev-server, straight in the command line because we installed them globally using the -g flag.

In case we want to use this code in other machines where webpack and webpack-dev-server are installed locally (maybe other contributors? ;D) we can define a couple of scripts in the package.json:

…
"scripts": {
  "webserver": "node api/src/server.js",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server"
},
…

that we will execute in the command line with

npm run build

to produce our bundle.js, and

npm run dev

to start our dev server.

Html-webpack-plugin: Making it nicely

Our config works, but entails a couple of annoying problems. If we manually create our index.html inside the build folder, then we need to change the gitignore file in order not to ignore it. Seems messy. Why don’t we just get webpack to create the base index.html for us? Html-webpack-plugin will help us to generate an entry point to our application, creating links to all the assets like the js bundle and the minified css.

Let’s install it:

npm install --save html-webpack-plugin

We will also resolve paths so that the config is more robust. The webpack.config.js should now look like:

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loader: 'babel'
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/build'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

If you now run webpack on the command line

webpack

You will see that webpack generates both the index.html and the bundle.js! In the future we will also configure it to minify our CSS. Great! And now we don’t mess with the .gitignore.

I have used path.resolve, which works as if we were navigating the file system using cd, to generate the absolute paths. This makes our config file more robust.

Adding babel to the config

We previously installed Babel, but we now need webpack to automatically use it to transpile our ES6 into ES5. To do so, we define which file extensions we want to be transpiled (.jsx, .js) and exclude the folders we don’t want to transpile:

module.exports = {
  entry: [
    './app/src/index.js'
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loader: 'babel'
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: __dirname + '/app/build',
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: './app/build'
  }
};

Adding the resolve section makes sure that whenever we use the ES6 import, it will work independently of the file extension and of whether we write the extension in the import statement. However, it would be wise to define a code style that enforces always using the same file extension. In our case, we settled on .jsx.

At this point the config should work, but when we check the running development server we realize we have an issue: babel cannot be loaded. We get something like:

ERROR in Loader /Users/mezod/Desktop/projects/listlogs/node_modules/babel/index.js didn't return a function

What’s wrong? Well, what happens is that webpack is trying to use the babel package as the loader instead of babel-loader. So let’s just install babel-loader to fix this issue:

npm install --save babel-loader

And the error is gone! We could decide to uninstall the Babel package now, but we will keep it in case we need to use the Babel CLI.

Configuring Babel

We could create a .babelrc config file for Babel in the root of our project if we wanted to. There are several advanced options, but we consider they fall out of this boilerplate’s scope. It’s still worth giving them a look to know about options you might need in the future.

Setting up webpack-dev-server

We have already used the webpack-dev-server and realized how helpful it can be in our development workflow. It automatically refreshes the content in the browser while we are developing our application. One special feature we will take advantage of is Hot Module Replacement (HMR) which will be really useful while developing in React.

For now let’s just set it up. First, to invoke our development server through npm remember that we added the script “dev” in our package.json:

…
"scripts": {
  …
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server"
},
…

So that we can start our development server with

npm run dev

However, we want to start our development server with a few improvements, so we set them up in the webpack.config.js file:

var webpack = require('webpack');

module.exports = {
  entry: [
    './app/src/index.js'
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loader: 'babel'
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: __dirname + '/app/build',
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: './app/build',
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin()
  ]
};

Don’t forget to require webpack. We have basically enabled Hot Module Replacement and the HTML5 History fallback, which improves the UX in our app by allowing History API routes to work — in other words, the back and forward arrows of the browser. The inline setting embeds the webpack-dev-server runtime into the generated bundle, easing the setup of Hot Module Replacement by not having to define a bunch of entry paths.

If we restart the server now, ctrl+c and

npm run dev

And access http://localhost:8080/ in the browser, we will see in the console that the Hot Module Replacement is now enabled:

[WDS] Hot Module Replacement enabled.

Want to see something that will blow your mind? If we now create our first component:

/app/src/components/boilerplate.js

module.exports = function () {
  var element = document.createElement('h1');
  element.innerHTML = 'Wololo wololo!';
  return element;
};

and require it in our entry point

app/src/index.js

//console.log('Wololo');
var component = require('./components/boilerplate');
var app = document.createElement('div');

document.body.appendChild(app);
app.appendChild(component());

We will see that the browser automatically refreshes!! Nice!! Even while developing this small example, I messed up the file paths and realized how powerful this is for the workflow. Notice that we didn’t restart the server. Any further changes in the boilerplate component we just defined now instantly update in the browser :D

We could also run the application from localhost:8080/webpack-dev-server/bundle . This way we get information about the status of the rebundling process.

There are several configuration options like enabling lazy loading or changing the default port 8080 to anything else passing the port parameter to devServer, i.e “port: 4000”.

React

React is a change in the paradigm of front-end development frameworks. In fact, it is not a framework per se, as it needs to be combined with other libraries to provide a complete solution. In our boilerplate, we will be using it together with Redux and Immutable.js, among others. As they state on their front page, React can be associated with the V of the MVC pattern: it primarily deals with the task of defining user interfaces. This tendency is explained by the fact that, as React makes re-rendering the DOM relatively inexpensive, we need a lot less of the M and the C.

Developing React applications with Redux and Immutable.js is a very good combination of technologies. On the one hand, React components are just a stateless projection of the application state at a given point in time. Why? Well, as your app grows, storing state in the DOM doesn’t scale well. So we need to store state somewhere else, definitely not in the DOM. On the other hand, Redux stores the state in an immutable data structure. Exactly what we need! So basically, the code dealing with the user interface (React components) is isolated from the state of the application (Redux store). This separation of concerns also makes it very easy to test our code, which is always a great advantage.

Enough talk, to install react in our project we just

npm install --save react

React example view

Even though this section should fall out of the scope of this post, it is still a good idea to create a very simple React view to make sure that we installed everything smoothly. Having some code to play with will also allow us to see the power and usefulness of the react-hot-loader development tool.

Note: Feel free to get rid of the components/boilerplate.js since we won’t be needing it anymore.

Without entering into the details of React, let’s just create our base App component:

import React from 'react';
import Todo from './Todo.jsx';

export default class App extends React.Component {
  render() {
    return <Todo />;
  }
}

that imports the following Todo component

import React from 'react';

export default class Todo extends React.Component {
  render() {
    return <div>Build my boilerplate</div>;
  }
}

Finally, we need to update our index.jsx to

import React from 'react';
import App from './components/App.jsx';

main();

function main() {
  const app = document.createElement('div');
  document.body.appendChild(app);
  React.render(<App />, app);
}

Note that we changed the extension of the file from .js to .jsx because the file now contains JSX content. The rendering logic will first create the DOM element into which it will later render our React application.

http://localhost:8080/ should now show

Build my boilerplate

Nothing very exciting so far other than the fact that now it is React who is rendering the displayed message.

React-hot-loader

If we change the Todo component, we will see that the browser automatically refreshes for us. That’s what we configured it to do when we set up the Hot Module Replacement (HMR) plugin. However, refreshing the page also means losing our current application state. This can be very annoying while developing: just imagine having to repeat the same steps to get to the desired application state every time we want to check a change in the code. This is where react-hot-loader comes into play to improve our development workflow. It will automatically swap our React components’ code as we develop, without forcing a browser refresh and thus without losing the application’s current state. Let’s see it in action.

As we only want this package for the development environment we will install it with the --save-dev flag as follows:

npm install --save-dev react-hot-loader

Now we need to make a couple of tweaks to our webpack config, that should now look like

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loaders: ['react-hot', 'babel'],
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/build'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

What we just configured is already deprecated

Yep, no kidding. Dan Abramov just killed it in favour of his new baby. I know how you feel… :_D However, as he hasn’t had enough time to document it yet, we will go with the old approach for now.

Source maps

Let’s talk a little bit about source maps and how they can improve our workflow. You may have realized that the code we have access to from the development tools, i.e. our bundle.js, is quite different from the code we initially wrote in the editor. In other words, the code in the development and production environments isn’t very similar to our human eyes. Well, that has an explanation: we have run it through compilation, minification, concatenation, and other kinds of optimizations. This makes it quite unreadable for us in Firebug or the Chrome Dev Tools.

Source maps help us by pointing in our production code to its exact mapping in the original code.

To enable this functionality we just change the webpack.config.js to:

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  devtool: 'source-map',
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loaders: ['react-hot', 'babel'],
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/build'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

When we now run

npm run build

We should see a bundle.js.map inside our build/ folder. Those are our new helpful source maps!

You might want to check the React developer tools. At this time, the unsigned version for Firefox doesn’t work by default because Firefox blocks all unsigned plugins for security reasons. While we wait for Mozilla to sign the plugin we can change the xpinstall.signatures.required value in about:config.

React-router

Because React takes care of the views and Redux is only concerned with the state of the app, we will need other libraries like react-router to deal with the routing of our app. With react-router we can associate different components to different paths. Additionally react-router keeps our UI in sync with the URL. It has several other features like lazy loading, dynamic route matching and location transition handling.

To add it to the project we just

npm install --save react-router

Redux

Redux is a minimal implementation of the Flux architecture. Redux holds the application state in one place, and defines a minimal yet powerful way of interacting with that state. You can read the official explanation here. If all of this sounds like gibberish to you, this blog post builds up the explanation in a more down to earth way and summarizes the three principles of Redux in a very understandable way:

1. Everything that happens in your app is an “action”. These actions can be caused by users, browser events, or server events. It doesn’t matter: everything that changes something in your app does it via an “action”.
2. You have one giant state object that represents all the state in your app. These are not special Models or Collections, just objects, arrays, and primitives.
3. You write “reducers” for everything that changes state, in the style of an array’s reduce() method. A reduce function gets a starting state and the current value, and returns the new state. That’s exactly what we want to do in response to actions: we get the starting state and the current action, and we return the new state.
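To make the third principle concrete, here is what a reducer looks like in plain JavaScript. The 'ADD_TODO' action and its shape are invented for the example; they are not part of the boilerplate:

```javascript
// A reducer: (state, action) -> new state.
function todos(state, action) {
  state = state || [];  // starting state when none is given
  switch (action.type) {
    case 'ADD_TODO':
      // Return a NEW array; never mutate the previous state.
      return state.concat([action.text]);
    default:
      // Unknown action: the state is returned unchanged.
      return state;
  }
}

var s0 = todos(undefined, {});
var s1 = todos(s0, { type: 'ADD_TODO', text: 'Build my boilerplate' });
console.log(s1); // [ 'Build my boilerplate' ]
console.log(s0); // [] -- the previous state is untouched
```
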

For now, we will just install it

npm install --save redux

Together with Redux, we will need to install the official React bindings for Redux, react-redux.

npm install --save react-redux

We will also install redux-thunk, a thunk middleware for Redux. A thunk is a function that wraps an expression to delay its evaluation. Basically, we will use redux-thunk to be able to connect Redux’s synchronous action creators with network requests. This might not mean much to you right now if you haven’t used Redux before, but trust me, you’ll need it. We install it with

npm install --save redux-thunk
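To illustrate what a thunk buys us, here is a sketch with a fake dispatch standing in for the middleware. The fetchTodos name and the action types are hypothetical:

```javascript
// With redux-thunk, an action creator may return a function instead of
// a plain action object; the middleware calls it with `dispatch`, so we
// can dispatch several actions around an async request.
function fetchTodos() {
  return function (dispatch) {
    dispatch({ type: 'REQUEST_TODOS' });
    // Here we would `fetch(...)` the API and, once the response arrives:
    dispatch({ type: 'RECEIVE_TODOS', todos: ['Build my boilerplate'] });
  };
}

// Simulating what the middleware does with the returned thunk:
var dispatched = [];
fetchTodos()(function (action) { dispatched.push(action.type); });
console.log(dispatched); // [ 'REQUEST_TODOS', 'RECEIVE_TODOS' ]
```
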

Finally, we also need to install isomorphic-fetch, a library to use the new fetch API. This API is used to make network requests, replacing the traditional XMLHttpRequest; in other words, we will use it for the AJAX calls. We need the isomorphic-fetch library because most browsers don’t yet support this API natively.

npm install --save isomorphic-fetch

To improve the DX (Developer Experience) with Redux, you can also install the Redux Developer Tools, which come with a set of features to develop and debug the Redux Store in a more appropriate way. In the demo video you can see an early version of the dev-tools in action. Because the Redux Store contains the state at each change in the application, we can go back and forth between application states. It’s like a time machine! Pretty cool!

To add it to the project just:

npm install --save-dev redux-devtools

Once you start creating Redux objects you can fully integrate it following the instructions.

Immutable

We will use Immutable.js data structures to hold the application state. This library works wonderfully with the philosophy of Redux.

A Redux application's state is an immutable data structure in the form of a tree. That means that as long as it exists, the tree representing the state will never change. It will keep representing the same state forever. To move to the next state, Redux produces another tree that reflects the changes we make to the state, thus, representing the new state. This means any two successive states of the application are stored in two separate and independent trees.
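The idea can be sketched in plain JavaScript, with Object.assign standing in for Immutable.js (this is an illustration of the principle, not the Immutable.js API):

```javascript
// Each "change" produces a new state tree; the old one is left intact.
var state1 = { todos: ['Build my boilerplate'] };

// "Adding" a todo yields a brand new state object with a new array:
var state2 = Object.assign({}, state1, {
  todos: state1.todos.concat(['Ship it'])
});

console.log(state1.todos.length); // 1 -- the previous state is unchanged
console.log(state2.todos.length); // 2 -- the new state reflects the change
```
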

You might want to follow Tero’s much more complete explanation.

Anyhow, at this point it is obvious that Immutable.js is a useful library for us, so that we make sure we don’t mess up any of the Redux’s principles.

To install it:

npm install --save immutable

Mocha

We will be using Mocha for our tests, both for the API we will be building in Koa.js and for the React application. Test-driven development will allow us to incrementally build our application with the certainty that after every change we aren’t breaking things that used to work.

Let’s install it only for the development environment, of course:

npm install --save-dev mocha

Additionally, we will install Chai, an assertion/expectation library that comes in very handy to define what you expect your code’s output to be.

To install it:

npm install --save-dev chai

We also install the chai-immutable library, which extends Chai to support Immutable data structures as well.

npm install --save-dev chai-immutable

To execute our tests we can use the following command:

./node_modules/mocha/bin/mocha --compilers js:babel-core/register --recursive

This command basically asks Mocha to recursively search for all the tests in the project folder, to transpile any ES6 code with Babel where needed, and to run them. Right now, if we execute it, it should throw an error given we haven’t defined any tests yet.

We can make a script for it in our package.json whose “scripts” section should now look like:

"scripts": {
  "webserver": "node api/src/server.js",
  "test": "mocha --compilers js:babel-core/register --recursive",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server"
}

so that we can run them with

npm run test

again, this should throw an error for now.

To keep improving our development workflow, let’s add another script called “test:watch” that will launch a process to watch for changes in our code and automatically run the tests for us.

The “scripts” in our package.json should now look like:

"scripts": {
  "webserver": "node api/src/server.js",
  "test": "mocha --compilers js:babel-core/register --recursive",
  "test:watch": "npm run test -- --watch",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server"
}

Given the fact that we also want to test React, we will need a DOM. Instead of running the test in an actual web browser with a library like Karma, we will use jsdom, a DOM implementation that runs in Node.js. This is a good blogpost on how to test React using jsdom.

We just install it like any other package for development:

npm install --save-dev jsdom

To be able to use jsdom to test React we will need to do a little bit of setup. First, we need to create the jsdom versions of the document and window objects that browsers provide. We then need to put these objects into Node’s global namespace object to make them accessible to React. We can set up a test helper file inside our tests folder for this purpose:

app/tests/test_helper.js

import jsdom from 'jsdom';

const doc = jsdom.jsdom('<!doctype html><html><body></body></html>');
const win = doc.defaultView;

global.document = doc;
global.window = win;

Once we’ve done this, we also need to expose all the jsdom window object’s properties, such as navigator, by putting them into Node’s global namespace object so that they can be used without the window. prefix, mimicking the environment we would have in a real browser and enabling React code to hook up properly.

The app/tests/test_helper.js should look like

import jsdom from 'jsdom';

const doc = jsdom.jsdom('<!doctype html><html><body></body></html>');
const win = doc.defaultView;

global.document = doc;
global.window = win;

Object.keys(window).forEach((key) => {
  if (!(key in global)) {
    global[key] = window[key];
  }
});

Finally, we should remember to import and use the chai and chai-immutable libraries we previously installed for the tests that may need them.

The app/tests/test_helper.js should finally look like

import jsdom from 'jsdom';
import chai from 'chai';
import chaiImmutable from 'chai-immutable';

const doc = jsdom.jsdom('<!doctype html><html><body></body></html>');
const win = doc.defaultView;

global.document = doc;
global.window = win;

Object.keys(window).forEach((key) => {
  if (!(key in global)) {
    global[key] = window[key];
  }
});

chai.use(chaiImmutable);

Finally, we can update our package.json test script so that we can run the tests:

"test": "mocha --compilers js:babel-core/register --require ./test/test_helper.js 'test/**/*.@(js|jsx)'"

We basically replaced the --recursive flag, which would overlook .jsx files, with a glob that finds both .js and .jsx files. We don’t need to change the watch script.

We still run tests with

npm run test

However, in the future, when we have several test files covering the different components, it might come in handy to know how to run the tests of a specific file. If we had a hypothetical core_spec.js file full of tests, we would call

npm run test -- test/core_spec.js

SASS

We will be using SASS as our CSS preprocessor. Discovering LESS a couple of years ago was great: it made your life much easier and its learning curve is rather shallow. However, it has limitations that have made SASS probably the better solution at the moment.

To compile SASS files into plain CSS we need the sass-loader.

So let’s add it to our project:

npm install --save sass-loader

While installing it we see that this library has a peer dependency on node-sass, so let’s install that too:

npm install --save node-sass

However, this is not enough. Webpack understands only JavaScript, so we need another library to turn our CSS into JS; that is where css-loader comes into play. We install it as usual:

npm install --save css-loader

We need one more library, style-loader, which takes care of embedding our styles into the application.

npm install --save style-loader

Finally, we need to update our webpack configuration; webpack.config.js should now look like:

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  devtool: 'source-map',
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loaders: ['react-hot', 'babel']
    }, {
      test: /\.scss$/,
      loaders: ['style', 'css', 'sass']
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/build'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

Let’s make sure it works by creating our first .scss file, App.scss inside the components folder with a basic style:

body {
  background-color: blue;
}

and let’s require that file from our App component:

app/src/components/App.jsx

import React from 'react';
import Todo from './Todo.jsx';

require('./App.scss');

export default class App extends React.Component {
  render() {
    return <Todo />;
  }
}

Let’s restart the webpack-dev-server and browse http://localhost:8080/. Beautiful, right?

Change the styling and go see what happens to the browser…yaaay, it automatically updates the styles!
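As a small taste of what SASS adds over plain CSS, App.scss could use variables and nesting (the names below are made up for illustration):

```scss
// Variables and nesting, two of the features plain CSS lacks.
$primary-color: blue;

body {
  background-color: $primary-color;

  .app-title {
    color: darken($primary-color, 20%);
  }
}
```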

Few words on folder structure

If you remember correctly, when we defined the folder structure we created a styles folder inside src. However, we placed the .scss file we just created inside the components folder. Well, this is an open question. Normally we are used to having a styles folder with all the style files, maybe one per page, or per widget, et cetera. But we are now working with React components, which means every single part of the web application will be one component or another. So a good alternative is to create a folder for each component inside the components folder, containing both the .jsx and the .scss for that component. This solution enforces splitting styles at the component level and makes maintenance way easier. Your choice!

Linting our code with ESLint

Linting is the process of running a program that will analyze the code for potential errors. It is a way to find syntax errors and standardize with a set of rules the way code is written, which is especially interesting when the code is developed and maintained by multiple coders.

As Juho explains in his chapter dedicated to linting in his book SurviveJS, linting can considerably improve our workflow by detecting errors in our code before they become actual problems.

Douglas Crockford made JSLint, the first linter for JavaScript, which was highly opinionated. Later, JSHint appeared as an alternative that allowed for much more customization. Finally, ESLint is an evolution of the previous linters that not only allows you to create new rules but also lets you hook it up with custom parsers and reporters, so that you can use it with Babel and JSX syntax too. You can also control the degree of severity of each rule.

Additionally, there is a linter specific to code style, JSCS, even though ESLint already covers most of its features.

For this boilerplate we will be installing ESLint; it allows for enough flexibility while already introducing basic code style linting.

We will need to install some packages, so let’s begin!

npm install --save-dev eslint

To install the tool itself.

npm install --save-dev babel-eslint

To lint all valid Babel code.

npm install --save-dev eslint-plugin-react

To install a set of default React specific linting rules.

We can now create a script to run linting from the command line; the scripts in package.json should now look like:

"scripts": {
  "webserver": "node api/src/server.js",
  "test": "mocha --compilers js:babel-core/register --require ./test/test_helper.js 'test/**/*.@(js|jsx)'",
  "test:watch": "npm run test -- --watch",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server",
  "lint": "eslint . --ext .js --ext .jsx"
}

This script triggers ESLint against all the .js and .jsx files of our project. Since that is not necessary, we will restrict which folders actually get linted by defining which ones don’t need to be, for example our production folder. To do so we just create an .eslintignore file in the root of our project with the following content:

build

We also need to activate babel-eslint so that ESLint lints our Babel code too. The same applies to the React specific rules we previously installed; we need to activate them as well. To do so we create an ESLint configuration file in the root of our project called .eslintrc with the following content:

{
  "parser": "babel-eslint",
  "env": {
    "browser": true,
    "node": true
  },
  "plugins": [
    "react"
  ],
  "rules": {
    "new-cap": 0,
    "strict": 0,
    "no-underscore-dangle": 0,
    "no-use-before-define": 0,
    "eol-last": 0,
    "quotes": [2, "single"],
    "react/jsx-boolean-value": 1,
    "react/jsx-quotes": 1,
    "react/jsx-no-undef": 1,
    "react/jsx-uses-react": 1,
    "react/jsx-uses-vars": 1
  }
}

You can decide the degree of severity of each rule: 0 disables it, 1 produces warnings, and 2 makes ESLint emit errors. It is a smart idea to check the documentation.

Finally, we need to connect ESLint with webpack to automate the whole process. First we install the eslint loader:

npm install --save-dev eslint-loader

And configure it in the webpack.config.js which should now look like

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  devtool: 'source-map',
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    preLoaders: [
      {
        test: /\.jsx?$/,
        loaders: ['eslint'],
        include: path.resolve(ROOT_PATH, 'app')
      }
    ],
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loaders: ['react-hot', 'babel']
    }, {
      test: /\.scss$/,
      loaders: ['style', 'css', 'sass']
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/build'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

Note that we defined it in the preLoaders section of the configuration, which runs before loaders: if linting fails, we don’t need to go further.

If we npm run dev now, linting will already be working under the hood.

It is important to know that there’s much more to linting. For example, you could define different levels of linting depending on the environment, and some environments have conventions of their own. It would be a good idea, for example, to add the mocha environment to our .eslintrc so that ESLint stops warning us about Mocha globals such as “describe”.

"env": {
  "browser": true,
  "node": true,
  "mocha": true
}

Creating the production ready files to deploy

We are almost done! It’s time to configure Webpack to help us deploy our application. To generate the necessary files for the production ready application we add a new script to our package.json:

"scripts": {
  "webserver": "node api/src/server.js",
  "test": "mocha --compilers js:babel-core/register --require ./test/test_helper.js 'test/**/*.@(js|jsx)'",
  "test:watch": "npm run test -- --watch",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server",
  "lint": "eslint . --ext .js --ext .jsx",
  "deploy": "NODE_ENV=production webpack -p"
}

We use the “production” environment variable to let the modules we require do their optimizations (minification, uglification, etc.). Note that we could decide to create a separate webpack configuration file for the production environment, in which case we would need to specify that file in the script:

"deploy": "NODE_ENV=production webpack -p --config webpack.production.config.js"

However, we find that unnecessary at the moment. This means we need to tweak our webpack.config.js a little bit to account for the production environment. As we said earlier, we already use /build for the development build files; now we will define /dist for the production ready files. Let’s specify that in the config file:

var webpack = require('webpack');
var path = require('path');
var HtmlwebpackPlugin = require('html-webpack-plugin');

var ROOT_PATH = path.resolve(__dirname);

module.exports = {
  devtool: process.env.NODE_ENV === 'production' ? '' : 'source-map',
  entry: [
    path.resolve(ROOT_PATH, 'app/src/index'),
  ],
  module: {
    preLoaders: [
      {
        test: /\.jsx?$/,
        loaders: process.env.NODE_ENV === 'production' ? [] : ['eslint'],
        include: path.resolve(ROOT_PATH, 'app')
      }
    ],
    loaders: [{
      test: /\.jsx?$/,
      exclude: /node_modules/,
      loaders: ['react-hot', 'babel']
    }, {
      test: /\.scss$/,
      loaders: ['style', 'css', 'sass']
    }]
  },
  resolve: {
    extensions: ['', '.js', '.jsx']
  },
  output: {
    path: process.env.NODE_ENV === 'production' ?
      path.resolve(ROOT_PATH, 'app/dist') :
      path.resolve(ROOT_PATH, 'app/build'),
    publicPath: '/',
    filename: 'bundle.js'
  },
  devServer: {
    contentBase: path.resolve(ROOT_PATH, 'app/dist'),
    historyApiFallback: true,
    hot: true,
    inline: true,
    progress: true
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin(),
    new HtmlwebpackPlugin({
      title: 'Listlogs'
    })
  ]
};

If we now run

npm run deploy

We should get our production ready files inside the dist/ folder in the root. Yay!

Note that we also changed the line that enables source maps, so that we only generate them for the development environment! When running npm run build we should get a source map next to bundle.js in our build/ folder; on the other hand, when we run npm run deploy there should be no source map in our dist/ folder. Similarly, we only lint our code in the development environment.

Finally, we have to remember to add the new dist/ folder to .eslintignore since, similarly to build/, we don’t want the linter to apply its rules there. The content of .eslintignore should now be:

build
dist

Deploying the application

It is worth explaining how this boilerplate was designed for when D-day arrives and the application finally gets deployed. There are several advanced tools that can be used to deploy an application; however, here I will only present the three main deployment flows I spotted that make no use of deployment specific tools.

Some people prefer to create the production ready files locally, tarball those files (dist/ and api/) and sftp them to the server, where they can be uncompressed. This solution is probably the most elegant: it doesn’t copy the source code of our client application to the server, which always makes it more secure. However, doing it manually can get messy and tedious. People generally create serve scripts that do the whole process for them, but this is relatively advanced and falls out of the scope of this post, if anything does!

Another option is to pull the source from github into the server, where the production ready files are generated (with scripts like our deploy). This involves doing some work on the server. Also, in this case, the source code ends up on the server unless we manually remove it.

The final option, and the one we will be using, as we consider it improves our workflow, is to generate the production ready files locally, push them to the github repo and pull them from the server. This is mostly the reason why dist/ is not in our .gitignore.

Few words about scripts

If you followed along through the whole post, besides earning my deepest respect, you will have seen that we added a bunch of scripts that come in very handy. Scripts really do improve our workflow. Here you can see some examples of how flexible they can get.

For now, I’ll just add a final script that I think can be very useful: it will help us get rid of the dist/ and build/ folders. We will call it “clean” and add it to package.json:

"scripts": {
  "webserver": "node api/src/server.js",
  "test": "mocha --compilers js:babel-core/register --require ./test/test_helper.js 'test/**/*.@(js|jsx)'",
  "test:watch": "npm run test -- --watch",
  "build": "node_modules/.bin/webpack",
  "dev": "node_modules/.bin/webpack-dev-server",
  "lint": "eslint . --ext .js --ext .jsx",
  "deploy": "NODE_ENV=production webpack -p",
  "clean": "rm -rf app/dist app/build"
}

What would come next?

As you probably noticed, this post became a beast. In this final section (I promise!) I will just go over other improvements the boilerplate could receive in the near future.

Multiple webpack configs: If you remember properly, in the section where we created the production ready files we had to do some tweaks to our configuration. This is OK for now, because these were just a couple of changes that can be easily spotted and maintained. However, it is an antipattern. The proper way to go would be to have a webpack configuration for the development environment (generating /build) and one for the production environment (generating /dist). And that’s not all: we could also use a webpack configuration for the backend code (/api).

Advanced deploy scripts: This way we could develop some nice workflows that ease our job and improve the security of our web application by not pushing the source code to the server. You might also want to read into Travis CI.

PostCSS: Transforming styles with JS plugins, together with Autoprefixer, a plugin that adds vendor prefixes to CSS rules.

Inline styles.

Inline images.

Inline fonts.

npm’s website is full of very useful packages we might need in the near future; here I compiled a short list:

Fluxible or Flummox for isomorphic / universal / portable applications. Or, in plain English, libraries that will help us deal with server-side rendering to support SEO.

History to easily manage session history in browsers.

Socket.io to make real-time applications possible; maybe you want to have social notifications in your app?

lru-memoize for memoization.

webpack-merge for advanced webpack configuration.

Standing on the shoulders of giants

Here is a list of some of the most relevant bibliography links and github repos I used to create this boilerplate!

Final words

If you made it this far you really deserve an applause.

I hope this post helped you somehow. If you find any issues or have any problems, I’d be glad to hear about them! Cheers!

@mezod