If you’re like me, you probably have a half dozen projects festering in various states of decay, all with different build processes. The frequency at which webpack, webpack loaders, Babel, etc… increment versions, coupled with the insane pace of new “best practices” in the frontend world, makes build maintenance feel like a full-time job. An insanity-inducing, bug-riddled nightmare of a job.

In this article, I’ll explain how I set up a version controlled npm package that I use to build and develop all my frontend projects.

Now you don’t have to include webpack, its loaders, and all the other build tools you need in every new project.

The config’s structure

```
/src                <- dev files
  /bin              <- scripts that will be exported
  /webpack          <- webpack configs
/lib                <- output files (built)
.babelrc            <- babel config
package.json        <- :-)
postcss.config.js   <- postcss config
tsconfig.json       <- typescript config
```

Clean up

For our first script, let’s write something that clears out certain directories and then remakes them. It’s not the flashiest example, but it’s necessary and will be good to illustrate how all the pieces of this repo come together.

First, let’s make our source file, src/bin/clean.js.

```javascript
#!/usr/bin/env node

import shell from 'shelljs';

const targets = ['dist/client', 'dist/server'];

shell.echo('clean: removing targets:', targets);
shell.rm('-rf', targets);

shell.echo('clean: creating targets:', targets);
shell.mkdir('-p', targets);
```

This will import the shelljs package and then run two commands (clear out some folders, and create some folders). Since we’re writing this in ES6, we’re going to need to transpile to ES5 so projects can import it without having to transpile themselves.

Transpiling

Let’s install the packages we need to make this script, and the transpilation work.

```
npm install babel-cli babel-core babel-preset-es2015 babel-preset-stage-0 babel-register babel-runtime shelljs
```

Babel and its associated packages let us transpile code from ES6 -> ES5 (it can do a lot more than that as well!). Shelljs lets us run shell commands in node via an API.

Next, tell babel which presets to use in ./.babelrc:

```json
{
  "presets": ["es2015", "stage-0"]
}
```

In our ./package.json, let’s write a script that will do our transpilation

```
"scripts": {
  "clean": "rm -rf lib",
  "premake": "npm run clean",
  "make": "babel ./src -d ./lib"
},
"bin": {
  "pwln-clean": "./lib/bin/clean.js"
}
```

Alright let’s take a look at what we just did. We created a few scripts for internal usage in the “scripts” key of package.json.

The “scripts” section of package.json is used for internal scripts, or things we want to be contained within this package.

clean: removes our /lib (built files) directory

premake: runs automatically before every “npm run make” and calls the clean script above

make: uses the babel CLI to recursively transpile all code in the ./src folder and output it to the ./lib folder

Bin there bin that

The “bin” section defines scripts we want 3rd party consumers to be able to invoke once they install the package.

pwln is an arbitrary prefix; use whatever you decide to call your package. In this case, it stands for paul walker loves node.

pwln-clean: this runs our built clean.js file

Updating .gitignore

Once you get the above set up, you’ll notice that every time you npm run make it will add your transpiled code to ./lib, which git will say should be committed. However, we don't really need to add the transpiled code to git, since given the same src files the output will always be the same.

Let’s take care of this and create/update .gitignore

```
node_modules/*
npm-debug.log
.DS_Store
lib/
```

The last step is to ensure that our package always has the most up to date /lib code when installed. For this, we need to update package.json:

```
"scripts": {
  "prepublish": "npm run make"
}
```

For a list of all the groovy npm hooks and when they get run, check out the npm docs

Let’s get this into Github

In order to use this code in our other projects, we have to do three things:

1. chmod the lib folder
2. Push it to github
3. Publish the package to an npm registry

```
chmod -R +x ./lib
```

so we can access the bin scripts.

To push it to github, add a remote for the github url:

```
git remote add origin <url>
```

To publish a package to the npm registry:

```
npm publish
```

You’ll need an npm user set up, the CLI will guide you through the process if you don’t have one already.

Subsequent code changes and publishes must increment the package version, so run

```
npm version patch
```

(or minor / major, as appropriate) before publishing. Check out npm’s guide to versioning here.

Import and run

Once you’ve published, you can install the package in any npm initialized project by doing

```
npm install <package_name>
```

For local development, you’ll want a way to test your packages before publishing. You can point a dependency at a local or remote location by replacing the version number in package.json with a string saying where to grab the package.

For example,

```
"dependencies": {
  "pwln": "file:///<path_to_package>"
}
```

or

```
"dependencies": {
  "pwln": "https://<github_url>"
}
```

With this method, you must clear out your old version of the package, or increment the version and run npm update

Alternatively, you can locally link packages to pull directly from the source.

In your config project’s directory, run

npm link

And in the importing project’s directory, run

npm link <package_name>

Love, hate, and webpack

Webpack is an amazing tool and understanding how it works is a fundamental part of being a frontend engineer. However, for a lot of devs, setting up webpack and debugging build errors is akin to burning in the 9th circle of hell. In this section, I’ll go over my webpack build, what each piece does, and how to abstract the entire build process into a nice little package.

The config code will make some assumptions about the projects that will import it. Namely, the folder structure. Feel free to substitute folders, names, etc… to best fit your use case.

```
/app
  /client
    index.js  <- client entry point
  /server
    index.js  <- server entry point
/dist
  /client     <- client webpack output
  /server     <- server webpack output
/images
/scss         <- includes, mixins, colors, etc...
/public       <- misc files served by express
```

Modern frontend projects tend to require a build process that outputs both a server and client bundle. The majority of code is shared, but one bundle will be run in node and start an http server, while the other will be downloaded and run in the browser. If you’re unfamiliar with this paradigm, I’d recommend researching universal javascript before you continue.

Descent into ~~madness~~ config

The webpack configuration is split up into three files.

client config

server config

configuration shared between the two

Additionally, each config file changes for production vs. development builds.

Before we get into the config itself, let’s define some helper functions to resolve paths, and install the dependencies we’ll need (including eslint-loader and native-promise-only, which the configs below reference):

```
npm install app-root-dir babel-eslint babel-loader eslint-loader url-loader awesome-typescript-loader svg-react-loader sass-loader css-loader postcss-loader webpack-node-externals webpack uglifyjs-webpack-plugin assets-webpack-plugin extract-text-webpack-plugin chalk babel-preset-react autoprefixer native-promise-only
```

This paths file is a key-map of paths that we will be using in our config.

```javascript
/* src/webpack/paths.js */

import appRootDir from 'app-root-dir';
import path from 'path';

export function resolvePath(...paths) {
  return path.resolve(appRootDir.get(), ...paths);
}

export default {
  clientIndex: resolvePath('./app/client/index.js'),
  serverIndex: resolvePath('./app/server/index.js'),
  clientOutput: resolvePath('./dist/client'),
  serverOutput: resolvePath('./dist/server'),
  app: resolvePath('./app'),
  scss: resolvePath('./scss'),
  shared: resolvePath('./app/shared'),
  client: resolvePath('./app/client'),
  server: resolvePath('./app/server'),
  images: resolvePath('./images'),
  config: resolvePath('./config'),
  test: resolvePath('./test'),
  dist: resolvePath('./dist'),
  public: resolvePath('./public')
};
```

Shared Config

webpack.shared.config.js defines config options that will be used in both the client and server webpack builds.

```javascript
/* src/webpack/webpack.shared.config.js */

import paths from './paths';

export default function webpackSharedConfig(options) {
  const {
    // nodeEnv is the environment we are building for (production, development, ...)
    nodeEnv,
    port = 8080
  } = options;

  const isDev = nodeEnv !== 'production';

  // determines the type of source maps we want to generate. influences build/rebuild time.
  // https://webpack.js.org/configuration/devtool/
  const devTool = isDev ? 'cheap-module-eval-source-map' : 'hidden-source-map';

  // controls when webpack emits warnings for asset file-sizes
  const performance = isDev ? false : { hints: 'warning' };

  // the path to webpack's generated output. in production, this could point to a CDN
  const publicPath = '/dist/client/';

  // what file extensions should webpack automatically resolve?
  // eg: import myFile from './myFile.js' vs import myFile from './myFile'
  const resolveExtensions = ['.ts', '.tsx', '.js', '.jsx', '.json'];

  // what aliases should be usable?
  // eg: import myFile from 'scss/myFile.scss' will reference paths.scss/myFile.scss
  const resolveAlias = {
    app: paths.app,
    scss: paths.scss,
    shared: paths.shared,
    client: paths.client,
    server: paths.server,
    images: paths.images,
    config: paths.config,
    test: paths.test,
    dist: paths.dist,
    public: paths.public
  };

  // run eslint-loader before webpack compiles
  const eslintLoader = {
    enforce: 'pre',
    test: /\.jsx?$/,
    exclude: /node_modules/,
    loader: 'eslint-loader',
    options: {
      failOnError: false,
      emitWarning: true,
      quiet: false,
      failOnWarning: false
    }
  };

  // js loader, ignore node_modules and .babelrc (use presets defined here)
  const babelLoader = {
    test: /\.jsx?$/,
    exclude: /node_modules/,
    loader: 'babel-loader',
    query: {
      presets: ['es2015', 'react', 'stage-0'],
      babelrc: false
    }
  };

  // inline small fonts/images as data urls, emit the rest with hashed filenames
  const urlLoader = {
    test: /\.woff2?$|\.jpe?g$|\.ttf$|\.eot$|\.png$/,
    loader: 'url-loader?limit=3000&name=[name]-[sha512:hash:base64:7].[ext]'
  };

  const tsxLoader = {
    test: /\.tsx?$/,
    loader: 'babel-loader!awesome-typescript-loader'
  };

  const inlineSvgLoader = {
    test: /\.svg?$/,
    loader: 'svg-react-loader',
    exclude: /node_modules/
  };

  return {
    nodeEnv,
    isDev,
    resolveAlias,
    publicPath,
    resolveExtensions,
    devTool,
    performance,
    eslintLoader,
    babelLoader,
    urlLoader,
    inlineSvgLoader,
    tsxLoader,
    paths,
    port
  };
}
```
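The environment switching above boils down to three values. Pulled out on their own, the toggles look like this (a sketch of the same logic; deriveEnvOptions is my name for it, not an export of the config):

```javascript
// the same environment toggles the shared config computes, isolated
// so they are easy to reason about
function deriveEnvOptions(nodeEnv) {
  const isDev = nodeEnv !== 'production';
  return {
    isDev,
    // fast incremental source maps in dev, hidden maps in production
    devTool: isDev ? 'cheap-module-eval-source-map' : 'hidden-source-map',
    // only emit asset-size warnings for production builds
    performance: isDev ? false : { hints: 'warning' }
  };
}

console.log(deriveEnvOptions(process.env.NODE_ENV));
```

Note that any value other than 'production' (including undefined) counts as development, which is why the bin scripts pass process.env.NODE_ENV straight through.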

Server config

Here’s the specific config for the server bundle. It expects the options object generated by the shared config as an argument.

```javascript
/* src/webpack/webpack.server.config.js */

import nodeExternals from 'webpack-node-externals';
import webpack from 'webpack';
import UglifyJsPlugin from 'uglifyjs-webpack-plugin';
import { CheckerPlugin } from 'awesome-typescript-loader';

export default function webpackServerConfig(options) {
  const {
    nodeEnv,
    isDev,
    resolveAlias,
    publicPath,
    resolveExtensions,
    devTool,
    eslintLoader,
    babelLoader,
    urlLoader,
    performance,
    inlineSvgLoader,
    tsxLoader,
    paths,
    port
  } = options;

  const plugins = [
    // expose variables available in project
    new webpack.DefinePlugin({
      'process.env.NODE_ENV': JSON.stringify(nodeEnv),
      'process.env.PORT': port,
      __CLIENT__: false,
      __SERVER__: true,
      __DEV__: isDev,
      __PRODUCTION__: !isDev
    }),

    // used to report async errors (webpack in --watch mode)
    // https://github.com/s-panferov/awesome-typescript-loader
    new CheckerPlugin()
  ];

  if (!isDev) {
    // minifies js
    plugins.push(new UglifyJsPlugin());
  }

  return {
    target: 'node',

    node: {
      __dirname: true,
      __filename: true
    },

    devtool: devTool,

    // treat node_modules as external; only bundle things on this whitelist
    externals: nodeExternals({
      whitelist: [
        /\.(eot|woff|woff2|ttf|otf)$/,
        /\.(svg|png|jpg|jpeg|gif|ico)$/,
        /\.(mp4|mp3|ogg|swf|webp)$/,
        /\.(css|scss|sass|sss|less)$/
      ]
    }),

    performance,

    entry: {
      // the entry point for the server build
      // requires all subsequent files
      index: paths.serverIndex
    },

    // where webpack should output built files
    output: {
      path: paths.serverOutput,
      pathinfo: true,

      // how webpack should reference the built files
      publicPath,

      filename: '[name].js',
      libraryTarget: 'commonjs2'
    },

    resolve: {
      modules: ['node_modules'],
      extensions: resolveExtensions,
      alias: resolveAlias
    },

    // rules for what loaders should be used on various file types
    module: {
      rules: [
        babelLoader,
        inlineSvgLoader,
        urlLoader,
        tsxLoader,
        {
          test: /\.s?css$/,
          use: ['css-loader?minimize', 'postcss-loader', 'sass-loader']
        }
      ]
    },

    plugins
  };
}
```
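The nodeExternals whitelist is easy to read backwards: everything imported from node_modules stays external (require’d at runtime) unless it matches one of the patterns, in which case it gets bundled. A quick sanity check of the same regexes (isWhitelisted is a hypothetical helper, not part of the config):

```javascript
// the same whitelist patterns passed to webpack-node-externals above
const whitelist = [
  /\.(eot|woff|woff2|ttf|otf)$/,
  /\.(svg|png|jpg|jpeg|gif|ico)$/,
  /\.(mp4|mp3|ogg|swf|webp)$/,
  /\.(css|scss|sass|sss|less)$/
];

// true if the import should still be bundled rather than left external
function isWhitelisted(request) {
  return whitelist.some((re) => re.test(request));
}

console.log(isWhitelisted('normalize.css')); // asset imports get bundled
console.log(isWhitelisted('express'));       // plain packages stay external
```

Bundling assets but not plain packages keeps the server build fast and small while still letting the server-rendered code import styles and images.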

Client config

Specific configuration for the client side js bundle.

```javascript
/* src/webpack/webpack.client.config.js */

import AssetsPlugin from 'assets-webpack-plugin';
import ExtractTextPlugin from 'extract-text-webpack-plugin';
import webpack from 'webpack';
import UglifyJsPlugin from 'uglifyjs-webpack-plugin';
import { CheckerPlugin } from 'awesome-typescript-loader';

export default function webpackClientConfig(options) {
  const {
    nodeEnv,
    isDev,
    resolveAlias,
    publicPath,
    resolveExtensions,
    devTool,
    eslintLoader,
    babelLoader,
    urlLoader,
    performance,
    inlineSvgLoader,
    tsxLoader,
    paths,
    port
  } = options;

  const plugins = [
    // create a separate vendor.js bundle on output that includes packages imported from node_modules
    // add a chunkhash to the filename for cache busting (eg: vendor-029192849.js)
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',
      filename: 'vendor-[chunkhash].js',
      minChunks: function(module) {
        return module.context && module.context.indexOf('node_modules') !== -1;
      }
    }),

    // create a separate manifest.js bundle on output that gives instructions on how to read the various output chunks
    new webpack.optimize.CommonsChunkPlugin({
      name: 'manifest',
      minChunks: Infinity
    }),

    // create an assets.json file on output that has a map of the filenames/location of built files, including their hashes
    // eg: { vendor: { js: <path>/vendor-029192849.js } }
    // this is so we can include the script links to built js/css files when we generate HTML
    new AssetsPlugin({
      filename: 'assets.json',
      path: paths.clientOutput,
      prettyPrint: true,
      includeManifest: 'manifest',
      metadata: {
        publicPath
      }
    }),

    // extract required css into a single chunkhashed file
    new ExtractTextPlugin({ filename: '[name]-[chunkhash].css', allChunks: true }),

    // polyfill for native promises
    new webpack.ProvidePlugin({
      'Promise': 'native-promise-only'
    }),

    // expose variables available in project
    new webpack.DefinePlugin({
      'process.env.NODE_ENV': JSON.stringify(nodeEnv),
      'process.env.PORT': port,
      __CLIENT__: true,
      __SERVER__: false,
      __DEV__: isDev,
      __PRODUCTION__: !isDev
    }),

    // used to report async errors (webpack in --watch mode)
    // https://github.com/s-panferov/awesome-typescript-loader
    new CheckerPlugin()
  ];

  if (!isDev) {
    plugins.push(new UglifyJsPlugin());
  }

  return {
    target: 'web',

    devtool: devTool,

    entry: {
      index: paths.clientIndex
    },

    performance,

    output: {
      path: paths.clientOutput,
      pathinfo: true,

      // js bundle that contains code from all entry points + webpack runtime
      // include chunkhash for versioning/cache busting
      filename: '[name]-[chunkhash].js',
      chunkFilename: '[name]-[chunkhash].js',
      publicPath
    },

    resolve: {
      modules: ['node_modules'],
      extensions: resolveExtensions,
      alias: resolveAlias
    },

    module: {
      rules: [
        babelLoader,
        inlineSvgLoader,
        urlLoader,
        tsxLoader,
        {
          test: /\.s?css$/,
          // can't exclude node_modules: we have to process mui styles too
          use: ExtractTextPlugin.extract({
            use: ['css-loader', 'postcss-loader', 'sass-loader']
          })
        }
      ]
    },

    plugins
  };
}
```
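The assets.json emitted by AssetsPlugin is what lets the server render correct script tags even though the filenames are hashed. Here’s a hypothetical helper consuming a map shaped like assets-webpack-plugin’s output (scriptTagsFor and the example filenames are illustrative, not part of the package):

```javascript
// build <script> tags from an assets-webpack-plugin style map
// eg: { vendor: { js: '/dist/client/vendor-029192849.js' }, ... }
function scriptTagsFor(assets, chunks) {
  return chunks
    .filter((chunk) => assets[chunk] && assets[chunk].js)
    .map((chunk) => `<script src="${assets[chunk].js}"></script>`)
    .join('\n');
}

// example of what a build might emit (hypothetical hashes)
const exampleAssets = {
  manifest: { js: '/dist/client/manifest.js' },
  vendor: { js: '/dist/client/vendor-029192849.js' },
  index: { js: '/dist/client/index-8f1c2d3.js' }
};

// manifest must come first so webpack knows how to load the other chunks
console.log(scriptTagsFor(exampleAssets, ['manifest', 'vendor', 'index']));
```

The server entry point can read assets.json at render time and inject these tags into its HTML template.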

3rd party configs

In order to use postcss, we need to provide a config .js file so it knows what plugins we want to use.

```javascript
/* postcss.config.js */

module.exports = {
  plugins: [
    require('autoprefixer')
  ]
};
```

Compiling

Now that we have our configs, we need to run a webpack build with them.

```javascript
/* src/webpack/build.js */

import webpack from 'webpack';
import webpackClientConfig from './webpack.client.config.js';
import webpackServerConfig from './webpack.server.config.js';

// expects options built from webpack.shared.config.js
function build(options) {
  // you can pass multiple configs to webpack in an array to compile multiple builds
  // in this case we pass the server + client build configs
  const compiler = webpack([webpackClientConfig(options), webpackServerConfig(options)]);

  return compiler.run((err, stats) => {
    if (err) {
      console.error(err);
      return;
    }

    console.log(stats.toString({ colors: true }));
  });
}

export default build;
```
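compiler.run is callback-based; if you’d rather compose builds with promises, a small wrapper works with any compiler object. This is an illustrative sketch, not part of the repo, shown with a stub compiler so it runs on its own:

```javascript
// wrap a callback-style compiler.run in a promise:
// reject on fatal errors, resolve with the stats object otherwise
function runCompiler(compiler) {
  return new Promise((resolve, reject) => {
    compiler.run((err, stats) => {
      if (err) {
        reject(err);
        return;
      }
      resolve(stats);
    });
  });
}

// stub standing in for webpack([...]) so the sketch is runnable on its own
const stubCompiler = {
  run(cb) { cb(null, { toString: () => 'ok' }); }
};

runCompiler(stubCompiler).then((stats) => console.log(stats.toString()));
```

With a real compiler you could then `await runCompiler(...)` and decide the process exit code from stats.hasErrors().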

Finally, we need a script that will pass in our options to webpack.shared.config.js and run the previous build.js file.

```javascript
#!/usr/bin/env node

/* src/bin/build.js */

var chalk = require('chalk');

// these are ES6 files so we need to require using the .default syntax
var build = require('../webpack/build.js').default;
var webpackSharedConfig = require('../webpack/webpack.shared.config.js').default;

const options = webpackSharedConfig({
  nodeEnv: process.env.NODE_ENV
});

console.log(chalk.green('Building project with pwln'));
console.log(chalk.blue('Options:'), options);

build(options);
```

Adding the bin

Like we added the bin definition in package.json for our clean script, we must do the same thing now for our new webpack build.

```
/* package.json */

"bin": {
  "pwln-clean": "./lib/bin/clean.js",
  "pwln-build": "./lib/bin/build.js"
}
```

We are mapping the lib/bin/build.js file to the command pwln-build so it will be available in projects that import this package.

If you’re using npm link to install and test this project locally, you will have to re-run npm link each time you update the bin key

Recap so far

We just went through a ton of code. Let’s take a quick breather and recap everything we’ve accomplished so far.

We’ve created an npm package that has two commands.

pwln-clean

removes and remakes output folders

pwln-build

runs two webpack builds, one for the client and one for the server

outputs vendor, manifest, and bundle js files which are hashed for cache-busting

outputs an assets.json file that maps the above to their hashed filenames

handles ES6, typescript, jsx, images, fonts, and sass files

compiles differently for production and development

We’ve mapped these commands to .js scripts in the lib/bin directory, which is created by running babel on the src directory (ES6 transpilation).

We’ve (optionally) published our package to NPM and can npm install it anywhere we want.

Importing the package

Before we go any further, let’s test what we’ve made so far by using it in an actual project. I’ve set up a barebones universal react project. Feel free to fork or copy from it what you see fit!

As mentioned earlier, it is very important that the project’s input/output paths match the config. Other than that, the build should work regardless of implementation details such as state management, routing, etc… The example project leaves these things out and only includes the minimum to get a working example of react rendering html on the server/client.

If you’ve matched up the paths in your own project, installing is as easy as

```
npm install <your_config_package>
```

or, if you want to use this example, npm install pwln.

Once the package is installed, add it to a package.json script like so

```
/* package.json */

"scripts": {
  "clean": "pwln-clean",
  "build": "npm run clean && NODE_ENV=development PORT=8081 pwln-build",
  "start": "node dist/server"
}
```

Running npm run build should log webpack output and result in files being generated in dist/server and dist/client. To run the server, run the index.js file in dist/server; in the case above, npm start.

Development tweaks

The server should be available on port 8081, or whatever port is set with process.env.PORT. It’s great that we have our build command, but there’s no way we could sanely develop with it. Every time we change a file, we’d have to manually kill the node server, npm run build, and start the server again. We need a separate mechanism that watches for file changes, continually rebuilds, and then restarts the node server.

Since we’re running our JS on the server to build HTML, we need to ensure that every rebuild has synchronized client and server output. If the client bundle changes on output but the server’s doesn’t, React will complain about invariants (plus there would be a lot of bugs!)

Let’s get back into our config repo and create a serverWatcher.js file. This file will use chokidar to listen for file changes and start/stop our node server in response to these changes.

Before making these files, install the necessary dependencies

```
npm install chokidar
```

```javascript
/* src/webpack/serverWatcher.js */

import appRootDir from 'app-root-dir';
import chalk from 'chalk';
import chokidar from 'chokidar';
import path from 'path';

const spawn = require('child_process').spawn;

let server;

function startNodeServer(callback) {
  let stopWatcher = false;

  console.log(chalk.green('Starting node server...'));

  server = spawn('node', ['dist/server/index.js']);

  // logs out things going to stdout, eg: console.log
  server.stdout.on('data', function(data) {
    process.stdout.write(data);
  });

  // logs out things going to stderr, eg: console.warn, errors
  server.stderr.on('data', function(data) {
    process.stdout.write(data);
  });

  // triggered by errors and our server.kill('SIGTERM') call below
  server.on('close', (code, signal) => {
    server = null;

    // on a critical error
    if (code === 1) {
      // set the parent's status
      callback(false);

      // set this level's status
      stopWatcher = true;
    }

    if (!stopWatcher) {
      server = startNodeServer(callback);
    }
  });

  return server;
}

function serverWatcher(callback) {
  // chokidar monitors for file changes at this glob
  const watcher = chokidar.watch(path.resolve(appRootDir.get(), './dist/server/*'));

  watcher.on('ready', () => {
    server = startNodeServer(callback);
  });

  // triggered on file change
  watcher.on('change', (path, stats) => {
    console.log(chalk.red('Detected changes in dist/server, attempting to restart node server.'));

    if (server) {
      console.log(chalk.red('Killing old server...'));

      // sends a shut down signal to the server (non error)
      server.kill('SIGTERM');
    }
  });

  watcher.on('error', error => console.log(chalk.red(`Watcher error: ${error}`)));

  // when we want to shut down this process, make sure the child server dies with it
  process.on('SIGINT', () => {
    process.exit(0);
  });

  return watcher;
}

export default serverWatcher;
```

Now that we’ve created the functions to start/stop the node server, we need to programmatically tell webpack to run in watch mode.

```javascript
/* src/webpack/watch.js */

import chalk from 'chalk';
import webpack from 'webpack';
import webpackClientConfig from './webpack.client.config.js';
import webpackServerConfig from './webpack.server.config.js';

// our node server start/stop functions
import serverWatcher from './serverWatcher.js';

function watch(options) {
  const compiler = webpack([webpackClientConfig(options), webpackServerConfig(options)]);

  let server;

  // serverWatcher.js needs to be able to tell watch.js (this file) if it should restart
  function setServer(bool) {
    server = bool;
  }

  // .watch will continually rebuild on file changes (vs. .run)
  compiler.watch({
    aggregateTimeout: 300
  }, (err, stats) => {
    if (err) {
      console.error(err);
      return;
    }

    // only log webpack errors when watching
    console.log(stats.toString('errors-only'));

    if (!server) {
      setServer(true);

      console.log(chalk.green.bold('Starting node serverWatcher...'));

      serverWatcher(setServer);
    }
  });
}

export default watch;
```

Alternatives are good

You can sub out these two files with whatever implementation of webpack watch and node server restart that you want. Instead of writing the outputs as files, you could use something like webpack-dev-server to serve your assets from memory. Or instead of using chokidar, you could use nodemon.

Ultimately, the exact method isn’t important, and since everything is contained within this repo, changes will be reflected in all of your future projects :).

More bin

The last steps are adding a bin script that starts watch mode and registering it in package.json.

```javascript
#!/usr/bin/env node

/* src/bin/dev.js */

var chalk = require('chalk');
var watch = require('../webpack/watch.js').default;
var webpackSharedConfig = require('../webpack/webpack.shared.config.js').default;

var options = webpackSharedConfig({
  nodeEnv: process.env.NODE_ENV,
  port: process.env.PORT
});

console.log(chalk.green('Running project in watch mode with pwln'));
console.log(chalk.blue('Options:'), options);

watch(options);
```

And now the package.json

```
/* package.json */

"bin": {
  "pwln-clean": "./lib/bin/clean.js",
  "pwln-build": "./lib/bin/build.js",
  "pwln-dev": "./lib/bin/dev.js"
}
```

Don’t forget to re-run npm link after adding the new bin if you’re using that to develop the config locally.

TL;DR: Putting it all together

If you stuck with it, congrats! You now have your very own version controlled frontend build process published to npm (or at least in a git repo somewhere). It can be installed in any of your projects and should just “work”.

It supports:

es6

jsx

svgs, images, fonts

sass, postcss

typescript

And whatever else cool junk you decide to add!

Here’s the version of the config we built in this article

And here’s a disgustingly simple project that uses it to build