by Jonas Verhoelen

Node.js brought the programming language JavaScript from the browser to the server side and into the command lines of this world. Since then, the Node.js ecosystem has become an established choice for software development, and Express.js is a popular choice for web applications. According to The State of JavaScript (and Google), TypeScript is the most popular superset language for developing JavaScript applications.

This article covers the basics of Express.js and Node.js as well as recipes that build on each other to bootstrap a web application. It presents suggestions and approaches for developing, testing, and building locally. Over time, a boilerplate project for web apps on this stack will grow out of these steps.

Further articles in this series will go deeper into the matter. As in a cookbook, recipes will describe how challenges in web development can be tackled.

Setup of the Node Express application in TypeScript – Step 1

On the development system, Node.js is needed. npm (Node Package Manager) – build tool and interface to the huge Node ecosystem – is included and sufficient for our project. It is recommended to use a version manager such as nvm (Node Version Manager) for Node. This way the Node version can be changed in different projects and new versions are quickly installed. The file .nvmrc stores the Node version to be used in a project.
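The .nvmrc file is a plain one-line text file. For this project it could, for example, contain the LTS alias that is installed in the setup step below:

```
lts/dubnium
```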

nvm, Node and npm can now be tested directly on the step #1 branch of the example repository of this series of articles. Let’s clone the initial branch:

git clone --branch 01-setup git@github.com:jverhoelen/node-express-typescript-boilerplate.git

This version of the code contains generously distributed bootstrapping of an ExpressServer and the required configurations tsconfig.json and package.json . The status of the project can be tested as follows:

nvm use
nvm install lts/dubnium # only required once
npm install
npm start

The Express server now runs, but so far it does nothing beyond starting and being terminated again. To explain and get to know the most important concepts of Express.js, the first feature will be added in the next step.

Introducing Express.js middlewares and request mappings – Step 2

Express.js is all about HTTP Request and Response. They describe exactly what you expect: body, parameters, headers, etc. The request has state – it can be read, extended, and modified at any point of processing.

Request processing is performed entirely by a chain of request handlers (also called middlewares). These are functions that receive the request, the response, and a next function as parameters. The request can be read and modified as required. The response is an API for building the HTTP response and sending it: JSON body, HTML string, template rendering, redirects and much more are available. The next function is called when the processing of the request is supposed to be passed to the next request handler in the chain. The last request handler in the chain usually sends the response and thereby completes the HTTP request handling. Middlewares can be defined globally in the order in which they are to be executed.
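To make these concepts concrete, here is a minimal, hedged sketch of such a middleware chain. It deliberately uses small structural stand-in types instead of the real Express imports so that it runs without a server; in the actual application, Request, Response and NextFunction come from the express package.

```typescript
// Structural stand-ins for Express's Request, Response and NextFunction,
// so the middleware chain can be demonstrated without starting a server.
type Req = { method: string; path: string; state: Record<string, unknown> }
type Res = { body?: unknown; json(body: unknown): void }
type Next = () => void
type Handler = (req: Req, res: Res, next: Next) => void

// A middleware that modifies the request and passes control on via next()
const tagRequest: Handler = (req, _res, next) => {
    req.state.tagged = true
    next()
}

// The last handler in the chain completes the request by sending a response
const sendDetails: Handler = (req, res) => {
    res.json({ path: req.path, tagged: req.state.tagged })
}

// A tiny chain runner that mimics how Express walks its middleware stack
function runChain(handlers: Handler[], req: Req, res: Res): void {
    let index = 0
    const next: Next = () => {
        const handler = handlers[index]
        index += 1
        if (handler) handler(req, res, next)
    }
    next()
}
```

Calling runChain([tagRequest, sendDetails], req, res) first lets tagRequest mutate the request and then lets sendDetails answer it, mirroring the global middleware order described above.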

In addition, further middleware chains can be put in front of individual request mappings. The following code adds three middlewares to request handling that are often required for web applications and are not activated by default:

private setupStandardMiddlewares(server: Express) {
    server.use(bodyParser.json())
    server.use(cookieParser())
    server.use(compress())
}

Request mappings define how a request pattern is handled after the global middleware has been executed. For example:

server.get('/api/cat/:catId', noCache, this.catEndpoints.getCatDetails)

After the global middlewares, a GET request to /api/cat/123 is additionally handled by the noCache middleware before the method this.catEndpoints.getCatDetails sends the response to the client.
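The noCache middleware itself is small. The repository's implementation is not reproduced here, but a hedged sketch of such a handler could look like this; the Res type is a minimal stand-in for Express's Response, and the exact header values are an assumption:

```typescript
// Minimal stand-in for the part of Express's Response used here
type Res = { set(name: string, value: string): void }
type Next = () => void

// Hypothetical noCache middleware: it only sets response headers that
// forbid caching, then passes the request on to the next handler.
const noCache = (_req: unknown, res: Res, next: Next): void => {
    res.set('Cache-Control', 'no-store, no-cache, must-revalidate')
    res.set('Pragma', 'no-cache')
    next()
}
```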

The state of step #2 of the application illustrates these concepts. In addition, some npm dependencies for the middlewares used and their TypeScript types were added. The state after step #2 can be checked out; the changes since step #1 can be seen in the pull request.

After running npm install and restarting the server, the REST endpoint is available, for example at http://localhost:8000/api/cat/123 . If the cat ID is less than 90, only status 404 is returned.

Cutting and designing Express.js applications – Step 3

This article recommends a domain-oriented structure of folders and files. This means that code belonging to one domain and feature lives close together. For example, code for business logic, data access, mapping, and presentation of a cat API should sit as close together as possible. These interdependent components are likely to be developed and tested together. They should not be divided according to the type of file (e.g. repository, model, service, middleware), but according to their domain. Sometimes there are files with classes, types, constants, and functions that cannot be assigned to a single domain. Code for cross-cutting tasks such as logging, security, or telemetry can be structured in folders like middlewares or security, depending on its amount.

In the next version of the repository, branch 03-server-application-design, the previously server-side files will be moved to service/server/ for clarification. There we now find a CatService and a CatRepository in the service/server/cats/ directory. A new middleware is also added, so all custom middlewares of the repository are now in service/server/middlewares/ .

> tree service
service
└── server
    ├── Application.ts
    ├── ExpressServer.ts
    ├── cats
    │   ├── Cat.d.ts
    │   ├── CatEndpoints.ts
    │   ├── CatRepository.ts
    │   └── CatService.ts
    ├── index.ts
    ├── middlewares
    │   ├── DatadogStatsdMiddleware.ts
    │   ├── NoCacheMiddleware.ts
    │   └── ServiceDependenciesMiddleware.ts
    └── types
        ├── CustomRequest.d.ts
        ├── connect-datadog
        │   └── index.d.ts
        └── hot-shots
            └── index.d.ts

After checking out the 03 branch (pull request with changes since step #2), the server is first restarted manually. The new REST endpoints /api/cat , /api/statistics/cat and /api/cat/<id> contain interesting information about cats (whose attributes may remind you of a band from Birmingham, UK). They can be tried manually in the browser.

In this step, the application has grown by a handful of files, whose correctness will be verified with unit and integration tests in the next section.

Unit and integration testing in Express.js applications – Step 4

Unit tests help verify the behavior of one or more related code components such as classes or functions. Stubbing and mocking are used when the behavior or result of dependent components is to be deliberately controlled in a unit test, or when their calls are to be checked for correctness.

In integration tests, features are tested in their entirety, e.g. at the level of an HTTP API. For this purpose, the service is started temporarily or rolled out to a continuously deployed environment such as a development stage. Integration tests can work with fake data in order to control the inputs for the application logic and thus obtain predictable test results.

The variety and combinability of JavaScript testing frameworks, libraries, and extensions is not easy to grasp. The article “An Overview of JavaScript Testing in 2019” by Vitali Zaidman is worth reading if you want to learn more about the world of JavaScript testing tools and their differences.

With the tools Mocha, Chai, Sinon and Supertest, I propose a combination that has established itself in a long-term project with this stack. UI tests, the top of the classic test pyramid, will be covered in a future article of this series.

The test-runner Mocha is known for being easy to combine with other tools. Only a manageable amount of configuration is required. Chai is the most popular library for Test Driven Development (TDD) and Behaviour Driven Development (BDD) assertions. In addition to the large basic vocabulary for the formulation of tests, Chai can also be easily extended by third-party libraries. Sinon is probably the most powerful and most used library for test spies, mocks and stubs in JavaScript and should therefore not be missing in the arsenal. Supertest is a suitable library to address REST APIs in integration tests.

In branch 04-unit-and-integration-tests, tooling is built in and a few unit tests are added. Repository, service and endpoint class as well as middleware will be tested. The test files are next to the implementations and end with “Spec.ts”. An advantage of this is that a test can be found immediately regardless of the features of the IDE. Also, changes in the files are sorted alphabetically in GitHub and GitLab pull and merge requests. This makes it easier for colleagues during the code review. The following example tests the CatService :

import * as sinon from 'sinon'
import { expect } from 'chai'
import { CatService } from './CatService'
import { exampleCats } from './exampleCats'

describe('CatService', () => {
    const sandbox = sinon.createSandbox()
    let catService: CatService
    let catRepository: any

    beforeEach(() => {
        catRepository = { getAll: sandbox.stub().returns(exampleCats) }
        catService = new CatService(catRepository)
    })

    describe('getCatsStatistics', () => {
        it('should reflect the total amount of cats', () => {
            expect(catService.getCatsStatistics().amount).to.eq(5)
        })

        it('should calculate the average age of all cats', () => {
            expect(catService.getCatsStatistics().averageAge).to.eq(69.2)
        })

        it('should calculate an average age of zero if the amount of cats is zero', () => {
            catRepository.getAll.returns([])
            expect(catService.getCatsStatistics()).to.deep.equal({ amount: 0, averageAge: 0 })
        })
    })
})

All unit tests can be run with npm run test:unit . Integration tests that verify behaviour of the running cats REST API are located in test/integration/ and their files end with “Test.ts”. They are executed with npm run test:integration . The example CatsApiTest.ts from the project looks like this:

import * as request from 'supertest'
import { expect } from 'chai'
import * as HttpStatus from 'http-status-codes'
import TestEnv from './TestEnv'

const { baseUrl } = TestEnv

describe('Cats API', () => {
    describe('Cat details', () => {
        it('should respond with a positive status code if the cat is known', () => {
            return request(baseUrl)
                .get('/api/cat/1')
                .expect(HttpStatus.OK)
        })

        it('should respond with 404 status code if the cat is not known', () => {
            return request(baseUrl)
                .get('/api/cat/666')
                .expect(HttpStatus.NOT_FOUND)
        })

        it('should respond with cat details data if the cat is known', () => {
            return request(baseUrl)
                .get('/api/cat/1')
                .expect((res: request.Response) => {
                    expect(res.body).to.deep.equal({
                        id: 1,
                        name: 'Tony Iommi',
                        breed: 'British Shorthair',
                        gender: 'male',
                        age: 71
                    })
                })
        })
    })
})

For tests of request handlers used in request mappings, the last part of processing an HTTP request, the library “expressmocks” is a good choice. Examples of its use can be found in the README on GitHub of “expressmocks” and in the branch for this step (see CatEndpointsSpec.ts).
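The commands npm run test:unit and npm run test:integration are ordinary npm scripts. A hypothetical scripts section wiring Mocha to ts-node could look like the following; the script names come from the article, but the globs and flags are assumptions and may differ from the repository's package.json:

```json
{
  "scripts": {
    "test:unit": "mocha --require ts-node/register 'service/**/*Spec.ts'",
    "test:integration": "mocha --require ts-node/register 'test/integration/**/*Test.ts'"
  }
}
```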

To familiarize yourself with the tooling, the branch can be checked out for this step:

git checkout 04-unit-and-integration-tests

The changes since step #3 are visible in this pull request.

Serving a frontend, code sharing and hot-reloading – Step 5

In this step, the application is extended so that the server delivers a simple frontend and so that frontend and backend can share code with each other. Tooling for Hot Module Replacement (HMR) is also installed and configured, which makes changes immediately visible and testable during development.

The terms Isomorphic JavaScript or Universal JavaScript may no longer be "hip" and are the subject of controversy. However, there are good reasons to share code between frontend and backend, for example interface definitions between the two or domain model functionality. To do this without much effort, the code base is divided into frontend, backend, and shared folders. Webpack, however, should only build two bundles, frontend and server, and both bundles should also have access to the files in shared. Frontend and backend bundles may import files from shared, but not the other way around. Likewise, files in the frontend and backend cannot import each other. Code to be shared must be provided via dependencies or the shared folder.
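What such shared code can look like is sketched below under the assumption of a hypothetical shared/cats module; the names and fields are illustrative, not taken from the repository. The point is that this code touches neither DOM nor Node.js APIs and can therefore be imported from both bundles.

```typescript
// Hypothetical contents of shared/cats/Cat.ts: a domain type and a pure
// function, usable by frontend and backend alike.
export interface Cat {
    id: number
    name: string
    age: number
}

// Pure function: no DOM, no Node APIs, so both bundles may use it
export function averageAge(cats: Cat[]): number {
    if (cats.length === 0) return 0
    const total = cats.reduce((sum, cat) => sum + cat.age, 0)
    return total / cats.length
}
```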



Code sharing via libraries should be treated with caution. For example, if a team maintains five microservices (or three more comprehensive self-contained systems), they may choose to share code in their own library in accordance with "Don't Repeat Yourself" (DRY). The team can decide whether to share business code, technical code, or both. However, they will have to pay the cost of all services depending on the shared library, especially if business code is shared. Changes to this library may require all services to be touched again to migrate to API changes. Releasing the library also makes the further development of the services dependent on the release process of the library, which means additional time and maintenance. In the area of microservices and self-contained systems, shared libraries containing business code are therefore not recommended. To avoid coupling, code duplication should be accepted instead. However, if the result is too much duplicate code across services, this may be an indication of incorrectly defined bounded contexts or a missing service. With shared code between frontend and backend, on the other hand, advantages can be achieved that do not have to be paid for with the expensive disadvantages of a shared library.

Webpack and awesome-typescript-loader are used to build frontend and backend. To build the TypeScript frontend, a separate tsconfig-frontend.json is needed, which configures the frontend TypeScript code to transpile to ES5, accept JSX syntax for React, and be able to interact with the DOM.

By using webpack-dev-middleware and webpack-hot-middleware as Express.js middlewares, the server is given the task of enabling Hot Module Replacement of the frontend during local development. Nodemon is used to reload the backend on code changes. It is a Node.js tool that restarts a Node process as soon as files in the configured path change. In the new definition of npm start, Nodemon now controls the execution of the Node.js code in TypeScript.
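Nodemon is usually driven by a small nodemon.json. A hypothetical configuration matching the folder layout from step 3 might look like this; the watched paths and the exec command are assumptions and may differ from the repository's file:

```json
{
  "watch": ["service/server"],
  "ext": "ts",
  "exec": "ts-node service/server/index.ts"
}
```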

This pull request shows all changes between step 4 and 5 and thus how the changes are implemented and configured correctly. How the development feels now can also be tested after checking out the branch for step 5.

At this point, the code base for this article is final. Which npm scripts are available from now on is explained in the documentation of the repository.

A glimpse into the boiling pot

The minimal Express.js application from the first step has become a technically more mature application that is easy to develop, test, and build. At the same time, some basic features of Express.js, Node.js, TypeScript and the familiar testing and build tools have become clear. With the Cat API, minimal functionality has been implemented, but no persistence or complex business logic has been established yet.

It should become clear that Express and Node do not provide all the tasty recipes for modern web development without extension. However, the Node universe offers many good, self-contained but combinable libraries that should be used. Compared to frameworks such as Spring Boot, not every required tool is ready for use, but has to be found, evaluated and configured with some effort. This gives you more flexibility and control over how the web application is developed and what tools and dependencies you choose.

In the next article of this series you will find more recipes for solving web application challenges: How to deal with security, hardening and rate limiting? Topics such as configuration management and secrets, logging and metrics are also covered. Gradually, the applications will be built on principles such as Self-Contained Systems and Twelve-Factor Apps.