Funny Story, NPM Doesn’t Provide an Immediately Obvious Way to Run Multiple Scripts at Once

We’ve all been there — you open up an app for local development and you need to run your Express/Node.js backend server and your React UI server simultaneously, but in order to do so you have to open up two (or more) terminal windows, cd into two different directories (each with its own package.json) and run two different start commands.

Sound familiar? If only there were a better way…

The Solution(s)

Quite simply, there is a better way. And like all good things in web development today, there are a thousand different ways to achieve the same end.

Today, I will share four ways to run multiple Node.js commands or NPM scripts at once.

Ready? Ok, let’s run through them.

Option 1: Bash && Chained Commands

Good old Bash: you can’t beat the command line.

This solution is the most straightforward and requires no extra NPM packages or other software — it is literally just the command line shell.

Fun fact: the "scripts" in a package.json file are actually just terminal commands that are run in the OS’s shell (like Bash). So by using standard shell syntax, you can chain commands together, and NPM will run the combined value when you call its key — see my example below.

Inside your NPM script, in the "start" command, or whatever you want to use to run your servers (provided they’re in the same repo, of course), just chain together your two start scripts like so.

"scripts": {

"start": "react-scripts start",

"dev": " (cd server && npm run start) & npm run start"

}

The way this works: at the root level of the project is my React app (the react-scripts start command executes in the shell when you type npm run start in the terminal). One level inside of that is a folder called server, which holds the Node.js server used to proxy calls to all the backend microservices; it starts up with the same command at its root level, thanks to its own package.json.

Here’s a very dumbed down diagram of the project structure, for reference:

root/
├── package.json
├── server/
│   ├── package.json

As you can see, both levels of the project have separate package.json files with their own dependencies and individual NPM start scripts, but by simply chaining the two together with a single & between them, both Node commands run at once: the server command is sent to the background while the React command runs in the foreground.

A note on the operators: a single & puts the command on its left into the background, which is exactly what lets both servers run simultaneously. The double && instead means the command on the left must exit successfully before the command on the right runs (in Bash, both sides of && must evaluate to true). That’s great for short-lived steps like a clean or a build, but a long-running server process never exits, so chaining two servers with && would leave the second one waiting forever.
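To see the difference between the two operators in isolation, here’s a minimal sketch using stand-in functions instead of real npm scripts (the function names and timings are my own assumptions, purely for illustration):

```shell
# Hypothetical stand-ins for the two start commands, so the behavior
# of the shell operators is easy to observe:
start_server() { sleep 1; echo "server up"; }
start_ui() { echo "ui up"; }

# '&&' is sequential: start_ui runs only after start_server exits successfully.
start_server && start_ui     # prints "server up", then "ui up"

# '&' is parallel: start_server is backgrounded, start_ui runs immediately.
start_server & start_ui      # "ui up" appears before "server up"
wait                         # wait for the backgrounded job to finish
```

With real servers, the && version would hang on the first command forever, which is why the package.json example above uses a single &.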

That’s the quickest, easiest, most built-in way to run multiple Node.js commands in different file locations, at once.

Now, I’ll move on to a few NPM packages that make the process even simpler, and require no knowledge of Bash.

Option 2: Concurrently


The first NPM package I’ll introduce you to is called Concurrently. Its name is pretty self-explanatory: it runs multiple commands concurrently. 😝

It’s not as robust, in terms of customization, as my next NPM package recommendation, NPM-Run-All, but I think the majority of the time (hopefully) all the extra configuration is unnecessary.

After running npm i concurrently to install it, you can then set up your NPM start script to run multiple commands by wrapping each individual command in quotes.

So in a package.json file, your "scripts" section might look something like this (note the need for escaped quotes here):

"scripts": {

"start": "react-scripts start",

"dev": ""concurrently \"cd server && npm run start\" \"npm run start\""

}

And once again, you should be off to the races. It’s also worth noting that you can run these same kinds of commands, with quotes surrounding each argument, straight from the command line once Concurrently has been installed globally.

I also recommend checking out the documentation for more cool tricks you can do with it, like shortening commands, supporting wildcards, etc.

Option 3: NPM-Run-All


Moving on, this package is another popular option from NPM, called NPM-Run-All.

The NPM page describes npm-run-all as “A CLI tool to run multiple npm-scripts in parallel or sequential.”

It is a similar concept to how Concurrently works, but the syntax is slightly different, and npm-run-all touts how it can shorten a very long, single start command like:

npm run clean && npm run build:css && npm run build:js && npm run build:html

into:

npm-run-all clean build:*
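For context, a script chain like that implies a package.json along these lines — the specific build tools shown here are my own illustrative assumptions, not taken from the npm-run-all docs:

```json
"scripts": {
  "clean": "rimraf dist",
  "build:css": "node-sass src/styles -o dist",
  "build:js": "babel src -o dist",
  "build:html": "html-minifier src/index.html -o dist/index.html"
}
```

The build:* glob expands to every script whose name starts with build:, run sequentially by default.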

The npm-run-all CLI can be installed via NPM or Yarn (npm install npm-run-all --save-dev), and once installed, it offers three different commands, based on your needs:

npm-run-all (the main command, which has documentation on all of the flags and additions you can pass in via the command line)

run-s (run sequential — for when you need one command to finish before the next one starts)

run-p (run parallel — like when both the UI and server pieces of the application need to run side by side).
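To make that concrete, here’s a sketch of how run-p could stand in for the bash & example from earlier — the script names here are my own assumptions:

```json
"scripts": {
  "start": "react-scripts start",
  "start:server": "cd server && npm run start",
  "dev": "run-p start:server start"
}
```

Running npm run dev would then start the server and the React UI side by side; swapping in run-s would run them one after the other instead.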

If, for example, the package.json scripts looked like the following:

{
  "scripts": {
    "clean": "rimraf dist",
    "lint": "eslint src",
    "build": "babel src -o lib"
  }
}

With npm-run-all, that could simply become: npm-run-all clean lint build

Things can also get much fancier, with combinations of parallel and sequential runs together, depending on what your needs are. See the documentation for more details.

Pretty easy, right?

And now, on to the last option for running multiple NPM commands at once, Docker.

Option 4: Docker-Compose

The final solution, and a little out in left field: Docker.

Docker and Docker-Compose are whole other articles, which I’ve covered in detail, here and here on Medium. I’d recommend checking both of those out if you’re unfamiliar with the virtual containerization platform that is Docker. It’s an extremely powerful and effective tool, when used properly.

For this article, I’ll keep the Docker conversation focused specifically on the Dockerfile and NOT the docker-compose.yml , which is another piece of the Docker puzzle.

The Dockerfile provides all the instructions and commands a user could call on the command line to assemble a Docker image. Essentially, it defines an app’s environment so it can be reproduced anywhere.

The docker-compose.yml defines the services that make up the app, so they can be run together in an isolated environment. That’s a separate piece unrelated to the focus of this article.

Note: If you’d like a more in-depth explanation of using docker-compose to improve your application development in lower life cycles and even production, please see this article I wrote on the subject.

You can also see my MERN project repo, which uses docker-compose to spin itself up, complete with a MySQL database instance running locally.

In terms of the Dockerfile, though, after you’ve installed Docker on your machine, it’s very simple to write for a fully JavaScript project.

Here’s the file structure for the React UI (client/ folder) and Node.js backend (server/ folder), for reference:

root/
├── server/
├── client/
├── docker/
├── docker-compose.yml
├── Dockerfile

And here’s the contents of the whole Dockerfile.

Dockerfile

FROM node:9
WORKDIR /app
CMD ls -ltr && npm install && npm start

Here’s what’s happening in the lines above:

Download a version of Node.js from Docker Hub,

Define the working directory for each app (both go at the root of their respective containers as /app),

Install all the dependencies in each using their own package.json files, and once the dependencies are downloaded, start the apps (both have an npm start command in their "scripts").

That’s it. That’s all that’s needed, because the docker-compose.yml tells the multiple applications (server/ and client/) how to structure themselves: where to mount needed volumes, which ports to open to the outside environment, which Docker images to get for things like databases, etc.
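Purely as an illustrative sketch (the service names, ports, and volume paths here are assumptions, not taken from my actual repo), a docker-compose.yml for the structure above might look like:

```yaml
version: "3"
services:
  server:
    build: .               # built from the Dockerfile shown above
    volumes:
      - ./server:/app      # mount the Node.js backend at the container's /app
    ports:
      - "5000:5000"        # hypothetical API port
  client:
    build: .
    volumes:
      - ./client:/app      # mount the React UI at the container's /app
    ports:
      - "3000:3000"        # default create-react-app port
    depends_on:
      - server
```

With something like that in place, a single docker-compose up starts both containers at once.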

I’d recommend checking out my Docker-powered project repo to see more examples of what I’ve described above.

It’s a little bit of a different tack for running multiple NPM scripts at once than the other solutions, with what is arguably a bit more initial overhead, but it works, and if you’re familiar with Docker’s many benefits, it’s a pretty sweet way to run your projects.

Conclusion

Whatever solution you choose, this is a problem that every JavaScript developer runs into at some point or another. You need a config file to run before your server starts, you need two servers running side by side to handle UI views and API calls, you need watchers to keep running while you make file changes; there are a million reasons. Regardless of the exact scenario, eventually there will come a time when it will be beneficial to run multiple Node.js commands or NPM scripts at the same time.

You could do this manually with multiple open terminal instances, or you could use one of the solutions I suggest above. One is pure shell scripting, two are popular, stable NPM packages, and one is the further-removed but extremely powerful solution of using Docker to containerize and run multiple apps.

Check back in a few weeks; I’ll be writing about React or something else related to web development, so please follow me so you don’t miss out.

Thanks for reading, I hope this gives you some new ideas on how to approach running your own simultaneous NPM scripts when the need arises. Please share this with your friends if you found it helpful!
