Microservices have become a mainstream approach for building scalable and robust cloud applications in Node.js. At the same time, there are still some barriers to entry, a few of which involve making decisions about:

organizing the project structure.

connecting custom services to third-party ones (databases, message brokers, etc.).

dealing with the code shared among the microservices.

containerizing the project.

running and debugging the stack locally and deploying it to the cloud.

An out-of-the-box solution to all this is the SMF framework:

https://github.com/krawa76/smf

Let’s see how it can help create and deploy a prototype of the microservice stack without writing a single line of code.

Create project

Install the framework, create a new project and cd to the project directory:

$ npm install -g sokyra-microservice-factory

$ smf new test-stack

$ cd test-stack

Boilerplate code with a demo service is generated, and we can run the project right away:

$ smf up

This generates Docker artifacts (docker-compose & environment variables files), builds the image and runs the container locally:

(screenshot: docker-compose log)

If we open the project in an editor, we’ll see the auto-generated demo service with a main.ts module, which produced the log entries above. The other important files are smf-stack.json (the project config), smf-env.json (container environment variables), a generic Dockerfile, and smf-docker.yml (the docker-compose file):
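The generated smf-docker.yml is a regular docker-compose file. As a rough illustration (this is a sketch, not the exact file SMF generates — the service name and keys here are only indicative), such a file looks something like:

```yaml
# Illustrative sketch of a docker-compose file — not SMF's exact output.
version: "3"
services:
  demo:
    build: .            # built from the generic Dockerfile
    env_file: smf-env.json   # environment variables per container
```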

(screenshot: demo service, the main module)

To stop the project, run

$ smf down

Add new service

Let’s add a service that sends/receives messages via a message broker and saves something to a database:

$ smf add service service1

Select the Basic worker template, then the RabbitMQ and MongoDB services, then enter 0 to exit the menu:

This creates a new subfolder for the service with some boilerplate code in its main module:

Let’s run the project again to see it in action:

$ smf up

Now we have four containers running: RabbitMQ, MongoDB, demo, and service1. The latter sends/receives messages via RabbitMQ and saves fake records to MongoDB:
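To get a feel for what such a worker does, here is a minimal TypeScript sketch. It is not the actual SMF-generated code: the broker publish and database insert are stubbed in comments so the snippet stays self-contained, and all names are illustrative.

```typescript
// Sketch of a worker's job (illustrative, not the SMF boilerplate).
// In the real service, publishing would go through the RabbitMQ client
// (e.g. channel.publish) and saving through the MongoDB driver
// (e.g. collection.insertOne); here those calls are stubbed.

interface FakeRecord {
  id: number;
  createdAt: string;
  payload: string;
}

// Build the fake record the worker would insert into MongoDB.
function makeFakeRecord(id: number): FakeRecord {
  return { id, createdAt: new Date().toISOString(), payload: `fake-${id}` };
}

// Build the message the worker would publish via RabbitMQ.
function makeMessage(record: FakeRecord): string {
  return JSON.stringify({ type: "record-created", record });
}

const record = makeFakeRecord(1);
console.log("publishing:", makeMessage(record));
```

The real generated code wires these steps to the RabbitMQ and MongoDB containers started by smf up; the sketch only shows the shape of the data flowing between them.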

(screenshot: docker-compose log)

We can again stop the project using the smf down command.

We can add more services in the same way; if they select the same message broker service, they will all exchange messages via the message hub.

Deploy

It’s easy to deploy our project to a remote server that has Docker and Docker Compose installed. If you don’t have one yet, you can create one on Amazon AWS EC2 by following these simple instructions:

https://github.com/krawa76/smf/blob/master/README-provisioner.md

A Docker Hub account is also required; you can sign up for free here:

https://hub.docker.com/

Open the smf-deploy.json file in an editor and fill in your Docker Hub login/password, the host address, and the remote machine’s SSH credentials (the SSH key path).
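Based on the fields listed above, the file might look something like the following sketch (the exact key names come from the generated file; the ones here are hypothetical placeholders):

```json
{
  "dockerHub": { "login": "your-login", "password": "your-password" },
  "host": "ec2-x-x-x-x.compute-1.amazonaws.com",
  "ssh": { "user": "ubuntu", "keyPath": "/Users/me/.ssh/aws-key.pem" }
}
```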

Run this command to deploy the project:

$ smf deploy

When the process ends, we can ssh to the remote machine and see our microservices running there:

$ ssh -i "/Users/me/.ssh/aws-key.pem" ubuntu@ec2-x-x-x-x.compute1.amazonaws.com

$ docker ps

(lists the running services)

$ docker logs -f test-stack-service1

(streams the live log)

Now we have a working prototype of our containerized microservice stack in the cloud.

What’s next?

Start adding more logic. Since every service is a separate NPM package, we can cd to the service folder, install extra packages, write more code in the main.ts module, add new JavaScript modules, etc.:

$ cd services/service1

$ npm install ...
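For example, we could add a small module next to main.ts and import it from there. The file and function names below are hypothetical — just a sketch of extending a service:

```typescript
// services/service1/greeting.ts — a hypothetical extra module.
// main.ts could then use it with: import { greeting } from "./greeting";

export function greeting(serviceName: string): string {
  return `Hello from ${serviceName}!`;
}
```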

More info about debugging, using shared modules, back-end/front-end demos, etc. is available in the project README: https://github.com/krawa76/smf

Happy coding!