Over the last couple of months I've been working on a decentralized finance product for an up-and-coming company building Bitcoin Cash applications. Reliability is very important to us, but I've been having some issues with the free public bitcoin.com REST API.

Initially my thought was to replicate the backend and run it in-house, but it was relatively difficult to set up and manage, which led me to look for alternatives. I wanted something simple and reliable, with as few dependencies and setup costs as possible, and with existing libraries available on NPM so I could integrate easily with the rest of my stack.

Knowing that the Electron Cash wallet has been around for a long time, and seeing that Bitcoin Unlimited is building out an Electrum backend to use with their node software, I decided to look for libraries on NPM. There were more than a handful, and many of them advertised that they were entirely free of dependencies.

I tried a bunch of libraries, but due to various issues nothing really worked. The NPM statistics speak for themselves: most of these libraries see just a handful of downloads per week, their code quality is consistently rated low, and most are old and unmaintained.

The feature sets also vary wildly, with a lot of overlap: most libraries support most of the functionality, but none support all of it, whether encrypted connections, persistent connections, notification subscriptions, versioning support and so on.

The protocol seemed fairly well documented and even though none of the existing libraries fit my needs I had plenty of code to learn from and reference, so I got started.

It took me a while, but now I have a library that's easy to use, encrypted by default, and built from clean, easy-to-read code. Let's go over some examples and showcase how it works.


The library is open source and published to NPM under the electrum-cash name, so you can install it like you would any other NPM library:

npm install electrum-cash

You can read the source code and technical documentation on GitLab.

The simplest setup is to connect to a single server of your choice. While this looks very basic, under the hood you are actually getting a persistent, encrypted network connection with automatic keep-alive messages and built-in version negotiation and enforcement.

```javascript
// Load the electrum library.
const ElectrumClient = require('electrum-cash').Client;

// Wrap the application in an async function to allow use of await/async.
const main = async function()
{
	// Initialize an electrum client.
	const electrum = new ElectrumClient('MyApplication', '1.4.1', 'bch.imaginary.cash');

	// Wait for the client to connect.
	await electrum.connect();

	// Declare an example transaction ID.
	const transactionID = '4db095f34d632a4daf942142c291f1f2abb5ba2e1ccac919d85bdc2f671fb251';

	// Request the full transaction hex for the transaction ID.
	const transactionHex = await electrum.request('blockchain.transaction.get', transactionID);

	// Print out the transaction hex.
	console.log(transactionHex);

	// Close the connection.
	await electrum.disconnect();
};

// Run the application.
main();
```

For professional use, having a single point of failure is not acceptable. Integrating network reliability directly into your application lets you build the best user experience, but it comes at a higher development cost and is prone to difficult errors such as race conditions.

By building these features into the library instead, you get to spend more time building on your application, and less time fighting network based backend problems.

From a usage perspective very little changes: you switch from using a Client to using a Cluster and provide some basic configuration.

With a cluster you can add more than one server; if a server goes down, the library automatically marks it as unavailable and routes your requests to the other servers on your list.

When the server becomes available again, it is automatically re-enabled.

If your application requires consistently short response times, you can add multiple servers, request data from them in parallel and use the first response that arrives. This lets your application run with the latency of the fastest server at all times.

If you're not running your own servers, or your requirements are very stringent, you can configure the cluster to cross-verify multiple server responses before handing them over to your application.

By doing so you ensure that no single server can feed you fraudulent data, but the latency of the response will be longer, since the cluster has to wait for more servers to respond before handing the data to your application.
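If I read the cluster constructor correctly, cross-verification is just a matter of raising the confidence threshold; the fragment below is a sketch of that configuration, with the parameter meanings and server names being my assumptions rather than verified values:

```javascript
// Load the electrum library.
const ElectrumCluster = require('electrum-cash').Cluster;

// Require 2 matching responses from 2 polled servers before a result is accepted.
// (Assumed constructor order: application, protocol version, confidence, distribution.)
const electrum = new ElectrumCluster('MyApplication', '1.4.1', 2, 2);

// Add servers that will cross-verify each other's responses.
electrum.addServer('bch.imaginary.cash');
electrum.addServer('electroncash.de');
```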

Using a single server has the privacy benefit that you're only disclosing information to a single peer. With a cluster, you would instead use a large number of servers and spread your requests across them, making it difficult for any single server to determine the context of your actions.

If you don't have any specific requirements and just want the backend to work and get out of your way, you can strike a balance between reliability and performance that gets you all of the good stuff at once.

Configure the cluster with a decent integrity confidence, requiring at least two servers to be consistent, set it to poll a handful of servers, and then add as many servers as you want as backup. The default strategy for selecting which servers to poll is random, so you automatically get some load balancing as well.