Even as interest in applying blockchain technology to financial services intensifies, a basic issue must be overcome: Given the highly sensitive information banks handle—customer accounts, funds, identifying information—how can they safely test new blockchain and associated distributed ledger applications?

A Germany-based consulting company—with extensive dealings in the U.S.—has demonstrated that such blockchain testing, at scale, is possible. In September, GFT announced that it had worked with a blockchain application owned by Royal Bank of Scotland to create a test environment using real-world volumes but simulated accounts.

“They asked us to test [their app] for scalability and to help understand how efficiencies could be gained in the underlying distributed ledger technology in order to achieve domestic and international payments,” says Nick Weisfeld, head of GFT’s blockchain and data practices section, in an interview with Banking Exchange.

“The most popular distributed ledger technology out there is Ethereum,” says Weisfeld. “But it is not really built for high-volume payment solutions; it is built for distributed, low-volume solutions. Our role was to try to make it work for high-volume payment transactions.”

Details of the testing

Royal Bank of Scotland issued a paper, “Proving Ethereum For The Clearing Use Case.” In the paper, the technicians involved in the test, which was code-named “Emerald” and built on the Ethereum network code, reported a throughput of 100 payments per second across six simulated banks.

A single transaction trip took a mean time of three seconds and a maximum time of eight seconds, which the technicians said “is the level appropriate for a national level domestic payments system.”
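As a rough illustration of how metrics like these are derived, the sketch below computes mean latency, maximum latency, and throughput from matched send/confirm timestamps. All function names, timings, and the five-transaction example are hypothetical; the actual Emerald test harness is not described in detail in the RBS paper.

```python
from statistics import mean

def latency_stats(send_times, confirm_times):
    """Return (mean, max) per-transaction latency in seconds.

    send_times[i] and confirm_times[i] are the send and confirmation
    timestamps of the same transaction.
    """
    latencies = [c - s for s, c in zip(send_times, confirm_times)]
    return mean(latencies), max(latencies)

def throughput(confirmed_count, window_seconds):
    """Payments per second over the observation window."""
    return confirmed_count / window_seconds

# Illustrative data: five simulated transactions.
sends    = [0.0, 1.0, 2.0, 3.0, 4.0]
confirms = [1.0, 3.0, 5.0, 4.0, 12.0]

mean_lat, max_lat = latency_stats(sends, confirms)
print(mean_lat, max_lat)  # 3.0 8.0 — mean three seconds, max eight seconds
print(throughput(len(confirms), 10.0))
```

Measuring both the mean and the maximum matters here: a payments system can show an acceptable average while individual transactions occasionally stall, which is why the RBS technicians reported both figures.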

To facilitate the testing, GFT partnered with Google Cloud Platform to simulate real-world distributed ledger models within a globally distributed, scalable test environment. Google's Bigtable and BigQuery tools supported analysis of the test results.

Now, hold for a reality check

At least one independent analyst was cautiously skeptical.

Gareth Lodge, an analyst with Celent, writes in a recent blog post, “The Evolving ACH Landscape,” that while this may sound impressive on its face, real-world processing would likely require up to 700 payments a second on a regular day, and more than 4,000 payments a second on peak days—roughly seven times and forty times, respectively, the 100-payments-per-second throughput demonstrated in the test.

Still, Lodge says, “compared to as little as 18 months ago, the conversation has shifted noticeably from could it replace to should it replace [payments systems], signifying the very real possibility that it will happen in the near future.” (Emphasis added.)

Weisfeld makes no claim that what’s happened so far is conclusive.

“A number of things need to be sorted out and solved in the current distributed ledger technologies to make them usable in a production environment,” says Weisfeld.

“Scalability is one of them—the ability to process a large number of transactions,” Weisfeld continues. “Things like security, reliability, and other nonfunctional requirements need to be addressed to make those solutions work. What’s next is to drive out the standards and protocols that are required to ensure that those nonfunctional requirements are addressed in a uniform way.”

In the meantime, testing continues. GFT’s Weisfeld would not name specific U.S. banks that might seek his company’s testing services. However, he acknowledged talking with R3, the bank-owned consortium focused on blockchain development, many of whose member banks are U.S.-based.

“U.S. banks are involved in this,” Weisfeld allowed.

Quorum of one

In an unrelated development, J.P. Morgan Chase announced recently that it is working on tests of blockchain systems built off the Ethereum network code. Its widely reported test, code-named Quorum, focuses on boosting system security. Quorum was built in partnership with a company called EthLab.

Both Royal Bank of Scotland and J.P. Morgan have said they intend to open source their projects.
