With the imminent releases of Alphanet & 0box, let's take a look at the potential of 0chain as a distributed alternative to cloud computing. This article focuses on enterprise use of 0chain technology.

Conventional cloud storage & computing brief

Having remote (rather than local) storage and/or computing resources can have many benefits. These include:

Efficiency (purpose-designed hardware)

Accountability (categorized control and reporting)

Reliability & Maintenance (reduction of in-house resources and training)

Scalability (additional resources on demand)

Ease of access (from anywhere)

Housed in huge dedicated data centers, cloud providers rely on a massive concentration of the elements of storage, bandwidth and power. Hardware is invariably uniform for reasons of predictability and ease of maintenance. Single points of failure are avoided as much as possible through redundancy (spare capacity) for each element, but as recent incidents show, there is still reason for concern with such concentration of data and processing power.

https://www.theregister.co.uk/2018/09/17/azure_outage_report/

https://www.theregister.co.uk/2017/03/01/aws_s3_outage/

Distributed storage & computing brief

As the title suggests, distributed resources are decentralized: the nodes can be spread throughout a wide region, even globally.

These distributed networks can be made very resilient to failure, either by simple duplication of resources or more advanced techniques.

Distributed Computing vs Cloud Computing

So let's see how distributed computing compares with cloud computing on the same points raised above:

Efficiency - While cloud computing is generally deemed very efficient compared to local resources (think hundreds of dedicated cloud computers in place of thousands of local computers) it still has to reserve sufficient dedicated resources for peak demand.

A well-designed distributed solution should aim to have minimal overhead for managing the distributed element and, if so, has the potential to also be very efficient. Distributed computing/storage also creates an opportunity to share 'spare' or currently unused resources (e.g. 1 TB of a 2 TB drive could be used for distributed purposes until the second terabyte is needed, or spare processing power could be used when available), and so it even has the potential to be more efficient!

Accountability - A blockchain-based distributed solution should offer excellent accountability: the very nature of a blockchain is immutability of transactions, with each transaction automatically given a unique hash ID and timestamp! Of course, there is still the challenge for distributed systems to make this accessible while respecting privacy and security.
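The immutability argument rests on content hashing: a transaction's ID is a cryptographic digest of its contents, so any tampering immediately changes the ID. A minimal illustration (the field names here are hypothetical, not 0chain's actual transaction schema):

```python
import hashlib
import json

def tx_hash(tx: dict) -> str:
    """Deterministic SHA-256 digest of a transaction's canonical JSON."""
    canonical = json.dumps(tx, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# Illustrative transaction only -- not 0chain's real schema.
tx = {"from": "client-A", "to": "blobber-7", "tokens": 10, "ts": 1546300800}
tx_id = tx_hash(tx)

# Any alteration of the contents yields a different ID, which is what
# makes a blockchain ledger auditable.
tampered = dict(tx, tokens=1000)
assert tx_hash(tampered) != tx_id
```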

Reliability & Maintenance - Despite the news articles referenced earlier, cloud computing is probably a lot more reliable than trying to manage everything in-house. In case of failures, downtime would likely be much less than for a company without dedicated personnel and resources (of course, cloud datacenter failures can affect many thousands of clients, but this is a comparison with in-house). However, with distributed resources, other nodes can kick in in the event of failure, and with the right blockchain technology, this could be painless or even transparent for the client. (The erasure-encoding available in the storage protocol is a good example of how several points of failure can be tolerated and managed in a distributed system.)
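The erasure-encoding idea can be sketched with the simplest possible code: split the data into k chunks and add one XOR parity chunk, so any single lost chunk can be rebuilt from the survivors. This is a toy illustration only; real schemes such as Reed-Solomon (and whatever 0chain's storage protocol actually specifies) tolerate multiple simultaneous failures.

```python
def encode(data: bytes, k: int) -> list:
    """Split data into k equal chunks plus one XOR parity chunk."""
    size = -(-len(data) // k)                  # ceiling division
    padded = data.ljust(size * k, b"\0")       # pad to a chunk boundary
    chunks = [padded[i * size:(i + 1) * size] for i in range(k)]
    parity = chunks[0]
    for c in chunks[1:]:
        parity = bytes(x ^ y for x, y in zip(parity, c))
    return chunks + [parity]

def recover(shards: list, missing: int) -> bytes:
    """Rebuild one missing shard by XOR-ing all the surviving ones."""
    survivors = [s for i, s in enumerate(shards) if i != missing]
    rebuilt = survivors[0]
    for s in survivors[1:]:
        rebuilt = bytes(x ^ y for x, y in zip(rebuilt, s))
    return rebuilt

shards = encode(b"distributed storage demo", k=4)
lost = 2                                       # pretend one blobber went offline
assert recover(shards, lost) == shards[lost]   # data survives the failure
```

Because the parity shard is the XOR of all data shards, XOR-ing every surviving shard cancels out everything except the missing one.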

Scalability — While storage can quite easily be scaled up on demand in a distributed environment, computing power can be a bit more tricky. This is certainly one of the biggest challenges with distributed computing. Specifically, applications need to be designed so that they are split into chunks that can be effectively managed in a distributed environment. If these 'chunks' can be varied in scale and resource requirements, then there is the opportunity to utilize potentially cheaper 'spare' resources.

Ease of access — Cloud computing is well established, and its APIs are comprehensive and mature. Again, this is going to be a big challenge for distributed computing to catch up on. However, some of the usual complications may be reduced substantially: if your distributed system can be trusted to be completely secure and reliable, those are two areas of development effort (security and error handling) that can be reduced in one go.

How does 0chain measure up as a distributed computing provider?

Efficiency -

i) Storage — The dedicated blobbers employed for storage provision are separate from the mining activity and are able to interface directly with clients. Combined with the nature of the token staking system, this effectively encourages self-monitoring and self-policing to avoid penalties, with minimal interaction (challenges etc.) from miners. The result is a very efficient system. In effect, the 0chain storage system (nicknamed BOSS) is the first dApp on the platform.

ii) dApps — With a fast, efficient blockchain foundation, the opportunity for efficient dApps looks very promising. Unlike dApps on some platforms that are interpreted on-chain, 0chain dApps will be compiled native applications, interfaced to the chain through SDKs/APIs. These applications effectively run at full speed; however, special consideration has to be given on a distributed platform, as the applications have to be made accountable to 0chain, typically in the form of a smart contract. Not all applications are suitable for distributed computing, and some may need reworking to fit a distributed approach. A similar 'leap' was needed to convert from local computing to cloud computing, and this transition is in fact a huge opportunity for developers.

So if the dApp platform receives the same careful design as the storage platform, it too has the potential to be very efficient.

Accountability - 0chain, being a blockchain, naturally offers excellent accountability, and in fact this has been one of the aspects important to the team from the outset. Naturally, dApp developers will need to leverage this blockchain accountability where required.

Reliability & Maintenance - By utilizing the staking mechanism, 0chain ingeniously puts the onus for data integrity and performance on the miners and blobbers themselves. In effect, miners and blobbers will take it upon themselves to be squeaky clean, with no downtime or data loss, to avoid risking part of their stake or missing out on their mining rewards! The storage protocol allows for multiple points of failure with its erasure-encoding, and dApps will need to be developed in a way that embraces the distributed nature.

Scalability - With 0chain token economics, supply and demand will dictate the amount of stake required for storage or computing power, and this can fluctuate. If the token value rises, mining becomes more lucrative, so there should never be a shortage of supply, although a huge upsurge in demand may take a little time for sufficient new miners/blobbers to come on board. From the client's perspective, there should be some spare capacity on the network for short-term increases in demand, especially if dApps can be developed to utilize variable chunks as mentioned earlier. To meet short-term demand, there is also the option to subsidize the miners with some of the locked tokens if the client does not wish to purchase additional tokens for the miners to earn the required interest for their service.

Ease of Access - APIs for transaction verification were released early in the testnet, but we will have to wait a while after mainnet before the dApp platform, smart contracts, and extensive SDKs/APIs are added.

In the meantime, we can gauge the success of the 0box storage platform and wallet technology and study the details given in the various whitepapers. In that way we can confirm that the team is able to deliver functioning wallet and storage products on an ultra-fast blockchain as promised, and therefore be confident that they can continue to hit the ongoing milestones required to fully exploit the platform.

What are the requirements for distributed computing?

Typically, the hardware required is much less specific than for a cloud hosting provider (though a minimum standard must be met), and diversity of resources could even be beneficial.

So distributed computing is not just the domain of the huge data center, and it is therefore an opportunity for enthusiasts to partake.

In the case of 0chain, with its super-fast block time, there is a strong reliance on fast, dedicated bandwidth, so it may be unsuitable for someone to contribute via a typical home broadband connection. More details are to be published soon.

Conclusion

I do believe that 0chain is well placed to be a distributed computing platform in addition to storage. I have been impressed with the sensible and necessary changes made to the protocols and whitepapers as the platform has matured, and of course with the testnet, which counted over 6 billion transactions in a matter of weeks.

All this on a zero-cost blockchain? 2019 promises to be a defining year for 0chain.

About the Author

(I am a blockchain enthusiast, fairly active in the 0chain Telegram with ambassador status. My comments and views are not necessarily those of the 0chain team.)

Reference

www.0chain.net

t.me/Ochain