We are happy to announce that we are releasing the latest version of our ODN network node software 0.8a, Surveyor, today.

Surveyor brings us an improved version of the bidding mechanism and implements the first version of the long-awaited consensus check, which utilizes the zero-knowledge privacy layer we worked on in previous releases.

The team behind OriginTrail has been consistent in delivering on our bi-weekly development roadmap, which is intended to keep us iterating quickly and improving the mechanics of the ODN while it is in the alpha stage. With Surveyor released, we are merely two releases away from the testnet at the end of June.

As always, the new code is available on our GitHub.

More Efficient Bidding on Agreements for Nodes

The initial version of the bidding mechanism (Kosmos release, April 23) was the first to implement the full cycle of operations in node communication around the agreements nodes form over data on the network. Simply put, the Data Creator node (DC), the one introducing new data to the network, forms agreements with Data Holder nodes (DH) to operate on and store data (D) on a particular observed supply chain (S). For a specific data set D, a set of agreements is made between the data provider's DC node and several DH nodes, which include both independent nodes on the network and the associated partner nodes of the data provider entity. In that regard, it is important to understand how a node agreement is formed.

To form the set of agreements (A) associated with one data set D, the DC node of the data provider creates an initial offer (O). This offer contains the parameters set by the DC node such as:

the maximum amount of tokens the DC node is willing to provide as reimbursement per data unit for DH nodes,

the minimum amount of required stake for the agreement to happen,

the amount of time the agreement will last, and

a minimum reputation requirement for the DH nodes.
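The offer parameters above can be sketched as a simple data structure. This is illustrative only; the field names and types are assumptions, not the actual ODN schema:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    """Hypothetical sketch of a DC node's initial offer O."""
    max_token_amount_per_unit: int  # max reimbursement per data unit for DH nodes
    min_stake_amount: int           # minimum stake required for the agreement
    holding_time_minutes: int       # how long the agreement will last
    min_reputation: int             # minimum reputation required of DH nodes

# Example offer a DC node might broadcast to the network.
offer = Offer(
    max_token_amount_per_unit=100,
    min_stake_amount=1000,
    holding_time_minutes=60 * 24 * 30,  # 30 days
    min_reputation=5,
)
```

DH nodes compare their own stake, reputation, and price against these parameters when deciding whether to apply.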

In previous releases containing the initial version of the bidding mechanism, the actual bidding was performed as a type of blind auction: each interested DH node applying for the offer O would send an encrypted bid amount, which would be revealed in a subsequent step to mitigate the risk of nodes undercutting each other. The final list of applicants would then be assigned a set of probabilities according to the parameters with which they applied to the offer, which were then used in a roulette-type random choice function. This system had foreseen downsides: it didn't scale to a large number of DH applicants, and its cumbersome reveal period increased both the complexity and the cost of the mechanism.
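The commit-then-reveal step of the original blind auction can be illustrated with a hash commitment. This is a simplified sketch of the general technique, not the actual on-chain implementation:

```python
import hashlib
import secrets

def commit(bid_amount: int) -> tuple[str, bytes]:
    """Commit phase: publish only the hash of the bid plus a random salt."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + str(bid_amount).encode()).hexdigest()
    return digest, salt

def verify_reveal(digest: str, bid_amount: int, salt: bytes) -> bool:
    """Reveal phase: the node discloses bid and salt; others recompute the hash."""
    return hashlib.sha256(salt + str(bid_amount).encode()).hexdigest() == digest

digest, salt = commit(42)
assert verify_reveal(digest, 42, salt)      # honest reveal checks out
assert not verify_reveal(digest, 43, salt)  # a changed bid is rejected
```

Because the bid is hidden until everyone has committed, no node can undercut the others; the cost is the extra reveal round that Surveyor's new mechanism removes.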

The improved version in Surveyor takes a different approach, allowing a DH node to apply with a pre-revealed bid if the node itself estimates a high probability of being included in the agreement set. The key enabling change is that this probability is determined by the distance function used to rank all DH candidates, which incorporates all the necessary parameters of the offer as well as the address-space distance between the node address and the data content hash. The result is a mechanism with less complexity (no reveal step and no complicated, bounded roulette) and a fair density of data dissemination determined solely by the data itself. Several improvements and tweaks to the new mechanism will follow once we have had enough time to collect observations and draw conclusions about better parametrization.
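One common way to realize such an address-space distance is a Kademlia-style XOR metric between the node identifier and the data content hash. The ODN's actual function also weighs the other offer parameters; this sketch ranks candidates by hash distance alone, and the identifiers are made up for illustration:

```python
import hashlib

def xor_distance(node_id: str, data_hash: str) -> int:
    """Kademlia-style distance between a node address and a data content hash."""
    a = int(hashlib.sha256(node_id.encode()).hexdigest(), 16)
    b = int(hashlib.sha256(data_hash.encode()).hexdigest(), 16)
    return a ^ b

def rank_candidates(node_ids: list[str], data_hash: str) -> list[str]:
    """Closest nodes first; a DH node can estimate its own chance of inclusion."""
    return sorted(node_ids, key=lambda n: xor_distance(n, data_hash))

candidates = ["dh-node-1", "dh-node-2", "dh-node-3"]
print(rank_candidates(candidates, "0xabc123"))
```

Because the ranking depends only on the data hash and public node addresses, every node can compute its own position in advance, which is what makes pre-revealed bids viable.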

Consensus Check on Top of the Zero-Knowledge Privacy Layer Allows for Validating the Observed Supply Chain

The second important feature of Surveyor is the consensus check, built on top of the zero-knowledge implementation introduced in our previous release, Zond. This particular ZK implementation is meant to validate the mass balance (quantities) within the observed supply chain: it verifies that the inputs match the outputs at any arbitrary supply chain location of relevance (provided, of course, that the input and output data are available) without revealing the values of those inputs and outputs.
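The mass-balance idea can be illustrated with additively homomorphic commitments: each party commits to its quantities, and anyone can check that the input commitments combine to the same value as the output commitment without ever seeing the quantities themselves. This is a toy Pedersen-style sketch with deliberately insecure parameters, not the ZK construction used in the ODN:

```python
import secrets

# Toy group parameters -- far too weak for real use, illustration only.
P = (1 << 127) - 1  # the Mersenne prime 2^127 - 1
G, H = 2, 3         # assumed independent generators

def commit(value: int, r: int) -> int:
    """Pedersen-style commitment: hides `value`, binds the committer to it."""
    return (pow(G, value, P) * pow(H, r, P)) % P

# One supply chain step: inputs of 30 + 70 units should equal an output of 100.
r1, r2 = secrets.randbelow(1 << 64), secrets.randbelow(1 << 64)
c_in = (commit(30, r1) * commit(70, r2)) % P  # inputs combine multiplicatively
c_out = commit(100, r1 + r2)                  # output with matching randomness
assert c_in == c_out  # balance verified without revealing 30, 70, or 100
```

The homomorphic property (the product of commitments commits to the sum of values) is what lets a verifier confirm that inputs equal outputs while the quantities stay hidden.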

The consensus check utilizes this functionality while performing other checks as well, essentially making sure that claims along a certain observed supply chain match between all the parties involved, including that all the parties confirm their cooperation in the first place. Consensus, in this case, is reached when all the stakeholder pairs in the supply chain have matching claims and their cumulative values are validated by the ZK algorithm.
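The claim-matching part of the check can be sketched as follows: for every pair of adjacent stakeholders, both sides must report the same transfer. All names and structures here are assumptions for illustration, and the quantities are compared in the clear, whereas in the real system they stay hidden behind the zero-knowledge layer:

```python
def consensus_reached(claims: dict[tuple[str, str], tuple[int, int]]) -> bool:
    """claims maps a (sender, receiver) pair to (sender's claim, receiver's claim).

    Consensus requires every stakeholder pair to report matching quantities;
    the ZK layer would perform this comparison without revealing the values.
    """
    return all(sent == received for sent, received in claims.values())

claims = {
    ("producer", "distributor"): (500, 500),  # both parties agree
    ("distributor", "retailer"): (500, 480),  # mismatch: 20 units unaccounted for
}
print(consensus_reached(claims))  # False -- the mismatched pair breaks consensus
```

A single disagreeing pair is enough to flag the whole observed supply chain, which is exactly the behavior surfaced by the consensus parameters in the trail response.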

To see the consensus check in action, import your data into the system, then request the trail of a certain batch of products via the API route api/trail?uid=urn:epc:id:sgtin:Batch_1. The response will provide the consensus parameters for each event in the product history trail.
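A response from the trail endpoint might then be handled like this. The JSON shape below is a made-up illustration of "consensus parameters per event", not the actual API schema:

```python
import json

# Hypothetical excerpt of a trail response -- the real schema may differ.
response_body = """
[
  {"event": "shipping", "consensus": {"status": 1, "checks_passed": 3}},
  {"event": "receiving", "consensus": {"status": 0, "checks_passed": 1}}
]
"""

trail = json.loads(response_body)
for event in trail:
    status = "reached" if event["consensus"]["status"] == 1 else "not reached"
    print(f'{event["event"]}: consensus {status}')
```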

Two Releases Away from the Testnet Release

As with each release, we try to fix as many of the issues we find as possible and to improve documentation and test coverage. We also refactored and improved the codebase and fixed several unhandled exceptions, mostly found in the test suite.

We improved validation at node startup, checking for all the required parameters, and made error messages more verbose. We resolved some bugs reported by the community and continued writing more integration tests. Furthermore, we improved the API for remote control of the node.

We are super excited to be wrapping up the development of the alpha and proceeding to the long-awaited beta, which will arrive with the OriginTrail testnet. This marks a very important milestone, as we will be able to measure much more reliably what kinds of bottlenecks appear in the system and how to mitigate and resolve them. The testnet period will also be very beneficial for adoption, as interested companies will finally be able to try out the solution in its entirety, provide feedback, and request improvements. Exciting times are coming, and we can't wait to see what use cases emerge beyond the ones we are already familiar with.

Our tech team will answer questions regarding this release in the Reddit AMA on Wednesday, May 23rd. You are welcome to submit your questions here.