Welcome to Streamr’s core dev team update for March 2020. My apologies for the delayed post, but I hope the content will make up for the wait.

For the month of March, the team focused on finalising the specs for the upcoming Data Unions Framework public launch. We also kicked off the much-anticipated token economics workshop, assisted by the BlockScience team, to lay the foundation for what will be a multi-stage process. Lastly, development on the Network front continues steadily, and we began a new phase of Streamr Network benchmark testing using AWS instances deployed across the globe (the goal is to run the full experiment with 2048 nodes, as last time).

On the Data Unions front, those of you who have been following the Swash team's progress might have noticed they just went through a minor migration of their on-chain DU smart contract. As part of the migration, all community earnings accumulated until that point and tied to the old contract (minus already withdrawn amounts) were sent to each user's wallet, at no gas cost to them. The new version of the DU smart contract has undergone a full security audit by a specialised third party, and is thus ready for the upcoming public launch. On the DU server side, as per last month's announcement, we have implemented a series of optimisations and load balancing to handle a higher volume of user interactions, namely join requests and balance queries. Pilots like MyDiem and others building on top of the DU Framework will all benefit from a more stable release from the get-go.
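The payout rule used in the migration can be sketched in a few lines. This is a hypothetical illustration of the "accumulated earnings minus already-withdrawn amounts" accounting described above, not the actual Data Union contract code; the function name and plain-integer balances are assumptions for clarity.

```javascript
// Hypothetical sketch of the migration payout rule -- not the actual
// Data Union smart contract code. Balances are plain integers
// (think token wei) to keep the example simple.

function migrationPayout(totalEarnings, alreadyWithdrawn) {
  // Each member receives everything they earned via the old contract,
  // minus whatever they had already withdrawn from it.
  const owed = totalEarnings - alreadyWithdrawn;
  if (owed < 0) throw new Error("withdrawn amount exceeds earnings");
  return owed;
}

// Example: a member earned 100 tokens and withdrew 40 before the migration.
console.log(migrationPayout(100, 40)); // 60 tokens sent to their wallet
```

The point of the rule is that the migration is settlement-neutral: nobody gains or loses by the move, and withdrawals already made against the old contract are never paid twice.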

How did the two-day, fully immersive workshop on token economics go? I have to admit it was intense and worthwhile. We discussed many topics that could affect the rollout of Streamr token economics in the near future:

How to mitigate the duality of the token as a medium of exchange vs a vehicle for appreciation

How to derive a unit of accounting for resource usage

How to decide who pays for the usage, which segments of the data transport path to consider and which settlement frequency to use

How to prevent Network attacks and abuse by making them expensive

How to maximise value capture for the token layer, in addition to value creation

This first-phase kick-off was aimed at taking a deeper dive into all components of the Streamr ecosystem, documenting the roles played by each entity, and creating a taxonomy and abstractions for each of them. The goal was to enable the development of simulation models for testing different incentive mechanism designs in later phases. During the workshop we also discussed the different types of tokenomics models deployed across Web3 projects, and analysed the pros and cons of each possible variation with respect to the Streamr ecosystem. Once the current documentation phase ends, we will probably start working on the base components needed to run a simulation model.

On the Network front, we had a few parallel tracks running last month. For the last phase of Corea Network benchmark testing, we moved away from the CORE network emulator we had been using to analyse Streamr Network performance at large scale (around 2048 nodes at full range). Instead, we decided to use AWS instances to measure real-world performance metrics, taking advantage of data centres distributed across the globe. While this should give us a more realistic picture of how the Network would perform once released to the general public, there is also a real, and not insignificant, cost to running so many nodes. For that reason, our aim in the last few weeks has been to optimise the process as much as possible and lower the cost of deploying and measuring 2048 nodes at once.

The Network core dev team continued working on the WebRTC implementation, which should bring native P2P functionality and eventually enable clients to take on a dual role as network nodes, further increasing the Streamr Network's resiliency. In the past weeks, we have also dedicated resources to fixing some critical bugs that would have caused bigger issues down the road if left unsolved. More specifically, we noticed that Network users (developers) were experiencing noticeable delays when requesting historical data re-sends from a stream. At a high level, the issue was caused by message reordering logic that waited for gaps to be filled, combined with message queues growing due to those wait times.
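The reordering behaviour behind those delays can be sketched as follows. This is a simplified, hypothetical illustration, not the Streamr client's actual implementation: messages carry sequence numbers, out-of-order arrivals are buffered until the gap is filled, and it is exactly those open-ended waits that can let queues grow.

```javascript
// Simplified illustration of gap-waiting message reordering (hypothetical,
// not the actual Streamr client code). Messages are delivered strictly in
// sequence order; an out-of-order arrival is buffered until the missing
// messages appear -- which is where unbounded waits can pile up into the
// long queues described above.

class ReorderBuffer {
  constructor(onDeliver) {
    this.expected = 0;          // next sequence number to deliver
    this.buffered = new Map();  // seq -> message, held while a gap exists
    this.onDeliver = onDeliver;
  }

  receive(seq, msg) {
    this.buffered.set(seq, msg);
    // Deliver as long a run of consecutive messages as we currently have.
    while (this.buffered.has(this.expected)) {
      this.onDeliver(this.buffered.get(this.expected));
      this.buffered.delete(this.expected);
      this.expected++;
    }
    // Anything still buffered is waiting on a gap; a real implementation
    // would also request a re-send and/or bound the wait with a timeout.
  }
}

// Messages 0 and 2 arrive, then the gap (1) is filled:
const delivered = [];
const buf = new ReorderBuffer(m => delivered.push(m));
buf.receive(0, "a"); // delivered immediately
buf.receive(2, "c"); // buffered: waiting for 1
buf.receive(1, "b"); // fills the gap, so "b" then "c" are delivered
console.log(delivered); // ["a", "b", "c"]
```

The sketch shows why the fix matters: without a bound on how long a gap may stay open, every buffered message behind it sits in the queue, and re-send requests compound the backlog.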

Before I sign off for this month, I have a request for all you readers. One of our community members just posted about their new ride-sharing startup, which will be launching in New York City sometime soon. They have already onboarded 750 drivers and are looking for ideas or feedback on potential data monetisation strategies they could leverage, so they can build an open data ecosystem and rightfully share revenue with their drivers. If you have ideas to contribute, any personal or industry insights, know someone who knows this field… or even know someone who knows someone!… your help would surely be appreciated. You can follow the discussion at our Dev Forum.

As always, thanks for reading.

Network

Started focusing on network experiments on AWS

Debugging delays in message re-sends when requesting historical data. Last week, tested and checked all the places where re-sends could fail.

Reviewed key exchange PR for the Network encryption

Continued implementation on WebRTC

Finished key exchange by publishing new JS client to npm. Will do end-to-end testing on that. Then move on to JS implementation and documentation.

Gathering metrics in emulator

Updated permission management

Data Unions

Added version to smart contract

Each server will only run smart contract versions matching its own version

Tweaked UI flags for DU creation

Working on publish dialog, adding some tests and implementing new design

Last week fixed issues with BlockScout

Working on Swash server and smart contract migration to latest stable version

Core app (Engine, Editor, Marketplace, Website)