Inspired by Steve David's (then CIO of Procter & Gamble) vision of a "Consumer Driven Supply Network", SAP embarked on a journey to reinvent supply chain management systems back in 2001. The quest to become a leader in Gartner's Magic Quadrant led us to become a founding member of the MIT Auto-ID Center, where Kevin Ashton coined the term "Internet of Things" (IoT). It also led us to collaborate with the brilliant minds of BiosGroup, a spin-off of the Santa Fe Institute founded by the godfather of complexity science, Stuart Kauffman. The meeting that Albrecht Diener (SVP of GBU Supply Chain Management) and I had with Steve David and his supply chain visionaries, including Jake Barr and Franz Dill, was a pivotal moment in my professional life.

Not unlike blockchain technologies today, supply chain optimization was still a nascent field that held great promise but struggled with technology limitations and organizational roadblocks. At a time when companies still struggled to implement centralized supply chain planning systems within their own four walls, we no longer viewed supply chains as static, linear chains, but as dynamic, multi-entity networks that would need to become increasingly open and responsive to external data feeds and real-time events, be it from business partners or emerging sensor networks. Our vision back in 2001 was to blur the lines between the physical world and the world of computer systems, using sensor technologies and software agents to create what we called "Real-World Aware Adaptive Supply Networks". The system we envisioned would need to be inherently event-driven and able to interpret and process the massive amounts of data generated by sensor networks in a decentralized fashion, without human intervention. Could distributed intelligence provide an answer to realizing constantly evolving, "adaptive" supply networks that could make predictions and coordinate local decisions with adjacent nodes? What might we learn from swarm intelligence and the self-organizing principles employed by ant colonies, bee hives or flocks of birds? Where existing supply chain optimization systems were centralized and produced brittle plans unable to respond to real-world events, we envisioned a distributed network of self-interested parties built on autonomous multi-agent systems.

However, we struggled with the question of what the underlying infrastructure of such an inherently distributed computing network might be. How would we manage state across a distributed system? How would we manage identities in what was supposed to be an open network and create trust? Who would carry the cost of hardware in such a shared infrastructure? How could we trust the results of local computations and distributed decisions? How would we manage the life-cycle of distributed, multi-agent systems?

When I first learned about blockchains and the Ethereum platform back in 2015, I was excited about their potential. At the same time, I understood blockchain architectures' inherent scalability limitations and transaction fees to be obstacles to real-world adoption. I also knew that the energy-intensive mining and consensus mechanisms of traditional blockchains would not be a good fit for low-cost IoT devices and intermittent networks.

When IOTA introduced the first blockchain-less, next-generation approach to distributed ledgers, built specifically for the Internet of Things (IoT) and the machine-to-machine (M2M) economy, I was excited about its potential to bring virtually unlimited scalability and fee-less micro transactions to the world. However, important building blocks such as smart contracts were still missing.

Enter Qubic

Qubic introduces not only oracles and smart contracts, but also quorum-based computations, a form of distributed computing. It essentially makes the IOTA Tangle programmable, leveraging IOTA's data transfer protocol and peer-to-peer fee-less micro transactions to create incentives for outsourced computations. In the future, Qubic will tap the unused computation capacity available world-wide to solve all kinds of computational problems.

Oracles

In our pursuit of demand-driven supply networks, we asked ourselves how weather conditions or local events might impact demand in individual stores, and how we could factor local conditions into demand predictions and supply decisions. IOTA and Qubic introduce oracles, which serve as trusted sources of data external to the IOTA Tangle and provide the input for quorum-based computational tasks. Examples of such external data could be temperature and air quality readings, exchange rates or share prices, or personal attributes such as age, marital status or brand preferences.
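The core idea behind quorum-based computation over oracle data can be illustrated with a minimal sketch. This is not the Qubic protocol itself (which is considerably more involved); it is a hypothetical example, assuming several independent oracle nodes report the same external value and we only accept a result once a super-majority agrees:

```python
from collections import Counter

def quorum_result(readings, threshold=2/3):
    """Return the value reported by at least `threshold` of the oracles.

    `readings` is a list of values reported by independent oracle nodes.
    Illustrative sketch only -- not the actual Qubic quorum mechanism.
    """
    if not readings:
        return None
    value, count = Counter(readings).most_common(1)[0]
    return value if count / len(readings) >= threshold else None

# Five oracles report a local store temperature; one is faulty.
print(quorum_result([21, 21, 21, 21, 35]))  # -> 21
print(quorum_result([21, 22, 23]))          # -> None (no 2/3 agreement)
```

The point of the quorum is that no single oracle has to be trusted: a faulty or malicious reading is simply outvoted, which is what makes external data usable as input for automated decisions.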

Smart Contracts

Oracles may also feed smart contracts. It is expected that in the future smart contracts will largely replace real-world paper contracts and drive process automation by eliminating the need for third-party enforcement and validation. Smart contracts could not only model contracts in a distributed network and trigger transfers of ownership and the associated payments, but could also model incentive structures and business policies. However, the distributed computing model and outsourced computations of Qubic open up entirely new possibilities.
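To make the idea of an oracle-fed smart contract concrete, here is a hypothetical sketch of a parametric agreement: a payout is released automatically once an oracle-reported temperature crosses a threshold. The class and method names are purely illustrative, not the Qubic API:

```python
from dataclasses import dataclass

@dataclass
class WeatherContract:
    """Toy parametric contract: releases a fixed payout when an
    oracle-reported temperature crosses a threshold. Illustrative only."""
    buyer: str
    seller: str
    threshold_c: float
    payout: int
    settled: bool = False

    def on_oracle_update(self, temperature_c: float) -> int:
        # Called whenever the oracle publishes a new reading; the payout
        # is released at most once, with no third party involved.
        if not self.settled and temperature_c >= self.threshold_c:
            self.settled = True
            return self.payout
        return 0

contract = WeatherContract("retailer", "supplier", threshold_c=30.0, payout=100)
print(contract.on_oracle_update(25.0))  # -> 0   (condition not met)
print(contract.on_oracle_update(31.5))  # -> 100 (payout released)
print(contract.on_oracle_update(32.0))  # -> 0   (already settled)
```

The enforcement logic lives entirely in code: once the oracle condition is met, settlement happens deterministically, which is exactly what removes the need for third-party validation.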

Qubics

Individual Qubics are in essence pre-packaged quorum-based computational tasks. The Qubic protocol specifies the construction, execution and evolutionary life-cycle of Qubics. It leverages the IOTA protocol for secure, trusted data transfer between the various participants in an open, distributed network. IOTA also has its own built-in payment system, where IOTA tokens are used to pay for computational resources and provide an incentive system for Qubic operators. Qubic thus not only creates a strong incentive for people to run nodes; it will also allow anyone to let others use spare computational power or storage capacity and earn passive income for the service.

Just like we envisioned more than 15 years ago, Qubics are event-driven. They 'listen' for new data and transactions on the IOTA Tangle and are triggered whenever their input data changes. One Qubic can trigger another Qubic to execute based on its output. Essentially, Qubic enables a distributed computing model and distributed artificial intelligence that run on the trusted, validated and immutable data of the IOTA Tangle. A range of computing devices can form assemblies that cluster computational resources around a given task. A novel, function-based programming language called Abra will enable Qubics to run on virtually any hardware, making use of an almost limitless pool of generic computational resources. A variety of consensus mechanisms can be used to establish trust in data and computational results.
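The event-driven chaining described above can be sketched with a toy publish/subscribe model. The `MiniTangle` class below is a stand-in of my own invention, not anything from the IOTA codebase: qubic-like functions subscribe to a data topic and are re-run whenever it changes, and one function's output triggers the next:

```python
class MiniTangle:
    """Toy event bus standing in for the Tangle: functions subscribe to a
    topic and are re-run whenever new data is published. Illustrative only."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, qubic):
        self.subscribers.setdefault(topic, []).append(qubic)

    def publish(self, topic, value):
        for qubic in self.subscribers.get(topic, []):
            qubic(self, value)

tangle = MiniTangle()
results = []

# First qubic-like task: converts a raw sensor reading and publishes it...
def celsius_qubic(tangle, raw_fahrenheit):
    tangle.publish("temperature_c", (raw_fahrenheit - 32) * 5 / 9)

# ...which automatically triggers a second task downstream.
def alert_qubic(tangle, temp_c):
    results.append("ALERT" if temp_c > 30 else "ok")

tangle.subscribe("sensor_raw", celsius_qubic)
tangle.subscribe("temperature_c", alert_qubic)

tangle.publish("sensor_raw", 95.0)  # 35 C -> ALERT
tangle.publish("sensor_raw", 68.0)  # 20 C -> ok
print(results)  # -> ['ALERT', 'ok']
```

No task calls another directly; each reacts only to data appearing on the shared medium, which is the essence of the event-driven, loosely coupled model we envisioned for adaptive supply networks.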

Qubic is another important milestone in the evolution of the IOTA Tangle, the world's first distributed ledger based on a Directed Acyclic Graph (DAG), which promises to address the existing limitations of blockchain architectures around scalability, transaction fees and energy consumption. In essence, Qubic makes the IOTA Tangle programmable and introduces a distributed computing environment where machines can establish trust, share data and pay each other for any resource using fee-less micro transactions.

As somebody who appreciates holistic systems thinking, I am truly impressed by the vision of the IOTA founders. The Qubic vision is a major step towards making IOTA a complete platform. With Qubic, a range of pieces that in themselves have been impressive solutions to real-world limitations of blockchain technologies come together to form a holistic platform for distributed computing, distributed artificial intelligence and swarm intelligence, leveraging a rich set of trusted, immutable and shared data stored in the IOTA Tangle. The possibilities are immense.

Given my background in Supply Chain Management and Industrial IoT, I am still interested in the space. After all, our work at SAP on "Real-world Aware Adaptive Supply Networks" had a lasting impact on me. But I am already exploring the implications of Qubic on my new passion... stay tuned for my thoughts on how Qubic will power the future of connected, autonomous, shared and electric mobility. In the meantime, I wish my SAP friends, customers and partners a great Sapphire 2018!