IOTA uses numerous unique computing processes such as a tangle and ternary processing to reinvent the distributed ledger for the IoT.

This is the first article in a three-part series exploring the roots, the applications, and the possibilities of IOTA. The second article provides an overview of the data analysis and coordination methods employed by IOTA. The third article explores the Qubic protocol.

"The future is already here — it's just not very evenly distributed." -William Gibson

IOTA is quite often misunderstood in the blockchain and crypto community. This is because it is a radical departure from the commonly accepted axioms of distributed ledgers and encompasses a host of concepts and technologies that may strike some as unorthodox. Interestingly, however, the origins of IOTA seem to stretch as far back as the early days of Bitcoin when efforts were being made to address the limitations of the first blockchain (while furthering the general concept of a permissionless distributed ledger).

As a cryptocurrency, IOTA is in a class of its own: it is designed around a non-blockchain data structure with a highly scalable approach to transaction throughput, targeting the machine-to-machine interactions of the Internet of Things ecosystem.

The IOTA Foundation is a non-profit headquartered in Berlin, dedicated to the development of open protocols and industry standards at the intersection of Distributed Ledger Technology (DLT) and the Internet of Things (IoT).

What Is IoT? The Internet of Things and a Ledger of Everything

“We need to get smarter about hardware and software innovation in order to get the most value from the emerging Internet of Things.” -Henry Samueli

The Internet of Things (IoT) is a concept describing an interconnected constellation of "smart" objects and devices such as environmental sensors, cameras, mobiles, biochip implants and wearable medical devices. "Things" in this sense refers to a mixture of hardware, software, data, and services, wherein each device in the network collects useful data and circulates it among the other devices. The value of such data could be almost limitless in terms of utility and predictive capacity.

The ability of IoT devices to transmit sensory input and be remotely controlled across existing and overlapping network infrastructures creates opportunities to integrate the physical world more directly into computer systems. This integration allows for a significant boost in efficiency, improved accuracy in analytics, and consequent economic benefits, while gradually reducing the need for human intervention. It also gives rise to cyber-physical assemblies such as smart cities and grids, virtual power plants, SCADA (Supervisory Control and Data Acquisition) systems, and other critical infrastructure. Security expert Bruce Schneier describes the emergent properties of such systems as a "world-sized robot" in a 2016 essay, in which he writes:

"The World-Sized Web is being built right now, without anyone noticing, and it'll be here before we know it. Whatever changes it means for society, we don't want it to take us by surprise. These changes are inherently unpredictable because they're based on the emergent properties of these new technologies interacting with each other, us, and the world. In general, it's easy to predict technological changes due to scientific advances, but much harder to predict social changes due to those technological changes. For example, it was easy to predict that better engines would mean that cars could go faster. It was much harder to predict that the result would be a demographic shift into suburbs. Driverless cars and smart roads will again transform our cities in new ways, as will autonomous drones, cheap and ubiquitous environmental sensors, and a network that can anticipate our needs." -Bruce Schneier

In 2017, the number of online-enabled devices surpassed 8 billion, with estimates that by 2020 the IoT landscape will account for more than 30 billion interconnected devices and a global market value in the range of 7 trillion USD. This proliferation is paralleled by an explosion of associated vulnerabilities: IoT-focused exploits quadrupled between 2016 and 2017, demanding an integrated approach to handling the entangled complexity of the IoT fabric.

A functionally flexible architecture capable of balancing latency, network accessibility, rising costs and sensitive data security concerns is required. IOTA positions itself as the first significant mover in this space.

IOTA: An Intersection of Cryptocurrencies, Internet of Things, and Artificial Intelligence

The mathematical foundations of the tangle were laid down in the white paper by co-founder Serguei Popov, which explains why blockchains and their associated transaction fees are unsuitable for the micro- and nano-transactions (whether of value or data) that define machine-to-machine interactions. So, instead of a blockchain, which structures transactions in compressed blocks on a single lane, IOTA uses the network topology of a directed acyclic graph (DAG).

Visualization of the tangle.

IOTA's DAG architecture (known as the tangle) is conceived as the infrastructural blueprint for interconnectivity and data coordination between devices. It introduces an altogether different operational logic in that it empowers the network rather than profiting any particular group of actors within it. While Bitcoin and other blockchains have arguably become more of a social experiment than a technical one, IOTA's undertaking begins from scratch in reinventing not just the distributed ledger, but the structural foundations that ground it.

The Tangle As A Neural Network Registry

A feedforward neural network is an artificial neural network (a DAG) wherein connections between the nodes do not form a cycle.

What makes the tangle of special interest is that it is being designed as a globally distributed neural network rather than an institutional blockchain pillar. Neural networks are at first unorganized and perform poorly, but improve in accuracy as they evolve and optimize their weights and biases. These weights and biases determine how each node navigates the network, progressively improving its performance on tasks. The process of enhancing a neural net's accuracy is called training.

IOTA is also introducing a small hardware augmentation (the JINN ASIC/microprocessor) that uses ternary logic and further equips devices to function like neurons in a distributed system of interconnected layers without a center. The ternary logic modification promises substantial improvements to distributed computing, as it allows for a more compact structuring of information and significantly increased processing speed. This reduces the electricity required compared with what binary chips expend executing the same operations.

Directed Acyclic Graphs: An Entangled Web of Chains Without Blocks

A Directed Acyclic Graph (DAG) is a topological ordering of nodes (tasks, events) whose dependency relationships are connected via edges (directed arrows) in such a way that time flows in one direction and no node can loop back to itself. That is, following the edges from one node to another, a node can never be encountered twice. DAGs are used to model any kind of information in which events influence one another. Blockchain ledgers, in contrast, group transactions into compressed lists that reference the previous block (and thus the entire history) and chain them along a single lane, punctuated by block-time intervals.
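The defining property above can be checked mechanically: a graph is a DAG exactly when its nodes admit a topological order. The toy sketch below (not IOTA's implementation; the node names are illustrative) uses Kahn's algorithm to produce such an order, or to detect a cycle.

```python
from collections import deque

def topological_order(edges, nodes):
    """Kahn's algorithm: return a topological order of the graph,
    or None if the graph contains a cycle (i.e. is not a DAG)."""
    indegree = {n: 0 for n in nodes}
    adjacent = {n: [] for n in nodes}
    for src, dst in edges:
        adjacent[src].append(dst)
        indegree[dst] += 1
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in adjacent[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                queue.append(m)
    # If some nodes were never freed, their edges form a cycle.
    return order if len(order) == len(nodes) else None

# Four hypothetical transactions branching off a genesis event:
nodes = ["genesis", "a", "b", "c"]
edges = [("genesis", "a"), ("a", "b"), ("a", "c")]
print(topological_order(edges, nodes))   # a valid order, e.g. ['genesis', 'a', 'b', 'c']
print(topological_order([("x", "y"), ("y", "x")], ["x", "y"]))  # None: cycle
```

Note how the branching structure ("a" is confirmed by both "b" and "c") is impossible in a single-lane blockchain but natural in a DAG.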

Brain Circuitry - What is JINN and Ternary Logic?

JINN is a ternary-based microprocessor optimized for large-scale distributed computing. It was first announced in September 2014 on the NXT forum, and IOTA became the software protocol developed to eventually fit the hardware.

Co-founder David Sønstebø has described JINN as:

“JINN is a custom-made Polymorphic Processing Unit which utilizes asynchronous circuits and trinary logic gates, a component of this is the ‘Curl Hasher’ (essentially a tiny ASIC), this ‘Curl Hasher’ component will be made open source so that any chip manufacturer can add it to their chips trivially. We're talking a completely negligible amount of logic gates here, so zero extra cost, size trade-off or implementation issues.”

A Ternary Processor

Most hardware in use today operates on binary logic, with a binary digit taking one of two possible values or states (0 and 1). A bit is the maximum amount of information a binary digit can encode - that is, a binary digit is a container for what a bit can represent.

Trinary numerical systems, on the other hand, are of two types:

• Unbalanced: a trit (trinary digit) has the values of 0, 1 and 2.

• Balanced: a trit has the possible state-values of -1, 0 and 1.
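To make the balanced variant concrete, here is a minimal Python sketch (my own illustration, not IOTA's encoding) that converts integers to and from balanced-ternary trits. It also shows why a trit is denser than a bit: each trit carries log2(3) ≈ 1.585 bits of information.

```python
import math

def to_balanced_ternary(n):
    """Encode an integer as balanced-ternary trits (-1, 0, 1),
    least significant trit first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a remainder of 2 becomes trit -1 with a carry
            trits.append(-1)
            n = n // 3 + 1
        else:
            trits.append(r)
            n //= 3
    return trits

def from_balanced_ternary(trits):
    """Decode least-significant-first trits back to an integer."""
    return sum(t * 3**i for i, t in enumerate(trits))

print(to_balanced_ternary(8))             # [-1, 0, 1], i.e. 8 = 9 - 1
print(from_balanced_ternary([-1, 0, 1]))  # 8
print(math.log2(3))                       # ~1.585 bits per trit
```

A notable elegance of the balanced form: negating a number is just flipping the sign of every trit, with no separate sign bit required.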

Encoding data in trinary determines both the amount of information that can be transmitted and how it can be handled. IOTA employs the balanced ternary type, since it is best suited to its objectives. Guosong Liu, in an MIT publication, explains the benefits of trinary in computer design:

"This [trinary] allows additional interactions to occur during processing. For instance, two signals can add together or cancel each other out, or different pieces of information can link up or try to override one another."

One reason the brain might need the extra complexity of another computation component is that it has the ability to ignore information when necessary; for instance, if you are concentrating on something, you can ignore your surroundings. "Computers don't ignore information," Liu writes, "this is an evolutionary advantage that's unique to the brain."

In terms of the CAP theorem, IOTA favors availability and partition tolerance over strong consistency, settling instead for eventual consistency: after a partition comes back online, its nodes can sync up and reattach to the tangle.

Summary

In today's information age, the ubiquity of data, the pressing privacy concerns, and the importance of data integrity seem to require some fundamental revisions to how we design our systems to handle this proliferation of data and complexity. IOTA has taken on the task of fleshing out a comprehensive solution to how we are to instrumentalize that wealth of information by enabling machines to output better structured and more reliable signals on a global scale.

This is the first article in a three-part series exploring the roots, the applications, and the possibilities of IOTA. The second article provides an overview of the data analysis and coordination methods employed by IOTA. The third article explores the Qubic protocol.