Now, for comparison's sake, here are "SOME VERY ROUGH ESTIMATES". With limited personal time I unfortunately cannot dive into the foundations of hardware processing, but here is an attempt to simply show the big picture. In Nick Routley's article, 'Visualizing the Trillion-Fold Increase in Computing Power', an Apple Watch can produce about 3 billion FLOPS, while an iPhone 6 can produce 8 billion FLOPS. A PlayStation 4 can conduct 1,200 billion FLOPS. Everything from the simple watch on your wrist to sensors, smartphones, computers, and gaming systems is a resource that can conduct computations.

From 'Part I — So What is The Internet of Things?' we found that in 2020 there will be 30.73 billion connected devices, which equates to an estimated 4.04 devices per person. Given that some people have a computer, others a PlayStation, and in developing countries a smartphone may be a person's only connected device, we can assume that computational power per device varies widely. If we pulled a number out of the air and assumed 350 million FLOPS was the average power of each device, this would equate to a global combined computational resource of 10.7555 quintillion FLOPS (10.7555 × 10^18). In the Wired.com article by John Timmer, 'World's Total CPU Power: One Human Brain', written in 2011, he stated that the planet's combined resources could conduct 6.4 quintillion operations per second, and that GPUs make up 97% of this (note: supercomputers were not included in that analysis). Considering the article was written in 2011, it doesn't seem implausible that today's global computational resource is roughly 10.7555 quintillion FLOPS.
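The back-of-envelope multiplication above can be checked in a few lines. The 350 million FLOPS average is, as stated, an arbitrary guess, not a measured figure:

```python
# Rough estimate of global computational power from connected devices.
# avg_flops (350 MFLOPS per device) is the article's admittedly
# out-of-the-air assumption, not a measured average.
devices = 30.73e9     # connected devices forecast for 2020
avg_flops = 350e6     # assumed average FLOPS per device

total_flops = devices * avg_flops
# total_flops / 1e18 is approximately 10.7555, i.e. ~10.7555 quintillion FLOPS
print(f"~{total_flops / 1e18:.4f} quintillion FLOPS")
```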

With 10.7555 quintillion FLOPS, the global computational resource equates to either: a) 1,023 combined K supercomputers, or b) 74.94 times the world's most powerful supercomputer, the American Summit. Now consider that the Summit supercomputer is roughly estimated to have cost $200 million to build, and that the K supercomputer costs $10 million annually to run. So if the world has the computational power of 74.94 Summits, or 1,023 K supercomputers, this equates to $14.988 billion to build (74.94 × $200 million) and $10.23 billion to run annually (1,023 × $10 million). Are these numbers correct, or even close? No, definitely not! However, we can see that computational power as a resource undoubtedly has a very high value in the world. More so, there is a HUGE abundance of this resource in the world through connected devices alone. The next logical thought becomes, "what if we could offer this global resource to a digital marketplace?"
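A sketch of that arithmetic, assuming the supercomputer throughputs implied by the article's ratios (roughly Summit at 143.5 petaFLOPS and the K computer at 10.51 petaFLOPS, their published LINPACK figures):

```python
# Rough cost framing for the global computational resource.
# The PFLOPS figures below are assumptions consistent with the
# article's ratios (74.94 Summits, 1,023 K computers).
total_flops = 10.7555e18        # estimated global resource
summit_flops = 143.5e15         # Summit, ~143.5 petaFLOPS
k_flops = 10.51e15              # K computer, ~10.51 petaFLOPS

summits = total_flops / summit_flops    # ~74.95 Summits
k_computers = total_flops / k_flops     # ~1,023 K computers

build_cost = summits * 200e6            # ~$15.0 billion to build
annual_cost = k_computers * 10e6        # ~$10.2 billion per year to run
```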

Part of the Qubic network is the 'Qubic Computation Model (QCM)', and Navin Ramachandran describes in 'somewhat' layman's terms how Qubic runs on QCM. Using Qubic and QCM doesn't necessarily mean, nor have I personally read, that there is a quantitative approach to classifying computation as a resource. That is, if person (A) requests to purchase N FLOPS, hashes, or epochs, then the person or device (B) offering that resource up can quantify it and offer a price. What is important here is that Eric Hop, the IOTA Foundation, and the community have found a solution and are creating the Qubic Computation Model right now. Eric wrote a six-part article series, 'Qubic: Explaining the Qubic Computation Model'. If you couldn't understand Navin's article you certainly won't be able to comprehend Eric's; even so, I implore you to give both a read. What both show is that they have created a foundation to build upon. As for a digital marketplace evolving, this is as easy as an application being built by the community and economic forces finding price equilibriums and use cases. Not to beat a dead horse, but the facts show not only that computational power is a resource, but that it exists in abundance within the world. Just as in Part IV with the example of California, where we found that many people will have sensors around their homes wasting data, the same people will have connected devices sitting there wasting their computational power. Just as we saw a use case for unused data, so will a use case be created for unused computational resources.

Let's not forget this is about the IoT and connected devices. John Timmer found that in 2007 there were 6.4 quintillion operations per second of combined power on the planet; however, 97% of this came from GPUs. Our smartphones have GPUs, but the majority of connected devices specific to the IoT will not. Most IoT sensors and connected devices will be built to focus on two objectives: 1) to be energy efficient, and 2) to be computationally lightweight. The two pretty much complement each other, because if a device is computationally lightweight it will also be energy efficient.

For example, do you have a SmartThings or an Alexa? Or even a ThermoPro? I have all three. I put the SmartThings temperature and water sensor in my basement, as well as the ThermoPro sensor outside. I quickly found that the water sensor burned through its battery within 8 months, and the ThermoPro dashboard killed its battery in 11 months. As the Internet of Things offers us a digital space where sensors can digitize the physical world, devices use a lot of energy to conduct their computations: not only the operation of sending data, but also of sending secured data (data that can be hashed cryptographically). Energy efficiency is a high priority for connected and edge devices.

A connected device loses most of its battery energy through its computational operations. Thus, by using the Qubic Computation Model, which is built on ternary logic (-1, 0, 1, where standard computing is binary: 0, 1), the end result is a very energy-efficient piece of hardware. In other words, the programming language that directs the electronic hardware how to act (Abra) underpins the Qubic Computation Model. This is estimated to yield 30% to 40% energy efficiency gains when combined with actual ternary hardware. All of this starts to get technical, but the concept is that the Tangle, the Qubic network, and the Qubic Computation Model all combine to be specifically designed for the Internet of Things. So instead of having to change the batteries in your water sensor, or your soil saturation sensor, every 8 months, you may only have to change them every 4 years. Or, if the sensor has a very small solar cell on it, they may never have to be changed.
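To make the ternary idea concrete, here is a minimal sketch of balanced ternary, the (-1, 0, 1) number system mentioned above. This is purely illustrative Python, not Abra code, and the function name is my own:

```python
# Balanced ternary: every integer is a sum of powers of 3 with
# digits ("trits") drawn from {-1, 0, 1} instead of binary's {0, 1}.
def to_balanced_ternary(n: int) -> list:
    """Return trits for n, least significant first, each in {-1, 0, 1}."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:          # a remainder of 2 becomes -1 with a carry
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits

print(to_balanced_ternary(8))   # [-1, 0, 1]: -1*1 + 0*3 + 1*9 = 8
```

One elegant property: negating a number just flips every trit, with no separate sign bit needed.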

After eight months the battery died on the water sensor in my basement. I have a great secure router, the SmartThings hub, a water sensor, and a smartphone to receive alerts in case water floods my basement, but if the battery dies on the device the whole system fails. Hmmm, again we find a single point of failure when considering battery life. To mitigate this, the Tangle and the Qubic network have been specifically designed to offer devices a huge increase in energy efficiency. After a very quick Google search I found on www.whatis5g.info (energy-consumption): "our energy calculations show that by 2015, the wireless cloud will consume up to 43 TWh, compared to 9.2 TWh in 2012, an increase of 460%". Wikipedia states that in 2011, global expenditures on energy totaled over $6 trillion. That comes out to about $685 million per hour! Just conceive: what if we could reduce the energy usage of connected devices in the Internet of Things by 35%? How much would this lower costs and save the world? Applied across the full $6 trillion, that would be $2.1 trillion in savings (connected devices are of course only a fraction of total energy use, but the scale is striking). Likewise, a 35% saving on 43 TWh of energy is 15.05 TWh saved. How much would saving 15.05 TWh lower CO2 emissions and support the fight against global warming? The fact is, each year the world becomes more connected, more digitized, and more in need of energy efficiency, as well as security. So IOTA, the Tangle, and the Qubic network are being developed not necessarily for the needs of today, but specifically for the needs of tomorrow.
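The paragraph's arithmetic, spelled out. The 35% figure is simply the midpoint of the 30-40% ternary efficiency estimate mentioned earlier:

```python
# Energy arithmetic from the article's figures.
wireless_cloud_twh = 43.0     # projected wireless-cloud consumption by 2015 (TWh)
global_energy_spend = 6e12    # global energy expenditure, 2011 (USD, per Wikipedia)
efficiency_gain = 0.35        # midpoint of the estimated 30-40% ternary gain

saved_twh = efficiency_gain * wireless_cloud_twh          # 15.05 TWh saved
hourly_spend = global_energy_spend / (365 * 24)           # ~$685 million per hour
```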

So Are Computational Resources Wasted?