Despite these benefits, flooding the network with repeated messages has its own challenges. For transmitting data, the main questions are how data-packet collisions ("broadcast storms") are avoided, how the retransmitting process propagates the message efficiently toward its destination, and how the process ends without an energy-wasting avalanche.

A synchronised-Flooding approach, combining time-division multiple access (TDMA) with high-accuracy time synchronisation, could potentially solve these challenges.

Nodes transmit only relevant information, and retransmissions occur simultaneously, so the message propagates one hop in all directions at precisely the same time and avoids collisions. This continues until the message has flooded through the network or reached the configured maximum number of hops.
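The hop-by-hop behaviour can be sketched as a toy simulation. This is not an implementation of any particular protocol; it simply models the idea that, in each synchronised TDMA slot, every node that first heard the message in the previous slot retransmits at the same instant, so the flood advances exactly one hop per slot until the hop limit or network edge is reached:

```python
def synchronised_flood(adjacency, origin, max_hops):
    """Toy model of synchronised flooding: each TDMA slot, all nodes that
    first received the message in the previous slot retransmit at once,
    so the message advances exactly one hop per slot."""
    received = {origin}
    frontier = {origin}
    for _ in range(max_hops):
        frontier = {nb for node in frontier
                    for nb in adjacency[node]
                    if nb not in received}
        received |= frontier
        if not frontier:          # flood died out before the hop limit
            break
    return received

# A small mesh with a redundant path: A-B-C-D plus the shortcut A-C.
mesh = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"], "D": ["C"]}
print(synchronised_flood(mesh, "A", max_hops=2))   # every node reached

# Losing the A-C link (an obstruction) still reaches every node,
# one hop later, via the redundant A-B-C-D path.
degraded = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
print(synchronised_flood(degraded, "A", max_hops=3))
```

The second run also illustrates why obstructions rarely matter in a flooded mesh: the message simply arrives via whatever redundant paths remain.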

In the Flooding-based scheme, a signal obstruction, or even several, will most likely not affect operation at all, because of the numerous redundant paths.

Considering that the nodes in this network act as both backbone communication nodes and IoT devices, a wireless mesh network is the most effective topology and communication technique for mass communication across interconnected devices: devices can join and leave the network dynamically and participate in receiving and propagating messages, without having to account for neighbour devices or route information.

Evolution of Power and Processing

The synchronised-Flooding approach offers a simplified infrastructure requiring only nodes and gateways — nodes that act as the link between multiple networks — and the transmission of only relevant data.

In routing-based networks, even though the infrastructure requirements are not as simple, the total number of nodes operating at any moment while the network is transmitting is always lower than in Flooding-based networks; so it would seem that routing consumes less energy.

On the other hand, Flooding-based messages are much more efficient, as they do not carry the overhead of transmitting routing tables and commands, overhead that grows with the number of nodes and hops.
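A rough back-of-envelope model makes the trade-off concrete. All constants here are hypothetical and only illustrate the shape of the growth: routing-table overhead scales with the number of known nodes, while a flooded message needs only a fixed-size message ID (for duplicate suppression) and a hop counter:

```python
def routing_overhead_bytes(known_nodes, entry_bytes=8):
    # Hypothetical: each node periodically shares a routing table
    # containing one fixed-size entry per known node.
    return known_nodes * entry_bytes

def flooding_overhead_bytes(msg_id_bytes=2, hop_counter_bytes=1):
    # Flooding's per-message overhead is constant: a message ID for
    # duplicate suppression plus a hop counter for the hop limit.
    return msg_id_bytes + hop_counter_bytes

for n in (10, 100, 1000):
    print(f"{n:5d} nodes: routing {routing_overhead_bytes(n):6d} B, "
          f"flooding {flooding_overhead_bytes()} B")
```

Under these assumptions, routing overhead grows linearly with network size while flooding overhead stays flat, which is the point the paragraph above is making.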

No routing means that the controller is extremely simple, requiring minimal computing power and memory and thus low power consumption, low PCB real estate, and low cost.

Furthermore, the energy of the signals received from adjacent nodes adds up, so less transmit power can be used to achieve the same range.

But as the number of interconnected devices grows, so does the volume of data streaming through the net, along with the physical power requirements; this creates a need for faster processing of larger data sizes with reduced power consumption.

Fortunately, the technology required to handle this is not far away — maybe even as early as 2016 — with technologies such as HP's The Machine, built on the idea that current RAM, storage, and interconnect technology can’t keep up with modern Big Data processing requirements.

The Machine aims to completely revolutionise current computer architecture by combining technologies that could solve both problems: Memristors that could replace both RAM and long-term flash storage, and Silicon Photonics that could provide faster on- and off-motherboard buses.

The Machine will reinvent the fundamental architecture of computers to enable a quantum leap in performance and efficiency, while lowering costs over the long term and improving security.

Technologies combining hyper-fast, super-dense storage with higher data processing rates and lower power consumption would not only enable the processing of much larger data sets, but also handle the increased traffic that the mesh synchronised-Flooding approach introduces.

Evolution of Application

With the evolution of infrastructure and the way devices and individuals communicate, so too will the information and application layers evolve.

The leap forward in performance and efficiency of communication and of processing Big Data at the physical layer builds the foundation required for improved shared processing across the internet at the application layer.

Local apps work together to securely share information and solve problems as a distributed mesh.

More advanced shared processing enables improved Machine Intelligence capabilities, which in their current state already power applications we are familiar with, the likes of the popular Intelligent Assistants Siri, Google Now, and Cortana.

These Intelligent Assistants are still in their infancy. The aspects of these assistants that we most readily recognise are their interfaces and mode of distribution: how and where we interact with them.

The experience of Intelligent Assistants that speak our language and communicate like a person has come to be their defining factor; but they are overwhelmingly focused on natural-language interfaces.

The real scope of what they are, or will be, capable of achieving lies in assisting, learning, and sharing information based on the user they assist. What they grow into will depend on the ability to learn through implicit communication, and to share and process information across the Internet with other Intelligent Assistants, as a distributed system.

Implicit communication dominates. Assistants respond and react to our subtle contextual interactions, and to each other, within vast informational ecosystems.

The ability to learn and share knowledge about our needs and intentions, based on the context of where we are and what we’re doing, as well as on our ability to make inferences from associations (the way we organise information or express interests), as a mesh system of specialised assistants, will reshape the way we interface and interact with the Internet and each other.

Every website, every service, every app, and across the internet of things, everything embodies a collection of tasks that may be supported by intelligent assistants. In this environment, the metaphor of personal assistants quickly fragments into systems that are much more akin to colonies of ants.

The Web will become an information model in which information is provided contextually, according to the time, place, and activity of the individual, with answers from assistants instead of a menu of links.

Evolution of Interface

As interconnectivity across the Internet increases and the landscape of the Web is reshaped by the colonisation of intelligent ecosystems, the things and devices that make up the IoT, which we use to interface and interact, will continue to evolve as they have in recent years, but now with a direction like the one discussed here. Previously that evolution was often undirected or aimless: technology evolving for profit, driven by marketers, or advancing in academia for academia’s sake, without a guiding vision.

Seeing what 2014 has introduced in the way of smarter things and devices, and understanding the evolution of technology so far, we can assume that technology is going to continue along the same path: improved processing as devices become smaller and more personal wearable devices that all interconnect.

Already, the likes of Google and Sony are producing wearable devices that connect us to the Web of information in a real-world environment through the use of Augmented Reality (AR) feeding real-time information directly to the field of vision.

Art: Sean Hamilton Alexander

As wearable technology companies continue to design, improve, and redesign the way we wear our devices, and as technology proliferates into everything we do and Intelligent Assistants become more integrated into our daily lives, the ‘interface’ to them has the potential to become so seamless that we won’t even recognise that they are there, monitoring, learning, processing, and sharing.

Our view of the world will be augmented by the Heads-Up Display of Intelligent Assistants feeding us information that they share and receive from the mesh of IoT ecosystems around us.

Social and Political Road Blocks

While a lot of this can sound fanciful and sci-fi, this is where the pure technology is heading, based on the technologies and practices that are emerging today, or that have established themselves and are still evolving.

The biggest road blocks to face in achieving this are the Social and Political issues, which often come down to ethics and control. I’m not going to discuss these at this point as I only want to discuss where the tech is going and what is possible from a conceptual point of view.

With all the social and political road blocks mixed in, we can assume this conceptual view of the evolution of connectivity is going to be difficult to achieve. But it’s necessary, to support that growth and to make sure the foundation of what we have achieved today is still here 10 years from now, albeit as a completely different landscape.

Regardless of the social or political restrictions placed upon them, people will always find a way to do what they want or achieve their own goals using whatever means are available.

But the main focus must remain on the interconnectivity of everything, through increased communication and reduced power consumption.