
High culture organizes its world views using overarching frames: intellectual superstructures that serve as extrinsic conceptual coordinate systems. “Globalization” and “Industrialization” are examples of such frames.

Popular culture, on the other hand, tends to be driven by the most visible drama in the immediate environment. From the chaos of turbulent change, popular culture tends to pick out specific motifs around which to grow a world view. These motifs mostly arise from the economic abundances that drive that particular age.

In trying to compare and contrast the motifs of different ages, something interesting struck me: the motifs tend to cycle between material, object and cognitive motifs. The objects aren’t random objects, but ones created by the operation of technology. So iron is a material motif for the Iron Age, the steam engine is an object motif for the Industrial Age, and writing is a cognitive motif for the Bronze Age. Here’s an approximate and speculative table of the motif-cycling I made up.

(I have endnotes for the less obvious table entries, which may need some explanation; and obviously the model is more speculative for ages for which contemporary written records are not available to us).

Why is this cycling important? Well, for all you futurists out there who are stuck in a mental rut asking yourselves, what’s the next big thing?, the next big thing is almost certainly not going to be a thing at all (object motif). It’s going to be a material motif. So the right question is: what’s the next new material?

So answers like “3D printing” are wrong in a specific and interesting way. Let me explain.

Understanding the Cycling

Thinking in terms of temporal cycles is dangerously seductive. It is extremely easy to delude yourself that you can predict times and transitions; that your labels for specific squiggles on a graph are somehow objective and essential rather than subjective and arbitrary; that patterns in and of themselves somehow represent knowledge.

If useful (in an instrumental sense) cyclic thinking is possible at all, it tends to require extreme sophistication and mountains of data to do correctly. Looser types of cyclic thinking are much better at suggesting the right questions than discovering the right answers.

When you apply structural-cycle thinking to something as subjective and ephemeral as the motifs that symbolize entire temporal epochs (why atom and airplane for 1945? why not aluminum and atomic bomb instead?), you’re basically in fertile territory for self-serving speculative nonsense, where keeping yourself honest is really hard.

That said, there is one mitigating factor with this model: unlike purely structuralist cyclic models, there are some fairly obvious dynamics at work here: first we discover a new natural abundance, then we learn to engineer with it (turning it into an engineered abundance, often created by wasting the natural abundances of a previous era), and finally our brains adapt to think in the new environment (usually via new educational models that manufacture a new normal).

This last stage creates a cognitive abundance or surplus of a specific type. This specificity is what is missing in Clay Shirky’s model of cognitive surplus. Universal high school education, which became a reality in the US around 1910, did not create a generic cognitive surplus. It created an abundance of reading and writing bandwidth: a Coal Mind literacy. Our age, too, is creating specialized forms of cognitive surplus that I will get to, not a generic capacity for creativity and innovation.

So what we have here is a basic moving bottleneck phenomenon where first we make things and then the things make us. The remade humans are then cognitively equipped to create new material abundances, restarting the cycle. If the cycle ever stalls without a new material abundance, you get decay and its consequences.

The reason explicit frame-based world views are less useful than implicit motif-induced ones is that they represent the workings of technologically shaped cognitive capabilities, and are therefore derivative with respect to motif-level world views. So “industrialization” is not an absolute frame, but one that a “Coal Mind,” so to speak, can process, given the trajectory of cognitive development up to around 1900. The “Silicon Mind” retains the label “Industrial Age” for referential convenience, but processes the historical record very differently (in a less triumphalist, more dystopian way, for one).

With those caveats aside, let’s foolishly shove the angels aside and rush right in.

The State of Play

We seem to be near the end of the third major reboot cycle in human history (these seem to be marked by the co-existence of an emerging material motif (DNA in this case) and a maturing cognitive motif — the new abilities of the maturing Silicon Mind in this case).

We exited the object-motif zone of the current cycle around 2000, when the social web dumped us all into a sudden abundance of relationship possibilities. We’re not in paleolithic tribal villages anymore. We’re not in Kansas anymore. We are in a social environment where we must manage relationships within a space (the “cloud”) that extends far beyond family, village, nation or metropolitan region.

But I’d say, after a decade, we’re finally on top of relationship abundance. We know how to manage it. And the result is that Facebook now has a billion members (as of yesterday), furiously manufacturing thoughts from a billion different perspectives for their managed social neighborhoods, all vying for the last dregs of attention.

In other words, we’ve learned a new cognitive mode: relating. Navigating interpersonal realities in ways that our brains weren’t really designed for.

But those billion different perspectives aren’t about a billion different things. Imitation and social proof, we’ve discovered, make the dynamics of collective attention far simpler than they might otherwise be. It is not a 3-channel universe, but it is not a billion-channel universe either. It is a million-channel universe with a lot of slightly different rerun channels.

We are now developing a new mental muscle to navigate this million channel universe: perspective shifting, or refactoring. It is how we manage attention in a world that is past Peak Attention from the point of view of recently disrupted entities that relied on mass media. One useful way to understand Peak Attention is that we did not run out of attention per se, but that the portion that had been colonized during the industrial age started escaping from domesticated captivity around 1980. Perspective economics is about dealing in wild, ungoverned attention, which is more abundant than the governed kind ever was, but is much harder to influence at scale. Sort of like fresh water versus salt water.

A “refactor/rethink everything” mentality is taking hold at scale. This is giving us everything from a media space driven by furiously competing Facebook memes, to business ideas like Airbnb and Zipcar, to what the “View from Hell” blog calls the insight porn industry. That post, incidentally, has ribbonfarm classified into this last phenomenon, and I have to reluctantly accept that the classification makes sense. I manufacture insight porn for the perspective economy.

The category includes everything from the glossy manufactured Aha! experiences of TED, to the frantic building of an entire “insight industry” out of Big Data technology, to the flavor-of-the-year world of programming languages.

That, then, is the state of play. We’re basically in the last stages of learning to think with our new Silicon Mind. Hacking, relating, refactoring: these are the basic new literacy skills for the Silicon Mind. These skills will form the basis for a new education system and turn into a new cognitive abundance within the next ten years.

So meh, that’s old news. Yesterday’s future.

The Next Big X is Y

Returning to the motif cycling, it is clear that Something Is Up with a new material: DNA. Costs for gene sequencing are dropping faster than the cost of processing power ever did during the Silicon Age. As part of the resultant engineering bounty, we can now clone individual organisms, grow ears on arms, engineer entirely new varieties of plants and animals, and do other extremely weird and nauseating things. Monsanto seeds are nothing compared to what is coming.

So for starters, the next X is material, and the particular material Y is DNA.

But it is equally clear that this is just a material abundance. It has not turned into a true engineering abundance, let alone a cognitive one.

At the same time, it is also clear that silicon has shaped our minds as much as it is going to. This shaping is not yet pervasive, but there are enough instances of homo siliconus wandering around that we know roughly what this mind (or rather, collection of minds) looks like. There isn’t going to be another age soon whose motif is another kind of cognitive development. We will merely scale and refine what we have. If somebody reads this blog in a hundred years, he or she will probably criticize it for the crudeness of the relating, hacking and refactoring going on here.

But the action is shifting back to the material layer. There is a new dominant material abundance in town. And this one is special because it is the first material abundance that involves the living material of nature (not counting plastic).

But it is clearly not enough. Before we can get to engineering abundances, we need a few more material abundances to form a resource base. In the last cycle, we had six basic and complementary material abundances — oil, coal, steel, atoms, plastic and silicon — to work with. Oil was the first among equals in that set, just as DNA is probably going to be first among equals in the next set.

You can sort of see where the material bottlenecks are:

Gene therapy, longevity technologies and cancer treatment almost certainly require MEMS- or nano-scale targeted delivery mechanisms, which depend on the sort of material abundances nanotechnologists are pursuing.

Lithium and other battery-making materials are another bottleneck for most engineering futures. We need materials for better batteries in all senses of “better” (lighter, smaller, longer-lasting, less-polluting). Energy scarcity is not going to be a problem much longer, as we slowly replace most atom transport with bit transport and dematerialize many things, but energy portability is going to be a bitch to deal with. The most portable form today (electricity) is hard to store, while the most storable form (coal) is hard to move. It is a stock-and-flow nightmare. But Elon Musk may have solved this one.

Cheap 3D printing and other small-scale/batch/high-variety technologies are mostly being held back by the lack of material abundances in the various “toner” like materials required to fuel the processes (this might be an artificial scarcity, created by IP laws, rather than fundamental resource scarcities).

Water scarcity is a huge, looming problem that may be on a runaway course. But the costs of desalination are apparently falling rapidly. As an investor acquaintance of mine said, cheap desalination is the bottleneck here (apparently Kennedy saw that one coming).

People: the world is rapidly aging. You need young people to drive the story forward. Even young countries like India will have an aging population within a few decades. There is no way to create an abundance of people while living women remain the bottleneck resource. Like it or not, there is a huge economic motivation to pursue vat-grown babies. Parenthood may be obsolete in another century.

What happens if we create an adequately complete resource base of complementary material abundances for a new era of engineering?

We’ll enter another age of object motifs. The key here is that these motifs must represent ubiquity rather than novelty. Not Dolly the sheep. Not an artist with an ear grown on his arm. Something like the steam engine, touching the lives of nearly the entire population.

If I had to bet, I’d bet on artificial, lab-grown meat being the first object motif. By 2050, we will have 9 billion people trying to live first-world lifestyles. The protein has to come from somewhere, otherwise the world will melt into a hyper-obese puddle created by grain-based diets. Or kill itself in moral disgust and shame at the cruelties of factory farming. Lab-grown meat solves these problems at the cost of evoking a certain kind of horror for us Silicon Mind people who think of biology as an inviolable domain. But I am betting that by 2070, at least a third of the world population will be getting most of its protein from lab-grown meat.

Sounds yucky, doesn’t it? That visceral reaction by itself tells us we are nowhere near possessing the kinds of cognitive surplus required to accept such realities. We are creatures of silicon, who consider our electronic cognitive prosthetics “normal” but rebel at the thought of objects from a much more intimate technology base that can invade our bodies at all scales from nano to macro, instead of just being attached to them.

But let’s try to hold back the nausea, and try to imagine the even more distant future, when the cycle now getting underway runs its course. The material abundances are history. Various object abundances define the environment. Like lab-grown meat, vat-babies, 200-year-old humans on their fifth hearts, and environments flooded with batteries of all sizes and types. Imagine a physical environment that has far less stuff and uses far less energy, but exhibits a lot more variety. Variety that replicates, maintains itself, and evolves, through reprinting and regenerating as necessary, part and whole. Variety that blurs all visible distinctions between natural and artificial, thanks to the ho-hum genomic abundance at the material level.

This is the new normal world of 2200, say. But what sort of mind emerges out of it?

We cannot even comprehend this mind, yet in our teeming perspective circuitry of insight porn, we can navigate the infinite delta streams of future probability and see that there must one day come a Mind whose merest operational parameters we are not worthy to calculate, but which it will be our fate eventually to design, using our puny Silicon Mind.*

This will be the Next New Mind, circa 2300 AD. The Transhuman Mind.

* For the Silicon Uncultured among you, this is a reference to the Deep Thought computer in The Hitchhiker’s Guide to the Galaxy.

Endnotes

Water as a material abundance motif in the list represents water as an energy source. The massive mechanical creativity unleashed in the Middle Ages was driven, during its most inventive period, by water power, not coal. By the time steam power arrived on the scene, the mechanical arts were already being codified, as I discussed in my Hall’s Law post.

I struggled to find an appropriate material abundance for what is generally considered a chaotic Dark Age everywhere except the realm of Islam. Then it struck me with head-slapping obviousness that the primary answer was paper (and to a lesser extent, gunpowder). Though the Chinese invented both, it was the Arabs who turned both into material abundances, once they discovered and democratized the secret after the Battle of Talas in 751 AD. Islamic Central Asia soon became the leading center of paper manufacture. Water power helped scale paper production in later centuries. We forget that for the Gutenberg revolution to occur, paper had to be abundant.

The object motif for the Cold War/Atomic Age was obviously the jet plane or space rocket in the popular imagination, but I personally prefer the shipping container. This example shows that the motif that sticks in the popular imagination may not always be a reliable guide to the actual dominant object-abundance.

Why “imagining” as the cognitive motif of the Neolithic Revolution? For agriculture to develop, you had to have a mind capable of imagining the future at least a year ahead, and engaging in behaviors like storing surplus grain in pots.

Why “trading” as the cognitive motif for the copper/bronze ages? Because an international trade in tin had to develop for the Bronze Age. I believe this was the start of trade proper as we understand it today, and the origin of the “business” mind, used to thinking in terms of trading surpluses over long distances.

This one should be obvious, but “narrating” was the cognitive motif for the Epic Age precisely because that’s when we started telling long and complicated stories about ourselves, and using them to store civilizational wisdom across time.

It may seem surprising to make mathematics the cognitive motif for 800 AD, given the long earlier history. But arguably, it was the development of algebra that led to the development of a true mathematical mind. Arithmetic and geometry are just a little too close to physical reality. Wrapping the mind around the concept of zero was probably the first step, but algebra was where abstraction truly began.

“Seeing” as a cognitive mode is basically what I understand art to be. The development of the rules of perspective, and the resultant rise of realistic drawing, was a hugely important development. As it spread, it became the abundant cognitive skill that led to the other intellectual developments we attribute to the Renaissance.

Why 1800 for “organizing”? While humans have been organizing for millennia, thinking fluidly about an abundance of organizational forms had to wait until the limits of human energy and horse/ship communication were transcended. Pre-1800s organizational thinking was relatively impoverished. After 1800, we basically went nuts, inventing an entire zoo of organizational forms that reflected different patterns of energy and information use, and various models of authority and legitimacy. The process probably started with the invention of the modern nation-state around the Peace of Westphalia (1648) and the English Civil War, but it took the emergence of early corporations like the East India Company, and the possibilities of fossil fuels and the telegraph, to create the Cambrian explosion of organizational variety.

Hacking as a cognitive motif probably started with phreaking on the telephone grid. Creative makers probably hacked before then, but there were no large-scale engineered realities to “hack” in the sense of a systematic culture. See my post Hacking the Non-Disposable Planet for more.

“Refactoring” as a cognitive motif is unfamiliar and hard to think about because it is so new, and there is no consensus yet on labels. Perhaps my terms will stick. Maybe “Insight Economics” will win out once the “Insight Porn” bubble collapses. I don’t particularly care, so long as we agree we are talking about the same mental faculty that a lot of people seem to be autodidactically developing at the same time: an ability to see things from lots of different angles in search of the best one. Programmers obviously lead the pack (hence the label, which is drawn from programming, in case you didn’t know).