Aphorism of the Day I: Consciousness is a little animal in our heads, curled up and snoozing, at times peering into the neural murk, otherwise dreaming what we call waking life.

Aphorism of the Day II: People are almost entirely incapable of distinguishing the quality of what is said from the number and status of the ears listening. All the new can do is keep whispering, hoping against hope that something might be heard between the booming repetitions.

.

What effect do constraints on informatic availability and cognitive capacity have on our ability to make sense of consciousness? This is one of those questions that philosophers literally dream of stumbling on, questions so obvious, so momentous in implication, that their answers have the effect of transforming orthodox understanding–if you’re lucky enough to catch the orthodoxy’s ear, that is!

The aim of the Blind Brain Theory (BBT) is to rough out the ‘logic of neglect’ that underwrites ‘error consciousness,’ the consciousness we think we have. It proceeds on the noncontroversial presumption that consciousness is the product of some subsystem of the brain, and that, as such, it operates within a variety of informatic constraints. It advances the hypothesis that the various perplexities that bedevil our attempts to explain consciousness are largely artifacts of these informatic constraints. From the standpoint of BBT, what we call the Hard Problem conflates two quite distinct difficulties: 1) the ‘generation problem,’ the question of how a certain conspiracy of meat can conjure whatever consciousness is; and 2) the ‘explanandum problem,’ the question of what any answer to the first problem needs to explain to count as an adequate explanation. Its primary insight turns on the role lack plays in structuring conscious experience. It argues that philosophy of mind needs to keep its dire informatic straits clear: once you understand that we are prone to the same informatic frame-of-reference (IFR) errors regarding consciousness as we make regarding the world, you must acknowledge that we might be radically mistaken about what consciousness is.

Radically mistaken about everything, in fact.

What is an ‘informatic frame-of-reference’ error? Consider the most famous one of all: geocentrism. We perceive ourselves moving whenever a large portion of our visual field moves–when we experience ‘vection,’ as psychologists call it. Short of this and vestibular effects, a sense of motionlessness is the cognitive default. As a result we stand still when the world stands still relative to us. So when our ancestors looked into the heavens and began charting the movement of celestial bodies, the possibility that they were also moving seemed, well, preposterous. What makes this error perspectival (or IFR) is the way it turns on the combination of cognitive capacity and information available. Given the information available, and given our cognitive capacities, geocentrism had to seem obviously true: “the world also is established,” Psalms 93:1 reads, “that it cannot be moved.” As informatically earthbound, we quite simply lacked access to the information our cognitive capacities required to overcome our native intuition of motionlessness. We found ourselves informatically encapsulated, stranded with insufficient information and limited cognitive resources. Thus the revolutionary significance of Galileo and his Dutch Spyglass–and of science in general.

According to BBT, what we call ‘consciousness,’ what phenomenologists think they are describing, is largely an illusion turning on analogous informatic frame-of-reference errors. The consciousness we think we have, that we think we need to explain, quite simply does not exist.

The fact that we can and do make analogous IFR errors regarding consciousness is not all that implausible in principle. A good deal of the debate in the cognitive sciences prominently features questions of informatic access. Given that the cognition of information gleaned from conscious experience relies on the same mechanisms as the cognition of information gleaned from our environments, we should expect to find analogous errors.

We should expect, for instance, to encounter instances of ‘noocentrism’ analogous to the description of geocentrism provided above. Geocentrism assumes the earth is outside of play, that it remains fixed while everything else endures positional transformations. Is this so different from the intuitions that seem to underwrite our ancestral understanding of the soul as something ‘outside of play’? Or how about the bootstrapping illusion that seems so integral to our sense of ‘free will’?

Given that conscious (System 2) deliberation is brainbound, only the information that makes it to conscious experience (via ‘broadcasting’ or ‘integration’) is available for cognition. With geocentrism, the fact that we are earthbound constrains the environmental information available for conscious experience and thus conscious deliberation. With noocentrism, the fact that cognition is brainbound constrains the neural information available for conscious deliberation. When conscious deliberation turns to conscious experience itself (rather than the environmental information it communicates) the limits of availability (encapsulation) ensure that a variety of information remains inaccessible–occluded.

What information is occluded? Almost all of it, if you consider the 38,000 trillion operations per second your brain is allegedly performing this very instant. Everything really hinges on the adequacy of what little we get.

One of the things I love about Peter Hankins’ Conscious Entities site is his images, the way he uses filter effects to bleed information from photographic portraits until only line sketches remain. Not only does it look cool, but I couldn’t imagine a more appropriate stylistic trope for a website devoted to consciousness.

Why? Imagine running your perception of environmental reality through various ‘existential filters’–performing a kind of informatic deconstruction of your perceptual experience. Some of this information is phenomenal, but much of it is also cognitive. That red before you belongs to an apple, one object among many possessing a history in addition to a welter of properties. You know, for instance, that you can bite it, chew it into little pieces. In fact, you have a positively immense repertoire of ‘apple information’ at your disposal, which should come as no surprise, given that your brain is primarily an environmental information processing machine, one possessing an ancient evolutionary pedigree.

What your brain is not, however, is primarily a consciousness information processing machine. Because the brain is primarily designed to exploit ‘first order’ environmental as opposed to ‘second order’ experiential information, we should perhaps expect a dramatic discrepancy between 1) the quantity of environmental versus experiential information available; and 2) the way environmental and experiential information are matched to various cognitive systems.

One of the most striking things about all the little perplexities that plague consciousness research is the way they can be interpreted in terms of informatic deprivation, as the result of our cognitive systems accessing too little information, mismatched information, or partial information. To get a sense of this, think of the information at your disposal regarding apples and begin subtracting. You can begin with the nutritive information you have, what allows you to identify apples as a kind of food. Then you can subtract the phylogenetic information you’ve encountered, what allows you to identify the apple as a fruit, as a reproductive organ belonging to a certain family of trees. Then you can subtract the information that allows you to distinguish apples from inorganic objects, as something living. Then you can subtract all the causal information you’ve accumulated, the information that allows you to cognize the apple as an effect (possessing effects). Then you can subtract all the substantival information, what allows you to conceive the apple as an aggregate, something that can be bitten, or smashed into externally related bits. Then you can move on to basic spatial information, what allows you to conceive the apple as a three-dimensional object possessing a position in space, as something that can be walked around and regarded from multiple angles. At the very end of the informatic leash, you have the differentiations that allow you to identify this apple versus other things, or even as a figure versus some background.

So, back to our parallel between geocentrism and noocentrism. As I said above: when conscious deliberation turns to conscious experience itself (rather than the environmental information it communicates) the limits of availability (encapsulation) ensure that a variety of information remains inaccessible–occluded. Deliberative cognition (reflection) has no access to causal information: the neuronal provenance of conscious experience is entirely occluded. So when deliberative cognition attempts to identify precursors, it only has sequels to select from. As a result, it seems to have no extrinsic precursors, to be some kind of ‘causa sui,’ moveable only by itself.

It has no access to spatial information per se: we have a foggy sense of various phenomenal elements ‘occurring within’ a larger sensorium, which we are wont to ‘place’ in our ‘heads’ within our environment, but it’s not as if our sensorium is ‘spatial’ the way an apple is spatial: since it is brainbound, deliberative cognition cannot access information regarding our sensorium by ‘walking around it,’ changing our position relative to it. Lacking this environmental channel, it has to be ‘immovable’ with reference to cognition–once again, in a manner not so different from what we see with geocentrism.

Deliberative cognition likewise has no substantival information to draw on: we can’t, as Descartes so famously noted, break our sensorium up into externally-related parts. Absent this information, the cognitive tendency is to mistake aggregates for individuals, for substantival wholes. Here we see one of the more crucial insights belonging to BBT: the way ‘internal relationality,’ and the concepts of holism that fall out of it and govern our understanding of semantic concepts such as ‘context,’ for instance, constitutes a kind of cognitive default pertaining to the absence of information. Our notion of ‘meaning holism,’ just for instance, is an obvious artifact of brainbound informatic parochialism according to BBT, much as Aristotle’s notion of ‘celestial spheres’ is the artifact of earthbound informatic parochialism. Lacking the information required to see stars as distant, as externally-related objects scattered through the void of space, it seems sensible to interpret them as salient features of an individual structure, an immense sphere.

We all know that our ability to solve problems depends on the relation between the information and computational resources available. BBT simply applies this commonsense knowledge to consciousness, and interprets the perplexities away relying on what are actually quite commonsense intuitions. Beginnings have no precursors. Blurs lack internal structure.

If you’re steeped in consciousness literature and reading this with a squint, thinking that I’m missing this or misinterpreting that, or that it’s just gotta-be-wrong, or ‘yah-yah-it’s-no-big-whup,’ then just ask yourself: How does the relation between available information and computational resources bear on the problem of consciousness? It could be ignorance-fed hubris on my part, but I’m convinced thinking this question through will lead you to many of the same conclusions suggested by BBT.

I’ve been sitting on the basic outline of this approach for twelve years. Since the ‘Now’ and its paradoxes were my first philosophical obsession, something that had driven me cross-eyed more times than I could count, I realized that BBT was a potential game-changer given the ease with which it explained its perplexities away. Just consider what I mentioned above: Lacking informatic access to the neural precursors of conscious experience, deliberative cognition finds itself on a strange kind of informatic treadmill. It can track temporal differentiations effectively enough within conscious experience without, however, being able to track the temporal differentiation of conscious experience itself. It’s an old axiom of psychophysics that what cannot be differentiated is perceived as the same. And thus the ancient perplexity noted by Aristotle, the way the now is always different and yet somehow the same, is explained (and much else besides).

The reason I’m thumping the tub as loudly as I can now is that, quite frankly, I could feel the rest of the field moving in. On the continental side of the philosophical border, I saw more and more thinkers tackling the difficulties posed by the cognitive sciences, whereas on the analytic side, I found more and more thinkers accepting, in a variety of registers, the central assumption of BBT: that the consciousness ‘revealed’ by introspection (or deliberative metacognition or higher order thought) is little more than a water-stain informatically speaking, an impoverished blur that only seems a ‘plenum,’ something both ‘full’ and ‘incorrigible’ (or ‘sufficient’ in BBT-speak) because, being brainbound, it has little or no information to the contrary.

My problem, as always, lies first in the idiosyncrasy of my background, the way I’ve developed all these concepts and ideas in isolation from the academy, and so must inevitably come across as naive or amateurish to ingroup, specialist ears; and second in my bizarre inability to see any of my nonfiction enterprises to the point of submission, never mind publication. This latter problem, I’m sure, is shrink material. The former is bad enough. The only thing worse than being an iconoclast in a field filled with crackpots is being an iconoclast who can only seem to blog about his ‘oh-so-special’ ideas!

The logic of neglect operates across all levels.

To boot, I’m sure being a fantasy novelist doesn’t help, particularly when it comes to an institution as insecure about its cognitive credentials as philosophy! Ah, but such is life. Toil and obscurity, my brothers. Toil and obscurity. For those of you who find this wankery insufferable, I apologize. If you want me to shut up already, ask your philosophy professor to take a looksee and correct my errant ways. In the meantime, I am, as always, the meat-puppet of my muse. And for those of you who have developed a morbid fascination with this morbid fascination, this strange intellectual adventure through the fantasies that constitute our souls, I need to extend a big… fat… danke…

Smoking ideas has to be one of the better ways to waste one’s time.