Mystery can be a good thing and a bad thing. Legal scholar Frank Pasquale's new book, The Black Box Society: The Secret Algorithms That Control Money and Information, focuses mostly on the latter. It examines the massively complex systems at the heart of the global economy that make high-stakes claims to deserving our trust: the data-driven mega-corporations that oversee the world's access to finance and the Internet. Through their data-driven “reputation” management—from Facebook profiles to credit scores—they increasingly oversee the world's access to us.

As I read it, I couldn't help but see theological undertones in Pasquale's warnings about looming technocracies—especially coming from a scholar who has written, for instance, about the intersection of Catholic social thought and intellectual property. I asked him about how and where ancient Catholic tradition might intersect with the futuristic, worldly realities he writes about.


You don't consider theology in the book per se, but as I read it I found myself on the lookout for a crypto-theology in there. How could a Catholic write about secrecy and transparency without one? Am I right to look for it?

Absolutely. I taught at a Catholic university—Seton Hall—for nine years, and thought a lot there about how Catholic values could inform our largely secularized legal system. I also had the good fortune of attending St. Paul’s in Cambridge, Mass., when I was in college, and Fr. J. Bryan Hehir there used to bring up two of Chesterton’s ideas about mystery. First, the definition: something we cannot solve or dissect, but only grow wiser about over time. Second, a paradoxical way out: the idea that “if we try to explain everything, something always remains a mystery—but if we accept one mystery, everything becomes clear.”

To what extent are these questions about transparency and secrecy unique to the "information age"? To what extent are there precedents in the past?

The book focuses on how some massive companies aspire to (and to some extent achieve) the stupendous feat of serving up to us, on a digital platter, “all” the books on a given topic, or “all” the stocks or mutual funds one could invest in. But two mysteries almost always remain: What is the full extent of data they have, and how are they ordering it? Much of their power derives from keeping that under wraps—because if they exposed it, they fear anyone could copy them. But at least in the case of finance, the Promethean ambition to connect more and more markets has also created a kind of Babel—described in the book as instabilities that cascade when once-hidden liabilities or risks suddenly come to light.

I do see parallels in the past—certainly the feudal lords of church and state kept many of their deeds secret. But we shouldn’t be too quick to shrug and say, “Plus ça change.” The major difference in our time is that algorithmic systems threaten to become impenetrable even to those ostensibly in charge of them. That puts us in a “Hal 9000” type of scenario, where humans are answering to machines.

To what extent does "reputation," as understood by the information-centered economy today, have anything to do with pre-digital meanings of the word? Or even with "state of grace"? With "karma"?

Ideally, there would be a big overlap! Being a part of “the elect” would signify moral, even exemplary, conduct. Cory Doctorow’s idea of “whuffie” evoked that hope.

But the new reputational systems I describe in the first third of the book are often an algorithmic sleight of hand: They try to dignify simple profit-motivated classifications (as in, “John probably won’t make the bank as much money as most people will”) into a simultaneously moral and quantitative designation (as in, “John’s 580 credit score shows how lazy he is”). TV ads actually show a construction worker with a 580 credit score and the word “LAZY” in front of him! But who really knows why it’s so low—he could just have had a huge injury and overwhelming bills. What big, data-driven reputation is about, though, is flattening that history, and just making the person’s reputation a number. Narrative—essential to understanding what really happened—is gone.

In one of the book's epigraphs, Gerard Manley Hopkins writes that "all is in an enormous dark":

Million-fuelèd, nature’s bonfire burns on.
But quench her bonniest, dearest to her, her clearest-selvèd spark
Man, how fast his firedint, his mark on mind, is gone!
Both are in an unfathomable, all is in an enormous dark
Drowned.

What does his dark have to do with your "black box"?

The “enormous dark” might evoke the type of chaos or confusion that results when certain non-human ends dominate human welfare, or human understanding. “Quench” the human, and what tends to dominate is the machinic, the quantitative. One can imagine social systems with skyrocketing GDP, for instance, and decreasing opportunity, leisure, happiness, fulfillment, for most people in them. Indeed, look at how economic “growth” was distributed in 2009-2011: 121 percent went to the top 1 percent! That means 21 percent of the “growth” was siphoned out of the normal earnings of the bottom 99 percent. But such results are to be expected when, as Dupre & Gagnier put it, “most economists believe that the core of economics can be developed with no assumptions at all about what an economy should aim to provide.” If you let the economy be a “black box,” and don’t care how it grows or how that growth is distributed—well, then, perhaps you’ve achieved an “unplanned” economy, but at what cost?

“Quenching” the human may also refer to a less final, but ultimately destructive, deformation of man by machine. My fear, expressed in pianissimo in this book, but more clearly in articles like “Technology, Competition, and Values,” is that individuals’ competition to succeed in an ever more technologized environment leaves them consigned to a sort of Darwinian battle where there is no time even to apply—let alone cultivate or develop—the ethics that would enable one to determine whether the competition is fair, or worth entering in the first place.

The focus of your attention is on the Internet and financial industries. What would a chapter on the church look like, had you written it? Would such a chapter have a place in this book?

I’d probably frame it as an engagement with John T. Noonan’s A Church That Can and Cannot Change. I’d look at the most surprising changes in the church, and try to explore why they were surprises. What was so opaque about the church that prevented (most) people from anticipating the possibility of change? Is that opacity necessary to its governance? Perhaps it is—I don’t begrudge civil society institutions the chance to develop their ideas and commitments in private. My main point in the book is that finance and Internet companies have become a quasi-government, all the more powerful thanks to their ability to evade the “sunshine laws” and disclosure requirements applied to most governments. To the extent the church avoids such a governmental role (and does no harm otherwise), I respect its own prerogatives to privacy and confidentiality. But when it asserts itself juridically, or abuses trust—well, then, it begins to look like the political leaders that constitutional norms dating back to the Magna Carta have tried to rein in.

You quote Google founder Sergey Brin as saying, "The perfect search engine would be like the mind of God." It's reminiscent of what Stephen Hawking once wrote, at the end of A Brief History of Time, about a theory of everything in physics. Maybe we've replaced our desire to contemplate the divine, or even understand the universe, with one merely to trudge through the human-made Internet. Should we be on the lookout for idolatry in utterances like this?

Yes, and it’s disheartening on several levels. I think of any medium of expression as a very imperfect trace of the personality, ideas, wit, hopes, etc., of its author. Then we can think of the Internet Google is searching as yet another reduction, a limited rendition, of the world. And a further reduction of that—to rank-ordered pages—is supposed to be like the mind of God? Give me a break. It reminds me of the artificial intelligence enthusiasts who said a thermostat was conscious because it performed actions in response to a stimulus (hotter or colder air). What a deflated sense of the divine, let alone the human, condition.

The real problem with today’s Prometheanism (or Pelagianism) is that it is so pallid. The missing core of economics could be: Let’s try to assure that the 99 percent of the world population living on less than $34,000 a year (per person, per household) has the same level of health and education and housing and food as those with more. That’s what a preferential option for the poor would look like today, and it’s a pretty clear (if daunting) goal. But instead we have missions to “organize the world’s information” (at least one remove from real human need), or simply to make more money than anyone else (two or more removes). Again, Hopkins comes to mind—to aspire to divinity via meeting real human needs and aspirations, rather than simply processing and sorting their digital traces.