Chris has written a surprising book. He’s written something so inherently human that you can’t help but be swept up into the new world of agentive technology. To be honest, it’s just not fair! It sneaks up on you, because Chris effortlessly shows that the progression toward agentive technology, toward our giving agency to technology, is simply an obvious step in human evolution. That’s a profound, and useful, shift of mindset.

Chris tells the story of artificial intelligence from the perspective of human imagination (sci-fi, scary) and of technical capabilities from the perspective of human needs and desires (real, narrow, beneficial). It’s a sleight of hand that brings perspective to some of the “sky is falling” noise that’s out there right now around AI. More importantly, this approach makes it all so relatable (and, yes, readable). You won’t leave here knowing how machines learn, but you will better appreciate how machine learning might impact the humans who rely on it. You’ll also be introduced to the implications of that reliance over time. These might surprise you: it’s not about AI as a sinister overlord, but rather the seemingly mundane implications of a machine’s lack of empathy.

Again, humanity. I’m so struck with how human this book is.

It’s a book about invention and the evolution of ideas, technologies, and desires. I think maybe the single biggest trick Chris performs here is that, by providing the history of various tools and their creators (like temperature control technology), he shows technical assistants to be almost a refined human need rather than a new technical capability. This completely changes how we should approach the design of agents. It argues for, well, human-centered design, now, doesn’t it?

And that, finally, is what leaps from these pages: the need for new practices in human-centered design. Without approaching the problem from a “framework” perspective (thank you), Chris offers the first word on some of these practices. He adds depth to the understanding of how agents differ from tools (both hardware and software). And by covering agentive technology’s human impacts, he shows that industrial design, UX design, and service design don’t adequately address what’s required to understand and solve problems of agentive technologies.

This is just the beginning of a new conversation in design, for sure, but wow—what a great start!

Phil Gilbert

General Manager, Design, IBM Corporation

March 22, 2017