This is part 10 of an ongoing series where I explore my relationship with Perl. You may wish to begin at the beginning.

This week we look at DWIMmery. 1

Last week we talked about the awesome productivity of Perl without really asking where that productivity comes from. For one answer to that, we’ll need to look at a new spectrum.

There are several different spectra in Perl. In fact, one of the coolest things about Larry is that he’s not only given us a cool language to play with; he’s also given us cool new ways to think about the languages we play with. Along one spectrum we have the tripartite concept of laziness, impatience, and hubris. This series has only touched on these indirectly: laziness drives the language to be legible, impatience drives it to be productive, and hubris drives it to be closer to a natural language. 2 Along another spectrum, we have the orthogonal concepts of manipulexity and whipuptitude, which we just discussed last week.

But there’s another spectrum to look at, which includes the twin concepts of TMTOWTDI and DWIM. 3 TMTOWTDI is closely linked to context, which we covered pretty thoroughly. But what about DWIM?



DWIM has a relationship to context as well. A language can’t really be DWIMmy without a thorough understanding of context, because “what I mean” often depends a great deal on when and how I say it. But DWIMmery goes beyond that, in my opinion. DWIM is one of the things—one of the main things, I believe—that allows Perl to Get Shit Done so efficiently. With other languages, I have to spell out every little thing. With Perl I can do the coding equivalent of saying “you know: do the thing with the thing” and it will just trundle off and do that. People say that magical variables like $_ are hard to learn. They’re not. They’re hard to teach. They’re sometimes hard to understand completely, difficult to grok, we might say. But they’re not hard to learn, because there’s nothing to learn. They just work. They just DWIM.
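To make that concrete, here’s a tiny sketch (the data is invented for illustration) of $_ just working: the loop, the substitution, and the implicit aliasing all quietly agree on “the thing we’re obviously talking about,” without my ever naming a variable.

```perl
use strict;
use warnings;

my @lines = ("  foo  ", "  bar  ");
for (@lines) {              # each element is aliased to $_ in turn
    s/^\s+|\s+$//g;         # s/// operates on $_ by default
}
# @lines is now ('foo', 'bar')
```

Nothing to learn, as I said: you never spell out what “it” is, and Perl trims exactly the thing you meant.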

Recently, a co-worker pointed me at an RJBS module that I’d never seen before: Config::Singleton. It’s a very cool module; you should go check it out. Now, I’ve already talked about RJBS code (more than once, even). It’s some of the most modern Perl around, and should be part of everyone’s Getting Shit Done toolbox. Here’s a line from the POD for Config::Singleton:

Config::Singleton finds a config file via a series of DWIM-my steps that are probably more complicated to explain than they are to understand.



This statement is very nearly redundant. DWIMmy things are always harder to explain than they are to understand; it’s the nature of DWIM. Because I’m a human, what I meant to do is always complex once you break it down. The human brain is naturally complex, and handles complexity at an almost invisible level. You don’t even notice it the vast majority of the time. Until you have to translate a mental process into a programming language, that is.

Last week I ever so briefly touched on the topic of smartmatching: just lightly brushed it with the barest fingertip. It’s a pretty broad topic, and it probably deserves a whole post to itself. 4 But this seems like as good a place as any to address one comment I received. 5 Here’s the relevant bit for our discussion today:

... you pretty much had to open up the full table of “left-hand-side X right-hand-side” behavioral combinations in the manual whenever you wanted to use the operator, to make sure you didn’t miss any edge cases for the kind of inputs you were dealing with.



I know the table of which he speaks. It’s in the perlsyn man page, and it surely is complex; I can’t argue with that. But have I consulted it every time I’ve used the operator? Nah. I skimmed it once, and I’ve glanced at it a couple of times since then. But I have used smartmatching constantly since it became available. Mostly in the context of given/when, which is where it really shines. I’ve also used when without given, which can be a pretty cool thing that just DWIMs. Also, when the Damian sent us the patch that implements where and when for Method::Signatures, he did it using smartmatching, and I use that all the time as well. I use smartmatching a lot, and nothing has ever broken, the world has never come to an end, and the only trouble I’ve ever had from it is what I mentioned last week: people not liking the fact that I used it because it’s now labeled “experimental.” Not because of any problem that it actually caused, in other words, but for fear of what it might do. I must defer to the words of m’colleague H. Granger on this point:

Fear of a name only increases fear of the thing itself.



So, as it happens, I’ve never hit a case where Perl’s concept of “the right thing” failed to match mine in the use of smartmatching. But I may. In fact, I’m pretty damn sure that eventually I will. So does that mean I’d better forgo the use of smartmatching now, to save myself the hassle? Because of the inevitable horror that awaits me?
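For readers who haven’t played with it, here’s a minimal sketch of smartmatch doing the right thing inside given/when (the names are made up for illustration). The “string on the left, array on the right” combination from that scary table just DWIMs to membership; on 5.18 and later you silence the experimental warning first.

```perl
use v5.14;                                        # enables given/when
no if $] >= 5.018, warnings => "experimental::smartmatch";

my @admins = qw(alice bob);
my $user   = 'alice';

my $role;
given ($user) {
    when (@admins)   { $role = 'admin' }   # string ~~ array: "is it in the list?"
    when (/^guest_/) { $role = 'guest' }   # when with a regex matches against $_
    default          { $role = 'user'  }
}
# $role is 'admin'
```

That’s the whole trick: I never told Perl *how* to compare a scalar to an array, and it trundled off and did the thing with the thing.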

Well, what about when $_ doesn’t act as I expect, due to its being occasionally global and occasionally local? 6 That one I’ve hit, more than once. And, let me tell you, it’s a real horrorshow to debug. How about the fact that sometimes curly braces mean a code block and sometimes they mean a hashref, and so Perl has to guess which one I meant, and sometimes it guesses wrong? Yep, hit that one too—once just recently, in fact. That one’s not quite as bad, but the error messages don’t give you much of a clue, that’s for sure. What about error messages from Moose? Have you ever tried to figure out just what you did wrong from that hideous stack trace with all the insane layers of MOPpiness going on in there and eventually just threw up your hands in frustration? Boy, I sure have.
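Both of those first two gotchas are easy to reproduce. A contrived sketch of each (the careless sub and the data are invented for illustration):

```perl
use strict;
use warnings;

# Gotcha 1: $_ is a global, so a careless sub can reach into your loop.
my @words = qw(foo bar);
sub careless { $_ = uc $_ }     # writes to the global $_ ...
careless() for @words;          # ... which here aliases each element
# @words is now ('FOO', 'BAR') -- modified at a distance

# Gotcha 2: bare braces -- block or hashref? Perl guesses.
# my %bad = map { "\L$_" => 1 } @words;     # Perl guesses hashref; syntax error
my %hash   = map { ; "\L$_" => 1 } @words;  # leading ';' forces a block
my @hashes = map { +{ $_ => 1 } } @words;   # leading '+' forces a hashref
```

The disambiguating `;` and `+` tricks are straight out of the map documentation, which is itself a small monument to how often that guess goes sideways.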

So I should stop using all these things too ... right?

Let me tell you a story. It may seem unrelated, 7 but bear with me.

In college, I was an erratic student. In some classes, I was completely absorbed, attending every class and listening avidly. In others, I barely showed up at all, fell asleep in class, and did as little work as possible. Some of the latter classes I came perilously close to flunking out of, but others I did quite well at in spite of my best efforts. Most of my computer science classes were in that boat. Remember, even during my first stint in college I’d already been programming as a hobby for 4 or 5 years; by the time I went back to finish up my degree, I’d actually been a professional coder for two years. So mostly I took CS classes for easy As. I had a never-fail strategy: I’d show up on the first day and collect the “syllabus” (if you don’t know what a syllabus is, it’s a handy one-page cheatsheet for people who want to come to class as seldom as possible). Now I knew all the relevant dates, so I could just show up on the day each program was assigned, get the specs, write the code, and show up again when the program was due. Make sure to attend for any tests, and bam: minimal effort, maximal grade.

My second Pascal course was taught by a nerdy sort of gentleman who talked a lot about fly fishing and wrote tons and tons of code on the board—essentially all you needed to do to get the homework programs correct was just copy down all the blackboard code, rearrange it a bit, and hand it back to him. Of course, that was way too boring for me. I didn’t need to have code spoon-fed to me: I could write my own damn code. The first program went entirely according to plan and I got an A on it. With the second program, though, I made a crucial error: I screwed up my dates. I not only missed getting the assignment, I somehow managed to miss turning it in as well. Considering that I followed this plan for nearly a dozen different classes throughout my college career, I suppose it was inevitable that I was going to muck it up at least once. Frantically I calculated a GPA: even if I got a perfect 100 on every other program and test, a 0 here was going to tank me. 8 Could I drop the course altogether? No, I’d missed the deadline. Except ... if you could convince your teacher to sign off on it, get some sort of special permission slip, then you could drop after the deadline. Fine then: I’d just go to the teacher, give him a sob story, and get him to sign the form.

I don’t recall exactly what I told him—some litany of the ever-popular, ever-vague “personal problems,” I suppose—but I’m pretty sure I didn’t really lie ... just embellished a bit. Okay, a lot. And, in the end, he bought it. I could see it in his eyes. But then he surprised me. “Son,” he said, “you don’t want to drop this course. I’ll give you a second chance on that program.” Then he went on to tell me how he’d love to take me out fly fishing sometime. I nodded encouragingly; there was no way in hell I was going to go fly fishing with anyone, much less this near-total stranger, but, hey: he was going to let me turn in the program late. I could pretend to be interested.

So I got the assignment, looked it over back at home, and cranked out the code in about an hour. It was a simple recursive algorithm: too simple for a second-semester CS course, really, but who was I to complain? I turned in my two dozen or so lines of code 9 and, while I was there, I just happened to mention that my solution involved recursion.

His face fell.

“Oh, no, son. You can’t use recursion.” Whyever not, I asked. “Recursion is complicated! It’s hard for people to understand. You try to write something like that for a business and they’re going to tell you to turn right around and write it over.” I refrained, at this juncture, from pointing out that I had already worked for businesses, and knew damn well that was extremely unlikely—he was doing me a favor, after all. I stared at the problem for a bit. This wasn’t one of those moronic examples of how to do recursion, like factorials, 10 where it would be just as simple—simpler, even—to use a loop instead. This was something like a binary tree traversal, where you really needed the recursion. Hesitantly, politely, I asked him exactly how it should be done then. “You need to look at the code I put up on the board, son. It’s all in there.”

I don’t recall exactly how I got said code. Found someone else in class and borrowed their notes, maybe. But I do recall my utter incomprehension upon first seeing it. I had to stare at it for what seemed like forever, but eventually I caught on. He had converted the recursion to a loop, of course, ’cause that’s how you do recursion without doing recursion. But that was insufficient. He had to create a whole separate data structure to keep track of where he was: he basically recreated the program stack, only in highly verbose, super-inefficient Pascal code. In the end, I needed a couple hundred lines of code to replace the couple dozen, all to recreate what the compiler would have done for me automatically. And why? I can only surmise that, somewhere along the line, this teacher had been bitten by some overly clever recursive algorithm and had vowed: never again!
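Here’s that trade-off in miniature—in Perl rather than Pascal, and with an invented tree-sum problem standing in for the real assignment. The recursive version lets the call stack do the bookkeeping; the “no recursion allowed” version recreates that stack by hand, exactly the kind of thing the teacher’s blackboard code was doing (at far greater length, in Pascal).

```perl
use strict;
use warnings;

# The recursive version: the call stack does the bookkeeping for free.
sub sum_tree {
    my ($node) = @_;
    return 0 unless $node;
    return $node->{value}
         + sum_tree($node->{left})
         + sum_tree($node->{right});
}

# The "no recursion allowed" version: recreate the stack by hand.
sub sum_tree_loop {
    my ($root) = @_;
    my $total  = 0;
    my @stack  = ($root);          # our hand-rolled program stack
    while (@stack) {
        my $node = pop @stack;
        next unless $node;
        $total += $node->{value};
        push @stack, $node->{left}, $node->{right};
    }
    return $total;
}

my $tree = {
    value => 1,
    left  => { value => 2 },
    right => { value => 3, left => { value => 4 } },
};
# both return 10
```

A toy tree keeps the iterative version short; with the state the real assignment needed on that stack, it balloons into the couple hundred lines I described—all to redo what the compiler does automatically.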

Some people feel the same way about Perl’s DWIMminess. I don’t. Next week we’ll see if we can figure out exactly why not.



