When I was growing up in suburban New Jersey in the 1960s, my parents would announce periodically that we would be going out to dinner. When the announcement was made, the evening was imbued with a festive air. We dressed up — I have a recollection of patent leather shoes and crinolines. Eating out was an occasion; it happened rarely and felt like an extravagance.

I don’t think my family was unique in this. Most people of that era — unless they worked in advertising — rarely ate out.

No longer. Now, everyone eats out all the time.

I say “everyone,” knowing full well that there are people for whom eating out is, at best, a trip to McDonald’s, and others who continue to gather around the dinner table (à la Moonstruck) for a home-cooked meal. But most people who graduated from college in the past two decades — even those with mountains of college debt — eat out a great deal. Eating out has become something done routinely — a habit, if you will. The philosopher and psychologist William James defined a habit as something learned through repetitive action: we do it “with difficulty the first time, but soon do it more and more easily, and finally, with sufficient practice, do it semi-mechanically, or with hardly any consciousness at all.”

It may seem odd to think of eating out as a habit. It requires deliberation to decide where to go and what to eat. But the larger decision to consume food and drink in public space instead of in the privacy of the home has become reflexive — the default locale for meals and snacks. This has far-reaching consequences for our relationship to each other and society. To again invoke William James: “All our life, so far as it has definite form, is but a mass of habits — practical, emotional, and intellectual — systematically organized for our weal or woe, and bearing us irresistibly toward our destiny, whatever the latter may be.” Because eating out is now habitual among a fairly large segment of the populace, this seems to me to have implications for our communal destiny.

The first media depiction of the eating out habit can be traced to the sitcom Seinfeld, which debuted in 1989. Many of the show’s scenes were set in Jerry Seinfeld’s combined kitchen-living room, but no one in the Seinfeld universe ever appeared to cook or have a meal inside the home. The recurring site of Seinfeldian communal dining was Tom’s Restaurant on the Upper West Side and, more sporadically, the restaurants, yogurt places, and soup joints scattered throughout New York City.

The chronic eating out that Seinfeld represented was not just a tic of New York apartment-dwellers. New York City may be the epicenter of the habit, but eating out in its habitual form came to New York via the deluge of young people who arrived in the city following their graduation from college. College is where the habit was formed — where it came to be practiced, again to quote James: “semi-mechanically, or with hardly any consciousness at all.”

Let’s examine how this came about.

In the 1970s, colleges began to shift away from a prescribed curriculum. Core and even generalized distributive requirements mostly disappeared. By the 1980s, college began to be marketed as a place you went to realize your potential, not just through coursework but through a wide array of “experiences” (elaborate gym facilities, travel options to exotic locales, self-designed majors, unusual internships, etc.). Choosing the right college became an existential quest, and the price of college, as if to accommodate this more august goal, rose accordingly.

One result of the new college experience was to give food a prominent role. Accommodating food preferences became part of servicing the high-paying student clientele. And continuous and varied nourishment contributed to the enhanced lifestyle that students (and their parents) expected from college.

In the beginning, the focus on food was confined to the student cafeteria, which began to offer more interesting and healthful options. But soon, food offerings on the college campus spilled beyond this singular site. An interesting parallel can be drawn here to the rise of Starbucks, which opened its first store in Seattle in 1971 and began to expand in the 1980s. By the ‘90s, it had not only developed its menu beyond coffee but had become ubiquitous in cities throughout the country, with outposts at most colleges.

Starbucks paved the way. Walk around any college campus today and you’ll see numerous and varied eateries, from coffee stores (not only Starbucks, but Saxby’s and Joe’s) to burger joints, vegan restaurants, pizza parlors, sushi bars, and cupcake shops.

The schedule for eating also changed as food gained importance within the university environment. What used to be consumed at prescribed intervals during the day now began to fill in the spaces between. Again, Starbucks took the lead in teaching students to be constantly sipping and snacking. Nowadays, you rarely see a student in the library without a smoothie alongside the computer, or walking to class without a coffee cup in hand — and not a “tall” (aka small) coffee cup but a “grande,” so as to last the length of the class period.

Water has also become involved in the eating-out habit. Twenty years ago, buying water was something of an affectation. Now, it is a de rigueur purchase for most people of a certain age, who see it as the safest and healthiest way to gratify a passing thirst. “Even if you don’t let your mouth touch the water fountain,” one of my students explained, “there’s still that filthy internal mechanism to worry about.” (Don’t ask what those plastic water bottles are doing to the environment. That’s the paradox of a complex system: virtue in one domain can mean vice in another.)

Buying water and other drinks serves the additional purpose of filling the dead periods of the day, which students can also use to call their parents (these calls tend to conclude abruptly with: “I’m here. Gotta go”).

As fraternizing over food and drink began to take precedence over studying in dank library carrels, college became a hard place to leave. And so, unsurprisingly, graduates tended to prolong it. Eating out continued after employment at Goldman and Google, as well as into unpaid internships and self-financed entrepreneurial schemes. Either way, it had ceased to be a luxury (a habit, by definition, feels necessary in being entrenched in its practitioner’s daily life).

If you compare the ‘90s show Seinfeld to the show Girls, currently on HBO, you can see how the relationship of college to the post-college lifestyle has become more entrenched and explicit. The characters in Girls are Oberlin grads (we never learn where the Seinfeld gang went to school). Built into the Girls’ storylines is the understanding that their life in New York City is an extension of their life in college, where studying was mostly about “creative” writing and talk. (After all, hanging out at Starbucks to scribble and talk is a cheaper alternative to pursuing an MFA.)

I was recently in Soho in Manhattan, a center of millennial post-college life, and was struck by the degree to which the eating out habit dominates that hipster scene. The coffee shops, bakeries, restaurants, and wine bars (when did wine start having its own bars?) were packed. It was 2 p.m. on a Friday (too late for lunch; too early, I would think, for escaping work for the weekend), yet everyone was out in force, eating and drinking. I tried to get a seat in La Colombe, the high-end coffee purveyor, but no luck. I then waited on line at Magnum, the specialty ice cream store where you create your own ice cream bar (there’s a guide for how to keep this under 350 calories), but the line was moving at a snail’s pace. One of the byproducts of much contemporary food and drink purchase is that it’s labor-intensive (just steaming the milk for a latte, especially if you want a fancy foam leaf on top, takes time). Yet people seem OK with waiting; in fact, it seems they like to wait.

Wandering around in the 90-degree Soho heat, I could not find a spot that didn’t have a line or an exorbitant cover charge. One restaurant wanted to send me to the basement, where a gaggle of youth were sitting on plastic chairs in a dim space eating guacamole and chips. Another very chic ice cream shop had high tables without chairs, where everyone seemed content to be standing while eating minuscule portions of expensive gelato. I finally sighted a Mister Softee truck and ate my chocolate and vanilla swirl with rainbow sprinkles sitting on the ledge of a building plaza next to a homeless person and a couple from Indiana.

One of the ironic aspects of the eating out habit is the way in which it takes the thrill out of the behavior it generates. This is the nature of a habit — it is repetitive and dulling. The excited anticipation I had as a child at the prospect of going out to dinner doesn’t happen anymore. To make eating out special one must seek new locales and cuisine (hence the foodie drive to be first at the latest bistro and find the best mushroom risotto).

Another byproduct of the eating out habit is that it obliterates the distinction between the lasting and the fleeting. Recently, I was browsing the shops on Pine Street in Philadelphia and happened into one where the owner, a weary-looking older woman reading a book at the counter, seemed surprised to see me enter. After I purchased a pair of candlesticks, we chatted a bit about the neighborhood. “This used to be one of the city’s best shopping streets,” she said. “But now, nobody buys things. All the young people do is buy food.”

This raises the interesting question of how much money is spent on food and drink over the course of, say, a year that might instead be directed toward purchasing a bed or a washing machine, or toward making the down payment on a house. I am mystified by the failure to weigh the difference between an item that has lasting value and one that is consumed. I know people who would never purchase a $1,000 painting (far too extravagant) but would put down hundreds of dollars for a single meal at a fancy restaurant, where the drinks that ramp the bill up to astronomical heights could be imbibed at home, a few blocks away, for a fraction of the price.

But let me stop. I have begun to sound like a scold. This is unfair for two reasons. First, I partake of the habit myself, and do not think that eating out a lot is necessarily a bad thing (certainly, it is good for restaurants). I don’t think buying lattes, bottled water, and bento boxes qualifies as gluttony or flagrant irresponsibility. It’s true that millennials might make more of a dent in their college loans if they ate out less, but, then again, they learned to eat out in college.

Second, and most important, there is a lot to be said for the culture that arises around eating out. I’ve made it seem like a bad habit, but in some ways, it is a good one (again to quote William James: “our virtues are habits as much as our vices”). I know, as an educator, that food is an incentive to intellectual discourse. If the university is going to recreate itself as a new agora, it needs to have restaurants and coffee shops to draw young people in, where they can continue their discourse after class and carry it on after graduation. Some of my most interesting conversations have occurred over meals at Zavino (the high-end pizzeria that abuts my campus building), and some of my most worthwhile projects have been hatched over Starbucks and Joe’s, both of which are a stone’s throw away. My university is a much livelier place as a result of the multiplicity of eateries, and I am proud to live in a part of the city near what has been termed “restaurant row.” I often join the people in the coffee shop across from my apartment for a latte, even though I could have coffee in my own kitchen and wouldn’t have to wait for it, let alone pay $3.50.

Buying food that we could more cheaply and efficiently prepare at home may seem frivolous, but, to quote Shakespeare’s King Lear: “Reason not the need.” •