The water cooler runs dry
If you’re closing in on 50 but want to feel much, much older, teach a college course. I’m doing that now, at 49, and hardly a class goes by when I don’t make an allusion that prompts my students to stare at me as if I just dropped in from the Paleozoic era.
Last week I mentioned the movie “They Shoot Horses, Don’t They?” Only one of the 16 students had heard of it. I summarized its significance, riffing on the Depression, with which they were familiar, and Jane Fonda’s career, with which they weren’t. “Barbarella” went sailing over their heads. I didn’t dare test my luck with talk of leg warmers and Ted Turner.
I once brought up Vanessa Redgrave. Blank stares. Greta Garbo. Ditto. We were a few minutes into a discussion of an essay that repeatedly invoked Proust’s madeleine when I realized that almost none of the students understood what the madeleine signified or, for that matter, who this Proust fellow was.
And these are young women and men bright and diligent enough to have gained admission to Princeton University, which is where our disconnect is playing out.
The bulk of that disconnect, obviously, is generational. Seemingly all of my students know who Gwyneth Paltrow is. And with another decade or two of reading and living and being subjected to fossils like me, they’ll assemble a richer inventory of knowledge and trivia, not all of it present-day.
But the pronounced narrowness of the cultural terrain that they and I share — the precise limits of the overlap — suggests something additional at work. In a wired world with hundreds of television channels, countless byways in cyberspace and all sorts of technological advances that permit each of us to customize his or her diet of entertainment and information, are common points of reference dwindling? Has the personal niche supplanted the public square?
Both literally and figuratively, the water-cooler show is fading fast, a reality underscored by a fact that I stumbled across in last week’s edition of The New Yorker: In the mid-1970s, when the sitcom “All in the Family” was America’s top-rated television series, more than 50 million people would tune in to a given episode. That was in a country of about 215 million.
I checked on the No. 1 series for the 2012-13 television season. It was “NCIS,” an episode of which typically drew fewer than 22 million people, even counting those who watched a recording of it within a week of its broadcast. That’s out of nearly 318 million Americans now.
“NCIS” competes against an unprecedented bounty of original programming and more ways to see new and old shows than ever, what with cable networks, subscription services, YouTube, Apple TV and Aereo. Yahoo just announced that it was jumping into the fray and, like Netflix and Amazon, would develop its own shows.
In movies, there’s a bevy of boutique fare that never even opens in theaters but that you can order on demand at home. In music, streaming services and Internet and satellite radio stations showcase a dizzying array of songs and performers, few of whom attain widespread recognition. In books, self-publishing has contributed to a marked rise in the number of titles, but it doesn’t take an especially large crowd of readers for a book to become a best-seller. Everyone’s on a different page.
With so very much to choose from, a person can stick to one or two preferred microgenres and subsist entirely on them, while other people gorge on a completely different set of ingredients. You like “Housewives”? Savor them in multiple cities and accents. Food porn? Stuff yourself silly. Vampire fiction? The vein never runs dry.
I brought up this Balkanization of experience with Hendrik Hartog, the director of the American studies program at Princeton, and he noted that what’s happening in popular culture mirrors what has transpired at many elite universities, where survey courses in literature and history have given way to meditations on more focused themes.
“There’s enormous weight given to specialized knowledge,” he said. “It leaves an absence of connective tissue for students.” Not for nothing, he observed, does his Princeton colleague Daniel Rodgers, an emeritus professor of history, call this the “age of fracture.”
This fragmentation has enormous upsides, and may be for the best. No single, potentially alienating cultural dogma holds sway. A person can find an individual lens and language through which his or her world comes alive.
And because makers of commercial entertainment don’t have to chase an increasingly apocryphal mass audience, they can produce cultish gems, like “Girls” on HBO and “Louie” on FX.
But each fosters a separate dialect. Finding a collective vocabulary becomes harder. Although I’m tempted to tell my students that they make me feel like the 2,000-year-old man, I won’t. I might have to fill them in first on Mel Brooks.
Frank Bruni is a columnist for The New York Times.