In the groove

National Review, April 17 edition


As I’ve slipped into the heart of middle age, I’ve noticed myself becoming irritatingly susceptible to bouts of nostalgia. The latest episode occurred when, rummaging through some boxes in the basement, I came across the tattered remnants of my once-great record collection.

In the romanticized version of my life, the first record I ever purchased was one that reflected admirably on the sensibilities of a self-styled twelve-year-old tastemaker—something, say, in the vein of the Ramones or the Velvet Underground. Deep down, of course, I’m pretty sure what I bought was inexcusably synth-y, saccharine, bouncy, and very popular.

Now, in fairness, I would go about making up for this initial populist misstep by meticulously assembling hundreds of hard-to-find vinyl records, cassettes, and even, for still-inexplicable reasons, a few eight-track tapes (though I owned no means of listening to them). Finding a great record was always my favorite high.

Snobbish as this sort of thing was, my collection was a central part of my identity as a kid. Much of my allowance, and the preponderance of my teenage income, would be plowed into this endeavor, often on “indie,” college rock, and “imports.” The last were records that national chains like Sam Goody’s would sheath in thicker plastic jackets to create the perception of scarcity. They would then charge suckers like me five extra dollars for music that was likely pressed in the same Taiwanese factory as Madonna’s latest record.


Nevertheless, I eagerly surrendered to the gouging. In some ways I even appreciated it. After all, owning what were ostensibly rare records demonstrated my cultural sophistication—or pretentiousness, depending on how one looked at it. And, admittedly, the only person looking at any of it was me.

Whereas I was intractably lazy as a teenager in almost every consequential way, I never took even the smallest shortcut when it came to my collection. For example, no singles allowed.

If an artist could muster only one decent song at a time, he might become famous and stupendously rich, but he wasn’t worthy of a place in my milk crates. Nor did I ever slide a “greatest hits” album next to a disc of integrity. Albums, like books, were meant to be cohesive and unique.

Now all that’s left of my efforts can be housed in a single box. Around 20 records, preserved with their scratches, nicks, and frayed corners—my collection was intended to be played, not saved. Funny thing is that I don’t recall ever giving or throwing away a single one of them. Yet the collection has been distilled over two decades of adulthood into 20 of the most important records.

These days, of course, people can listen to virtually anything they desire on any device they want wherever they are and whenever they please. This access to artists, to which I have completely submitted, is astonishing. So I’m certainly not a technophobe. But it doesn’t feel quite like we own music anymore; we’re merely given temporary access to it. I realize it’s irrational and archaic to think this way, but I would never consider myself an “owner” of a book that sits on a device in digital form. A genuine collection is cherished precisely because it’s perishable.

Modern vinyl fans talk about the warmth and the crackling quality of a record. It’s true. There’s that. They tend to emphasize the artistry of the covers. Also true. Records can be beautiful to look at. Some fans talk about how the ritual of extracting them and caring for them is satisfying in ways that opening a “streaming service”—such antiseptic words—is not. True.


My sentimentality is driven by all these things. The ritual, but also the hunt. There was no YouTube. No Spotify. No sampling of songs. With only a few dollars to his name, a curious teen would often make a decision after hearing only one song on the radio, or perhaps none at all. Perhaps he had read a glowing review in a rock magazine or caught a snippet of sound on the fringes of the dial. Perhaps he trusted a label because of its immaculate history. He might have seen a band live and been impressed. Or, often, aesthetic clues on the back of the record were too tempting to pass up. I spent many hours of my life holding two records in my hands, weighing which was more deserving of my seven or eight dollars.

Occasionally your record would forcefully exert itself at the drop of the needle, creating immediate euphoria. However, records that didn’t make an immediate impact would still be given full consideration. You sit in one place. You listen to one song and one side at a time. Today it takes little effort to click past a song or an artist that doesn’t offer instantaneous musical gratification. Some of the best music needs time to boil. Procedurally speaking, a record demands your attention.

Of course it’d be nice to relive those days. Mostly, I’ve been resistant to the vinyl revival because it reeks of Millennial retro trendiness. Turns out that buying an old record player is now an expensive proposition. I’ll probably give it a shot anyway. The truth is it will likely be a fun hobby, but it’ll never be the same. Not merely because I’ve gotten older and less patient, but because the music-listening experience has been forever altered by technology. There are approximately 8 trillion songs on my phone. And that’s fine. Every generation gets to enjoy its own small pleasures. The record collection just happened to be one of ours.







Aspirin Eater


National Review, June 2015

Every morning I wake up and shake the small white plastic bottles scattered across my home office until one makes a familiar rattling sound. I open it and pop two Excedrin pills whether I have a headache or not—though most mornings I do. The process is repeated throughout the day, almost every day, until I get ready for bed. That’s when I rummage through my bottles once again and dig out the nighttime headache medicine (usually something caffeine-free, like Advil or Tylenol) and swallow two more pills.

This process is a means of prevention. But no matter what I do, every few weeks I will be subjected to another debilitating migraine. It might be triggered by the weather—especially cloudy and rainy days. Or it might be activated by a lack of sleep. Or it could be I’ve stared at a computer screen or binge-watched TV or talked on the phone for too long. Maybe I was sitting in rush-hour traffic or perhaps I failed to hydrate properly. It’s possible that I haven’t been eating the right foods or, even more likely, that I haven’t eaten enough. Whatever the case, wherever I am, another migraine is coming. I’ve given up trying to figure out why.

When I was younger, I assumed the attacks were attributable to cigarette smoking. So I quit. Later, I wondered if perhaps my irregular sleeping habits might be the cause, so I went to an apnea specialist. He told me to lose a few pounds. I did. I’ve tried natural remedies, though I was certain they wouldn’t work. They didn’t. One doctor even suggested that I cut my computer time in half and stop reading so much—which would have necessitated finding another career. I got another doctor. Only unpleasant practices such as exercising, eating healthy, and drinking less alcohol have proven to be even somewhat beneficial.

Gobbling down painkillers at this rate has become perfunctory, and it’s probably toxic for me in the long run. My habit already causes self-inflicted “medication-overuse headaches”—or rebound headaches—which occur when a person ingests too many analgesics. I have rebound headaches daily. Yet I continue taking white and blue pills, which also attack my stomach and do God-knows-what to my liver (though my blood is probably a lot thinner than yours), because few things scare me more than having to miss work and my family for a day or two because of a migraine.

I’m sure a doctor would prescribe something more potent, if I asked. But knowing how I feel about migraines, I’d probably abuse those drugs, as well. So I avoid the temptation altogether.

A few weeks ago, I ran across an ad campaign produced by Excedrin featuring the slogan: “A migraine is more than a bad headache. If you’ve never had one, you can’t understand. Until now.” The company contends that, through the magic of virtual reality, it can offer family members and friends a taste of what migraine sufferers experience. Each ad ends with an I-told-you-so moment:

“I’m sorry I ever doubted you”—See one man’s journey from migraine doubter to believer.

Or: “See? You believe me now”—Tiffany missed Michaela’s birthday because of a migraine. Now her friend can see why.

Are there really migraine deniers? Can people empathize only when they have firsthand familiarity with your pain? Maybe. Empathy is the ability not only to perceive what others feel but also to experience their emotion in some way. But then we don’t need to have a bone sticking out of our leg to understand that compound fractures can be disagreeable. Should I care?

Migraine symptoms include pain, nausea, vomiting, and sensitivity to light, sound, and smell—basically all the faculties that allow us to be sentient human beings are hampered. It is impossible to write or read or think or even tweet. Though it isn’t acute in the way most physical pain can be, it can be incapacitating.

Who would inflict this on his family or friends? Frankly, any machine that could re-create the experience—and color me skeptical—should be weaponized. Honest Excedrin advertising language would probably go something like:

“Take that, you jerk”—Bill was doubting David’s pain, so David strapped him into a migraine-inducing virtual-reality contraption against his will and laughed and laughed and laughed . . .

At the risk of sounding saccharine, or like a middle-aged man contemplating his mortality or grousing about his increasingly brittle body, I’d say that migraines have taught me some valuable lessons. About empathy, pain, and perspective.

I don’t know if there is any dignity in suffering, but there was a time when my headaches only depressed me. Not anymore. Now I reflect that most people experience some form of slow-boil misery in their lives—either physically or mentally, often far worse than mine. For instance, I recently started paying attention to the never-ending succession of pharmaceutical ads on TV. You know the ones: bright, clean, well produced, with distinguished gray-haired couples, D-list celebrities, and retired sports heroes imploring viewers to ask their doctor about this new drug. These people are starting to resemble me. But they have lung cancer or Hepatitis C or unbearable joint pain or chronic muscle pain or diabetes or gruesome rashes or bouts of incapacitating depression or heart disease or massive allergic attacks—not to mention an impressive array of other ailments I’ve yet to look up on WebMD for fear of finding out that I have them. And all of a sudden I feel sorta lucky. As I zoom toward 50, I’m kinda glad all I have are migraines—pain and all.


Jerk logic

National Review, Nov. 21, 2016 issue

Am I a jerk? You may find this an odd question for a person to ask himself. But when you’re in my line of work—which, broadly speaking, is called punditry—complete strangers on social media have little compunction about pointing out all your disagreeable character traits.

Since these drooling halfwits have been impugning my magnanimous disposition for years, I finally decided to investigate the matter. Self-examination, as Socrates might have said, is the hallmark of an enlightened man.

The first step is defining your terms. A “jerk,” according to the dictionary, is a contemptibly obnoxious person. But, as anyone smart enough to write for a political magazine can tell you, the only way to properly evaluate any moral failing is to turn to the social sciences.

As luck would have it, the scientific periodical Nautilus recently featured a deep dive on the topic of jerks. Written by Eric Schwitzgebel, a professor of philosophy at the University of California, it informs us that jerky characteristics are driven by broader psychological groupings such as “narcissism, Machiavellianism, and psychopathic personality”—though pinpointing the exact parameters is both complex and relativistic. And isn’t that always the case?

Schwitzgebel also contends that there’s likely no correlation between a person’s self-opinion about his obnoxiousness and his actual “jerkitude.” In fact, if you believe everyone around you is a terrible person, “the joke may be on you.”

The 2016 election, I’m afraid, has convinced me that the joke is definitely on me. But after taking meticulous inventory of my actions over the past year or so, I am forced to acknowledge that perhaps, on occasion, some of my behavior might be construed as wantonly unpleasant. Long story short, I am a jerk . . . with an explanation.

Most humans are multifaceted beings with an array of personality traits that can be triggered by various environmental factors. In everyday life, I’m sure, most of us succumb to obnoxiousness on occasion.

Well, that’s not my problem. I face another dilemma: People just don’t get me.

It begins with my New York upbringing, which has endowed me with an endearing coarseness that many of you provincials find grating for some reason. When you add to that a dry sense of humor and a standoffish personality, my attitude can sometimes be misinterpreted as being holier-than-thou—which, I assure you, is true maybe only 80 percent of the time.

Not long ago, I took one of those “freakishly accurate” Myers-Briggs Type Indicator tests to help discern how anyone could find me off-putting. As it turns out, I belong to a subset of humans called the “Logicians”—which, let’s face it, already sounds pretty insufferable. Others who fall under this category are the noted philosopher and mathematician René Descartes and the fresh-faced actress Ellen Page. Individual personality traits are measured on a spectrum. I am, for instance: Introverted, 70 percent; Intuitive, 60 percent; Thinking, 60 percent; Prospecting, 54 percent; Assertive, 62 percent.

Role: Analyst. Strategy: Confident Individualism.

As I learned more about my personality type, I began feeling sorry for everyone in my almost certainly beleaguered family. While we pride ourselves on “inventiveness and creativity” and “unique perspective and vigorous intellect,” Logicians can also be “insensitive,” “absent-minded,” and “condescending.”

A Logician sounds like the sort of guy who would be pondering the Julio-Claudian dynastic succession or the latest episode of Westworld while nodding and looking straight into his wife’s eyes as she earnestly asked him questions about the family’s upcoming Thanksgiving plans.

This set-up works fine until we are impelled to speak. Family members assure me that I inadvertently insult our friends and neighbors quite often when my mouth does open. So advice to fellow Logicians: Never mock the immaturity of middle-aged men who get tattoos before you’ve seen everyone at the barbecue shirtless. And never belittle ostentatious baby names floated by guests until you’re absolutely certain no one at dinner is pregnant.

Since my emotional IQ might be 10, a friend of mine helpfully suggested that I begin affixing smiley faces and exclamation points to my correspondence as a way to telegraph good intentions. So these days, I write things like, “Boy, you really embarrassed yourself on TV the other day! 🙂 🙂 🙂” because I’m trying to be more cognizant of people’s feelings.

So how am I a jerk? Well, most of us live two existences. We can broadly divide our time into the professional and the home life.

As a writer, it’s incumbent on me to be clinically unpleasant and prickly when focusing on self-aggrandizing do-gooders or abusers of power or those who pollute our culture with garbage. One can make arguments in good faith while still being downright disagreeable. So I make no apologies for being disliked. There’s nothing wrong with being hated by the right people.

There are, in fact, far too many journalists overly concerned about being shunned. As a young critic writing his first reviews for a wire agency, I sometimes wrestled with an existential question: “Who am I to say these horrible things about people who are far more successful and powerful than I am?” Nowadays I ask myself: “How exactly can I say more horrible things about these people who shouldn’t be more successful or powerful than any of us?”

A skeptical and contrarian disposition is not only useful if you want to be a decent pundit, but indispensable if you want to be a good journalist on any beat. Does that mean I should be weaponizing Twitter as a means of hurling gratuitous insults at civilians in 140-character projectiles? No, that’s not an appropriate way to displace your anger. And I realize that now.




Read this *!($@’n column

Dec. 12, 2008, Denver Post

Some experts claim that the English language contains nearly a million words — approximately 30 of them classified as curses. In Ephesians 4:29, it clearly states: “Do not let any unwholesome talk come out of your mouths.”

Still, every now and then, filthy language can come in handy.

My first foray into journalism was under the tutelage of a legendary sportscaster whose genteel face and grandfatherly advice brought him much adulation. So I was floored when, on my first day at the office, the man unleashed a cluster bomb of expletives that would have sent Samuel L. Jackson recoiling in horror.

Nevertheless, he had made a point — and I never forgot it. Since then, I have remained an enthusiastic proponent of (selective) explicit language. Sometimes there are no gracious words to convey your emotions properly. And, like most of you, I’ve heard expletives my entire life. It’s really not all that scandalous.

Then again, we may also agree that in certain places — like San Quentin and Illinois, for example — profanity is over-utilized as verbs, adjectives and articles, stultifying the impact of otherwise outstanding cuss words.

When Illinois Gov. Rod Blagojevich is heard on federal wiretaps advocating the firing of Chicago Tribune editorial writers, he at one point says, “Our recommendation is fire all those f@&;*$^% people, get ’em the f$%* out of there and get us some editorial support.”

In this case, the profanity punctuates the seriousness of Blago’s desire to dismiss antagonistic members of the Chicago press. Assertive. Effective. I get it.

After that, I’m afraid, things get out of hand.

“You’re telling me that I have to ‘suck it up’ for two years and do nothing and give this mother*&%$r [the president-elect] his senator. F*$& him. For nothing? F$#& him . . . before I just give f%&#*@$ [Senate Candidate 1] a f$&*@^ Senate seat and I don’t get anything.”

You see, here, gratuitous use of the f-word — in all its incarnations — has transformed a perfectly respectable attempt at bribery into an unintelligible tirade which, overall, exposes a man on the brink of a Joe Pesci moment.

The extraordinary aspect of this is that I am subjected to this kind of language only when watching Quentin Tarantino films and reading FBI transcripts of elected officials. What’s surprising as well is that this lingo, as crass as it is, elicited very few complaints from the general public.

When the infamous tapes of Richard Nixon were first released to the public, in addition to hearing White House scheming, Americans were faced with the reality that presidents dish out profanity like football players. It was shocking.

Nixon, in fact, apologized for using naughty words, saying that while he had heard “other presidents use very earthy language in the Oval Office” (biographers claim that John Kennedy and Lyndon Johnson were the masters) he also “had the bad judgment to have it on tape.”

An Associated Press-Ipsos study found that 74 percent of Americans “frequently” or “occasionally” hear people cursing in public and believe the use of profanity is on the rise in the nation.

Maybe profanity is now on the surface of society rather than on the rise. Maybe the mythical America of linguistic purity is fading forever. Maybe it never existed in the first place. What does it matter? It’s not the end of the world.

The FCC recently brought a case before the Supreme Court that tackled the issue of obscene words on radio and television broadcasts during daytime and early evening hours. At the time, Justice Antonin Scalia joked that “Bawdy jokes are OK, if they are really good.”

This comment upset some culture warriors, but the point may be more insightful than it seems. Curses can be funny. And adults can handle dirty words. Adults, for the most part, understand when curses are appropriate and when they aren’t. There is a right time and place for everything.

The wrong place? A federal wiretap, for instance. Or in front of your innocent 5-year-old daughter — who then proceeds to build a song around the f-word to perform for your wife.

Hey, @$&% happens.


The Amateurs’ Hour

Reason magazine — January 2008

Andrew Keen’s website claims, without a hint of humility, that he’s “the leading contemporary critic of the Internet.” No kidding? The entire Internet? A curious reader might wonder whether such an all-inclusive battle is similar to taking on, say, “music” or “radio waves.” It is.

More specifically, Keen’s depressing book, The Cult of the Amateur: How Today’s Internet Is Killing Our Culture, laments techno-utopianism, free content, and the rise of citizen journalists, filmmakers, musicians, and critics as cultural arbiters. It is a book, in other words, of spectacular elitism.

Keen, a Silicon Valley entrepreneur turned full-time critic of user-generated Internet content, argues that our most “valued cultural institutions” are under attack from the hordes of lay hacks, undermining quality content with garbage. His central argument is—to pinch a word he loves to use—seductive. He’s right that the Internet is littered with inane, vulgar, dimwitted, unedited, and unreadable content, much of it fueling outrageous conspiracy theories, odious partisan debates, mindless celebrity worship, and worse. And then there’s the stuff that’s not even entertaining.

Keen refuses to confess that there’s even a smattering of intellectually and culturally worthy user-driven content online. If you do find something decent in the “digital forest of mediocrity,” he attributes it to the infinite monkey theorem: Even simians, if permitted to indiscriminately hit a keyboard for an infinite amount of time, will one day bang out Beowulf or Don Quixote. (Silly me, I was under the impression that monkeys had hatched the idea for VH1’s Scott Baio Is 45…and Single.) Apparently, these monkeys are discharging so much free content into the cyber-strata that they threaten to bury culturally significant work, dilute good craftsmanship, and cost me, a journalist and “cultural gatekeeper,” my job. So I guess I’d better take Keen’s thesis seriously.

Keen isn’t entirely wrong—of the estimated 175,000 new blogs created each day, just a minuscule fraction are worthwhile—but in the midst of cobbling together statistics and disaster stories he ignores an otherwise promising tale of job creation, mass creativity, and the democratization of the media. He also fails to acknowledge that the rise of Web 2.0—Internet-based media, such as blogs, in which the content is largely generated by the users themselves—was prompted precisely by the lack of choices and quality programming from those gatekeepers he so adamantly defends.

Not long ago, I was presented with a firsthand view of the gloomy fallout from Web 2.0. Another downsizing had fallen upon the newspaper industry, including my paper, The Denver Post. Colleagues and friends of mine were instructed to clear their desks and find a new line of work. Keen grieves over the fate of my well-trained coworkers. He pins the blame on a bunch of schmucks knocking out third-rate musings on politics and culture. How can The New York Times, with its multi-million-dollar operational budget, compete with a blogger, who typically operates for pennies in his or her spare time?

We can agree, to a point. There are plenty of schmucks out there. But the ability to receive only the content you want while ignoring the rest of the package, combined with the migration of ads to services like Craigslist, has done far more damage to newspapers than any pajama-clad scribblers ever could. And since the citizen journalist relies heavily on more traditional journalistic sources, I doubt the industry is nearing its demise. (In fact, by acting as freelance fact-checkers, all those bloggers have arguably transformed the medium into a more reliable dispenser of the news.)

In the face of economic realities, newspapers have been co-opting the blogger model—transforming a once-rigid daily newsroom cycle into a constant, 24-hour process, constantly posting updates, using video and audio as well as text, and bringing on bloggers of their own. Meanwhile, many high-profile bloggers, looking for ways to make their sites financially viable, are moving toward an old-media model, emphasizing professionalism and co-opting some of the conventional elements of news services. From the megapopular left-leaning Huffington Post to the conservative-oriented Pajamas Media, bloggers have pooled their talents and transformed into news agencies.

Whatever Keen (or I) may believe the future holds, it’s not society’s job to ensure that journalism remains profitable. It’s journalism’s job to entice readers and viewers with a product that’s worth the price of admission. These struggles, as important as they may be to some of us, do not signal the cold-blooded murder of “our culture.”

That brings us to Keen’s most glaring weakness: his lack of faith in the culture he defends. Keen is concerned not just with journalism but with a wider range of creative expression, from film to music. Readers of The Cult of the Amateur may be surprised to learn that the barbarians capable of obliterating thousands of years of Western culture in their spare time are a horde of porn-addicted, gambling-happy, ungrateful, musically challenged yokels. What worthwhile culture could be so easily knocked off its perch?

Like most snobs, Keen doesn’t have much confidence in markets either. To accept his argument, we must believe that the common consumer, able to make thousands of informed decisions in everyday life, can’t differentiate between crap and Cristal when the choice is made on a computer screen.

In other contexts, Keen is a romantic. Consider his rhetoric regarding the supposedly bygone local bookstore. (A quick search of a site sponsored by independently owned bookstores shows five such stores within a 10-mile radius of my home.) “Instead of 2,500 independent bookstores with their knowledgeable, book-loving staffers, specialty sections, and relationships with local writers,” Keen writes, “we now have an oligarchy of online megastores employing soulless algorithms that use our previous purchases and the purchases of others to tell us what to buy.”

Shopping at the convivial local bookstore might be a heartwarming experience, but the notion that such places offer us better choices is a fantasy. On Amazon, you can perform super-exact searches or browse endlessly (so at some point even the commoner may stumble across something worthwhile). You are guided not only by rough algorithms but by book lists and reviews written and compiled by other human beings who share your hyper-specific interests. And aren’t Amazon’s reviewers, list compilers, and bloggers a lot like helpful, educated bookstore staffers, leading us, by hyperlinking, to stories and ideas we otherwise might never have known about?

But Keen’s most persistent grievance is that free content undermines the accuracy of information. “Can a social worker in Des Moines really be considered credible in arguing with a trained physicist over string theory?” he asks, referring to Wikipedia, the online, user-created encyclopedia. “Can a car mechanic have as knowledgeable a ‘POV’ as that of a trained geneticist on the nature of hereditary diseases? Can we trust a religious fundamentalist to know more about the origins of mankind than a PhD in evolutionary biology?”

Well, yes and no. I, of course, have the prerogative to trust whomever I want. In the same way I once gathered my news from the National Enquirer and listened to Art Bell’s late-night radio broadcasts for clues to my place in the universe, today I can ferret out similarly useless information webwide.

The more significant point, one that Keen ignores, is that the Web 2.0 explosion has provided me with something I’ve never had before: access to ongoing discussions between and among trained physicists, trained geneticists, and religious fundamentalists. Laymen as well as experts are now invited to sit in on these conversations. On occasion, the amateurs get it right, triggering dramatic results. Matt Drudge can announce the Monica Lewinsky scandal while Newsweek dithers about publishing it. Or a blog like Little Green Footballs can help catch Dan Rather peddling forged documents about the president’s service record. Rather than undermining information, this new access has expanded users’ understanding of the world.

Keen raises the stakes of his argument when he blames some of society’s serious ills on the Internet. He asserts, for instance, that the “tasteless nature” of social networking sites such as MySpace and Facebook has “infested” Web 2.0 with “anonymous sexual predators and pedophiles.” No doubt a small fraction of those who participate in social networks are sexual predators and pedophiles—roughly the same as the percentage of people in local bookstores, playgrounds, and libraries who are sexual predators and pedophiles. Yet I don’t think I’ve ever heard an advocate for children’s rights blame libraries and playgrounds for sexual abuse.

Despite a heavy load of scaremongering, Keen claims he’s not a “techno-moralist” but a “techno-scold”—as if there’s much of a difference. The problem, he maintains, is that those involved in Web 2.0 live in an echo chamber. “There isn’t a debate, and there isn’t a conversation,” he says. “They’re just listening to themselves.” If only mainstream media outlets had debated their future as often and as intensely as bloggers debate theirs, we might not have needed Keen’s book.



To Tweet, Or Not to Tweet

June 2010, Wall Street Journal

A catastrophic event unfolds. A seemingly healthy professional embarks on his daily commute, only to come to the frightening realization that his battered and beloved BlackBerry lies vulnerable and unused in a distant corner of his home. An unwholesome panic descends. No matter how far away from home he is, and no matter how needless the device may be in a practical sense, he is impelled to hightail it back to his house and reconnect with the world.

William Powers offers this beleaguered man (me), and everyone else who has faced a similar ordeal, a roadmap to contentment in “Hamlet’s BlackBerry,” a rewarding guide to finding a “quiet” and “spacious” place “where the mind can wander free.”

In the book, which grew out of the author’s much-discussed 2006 National Journal essay, “Hamlet’s BlackBerry: Why Paper Is Eternal” (and how I wish that were true), the former Washington Post staff writer argues that the distractions of manic connectivity often lead to a lack of productivity and, if allowed to permeate too deeply, to an assault on the beauty and meaning of everyday life.

Obviously this is not a unique grievance, or a fresh one: As Mr. Powers acknowledges, concerns about the deleterious effects of a new world supplanting the old go back to Plato. But there has been an awful lot of grousing about digital distraction lately—Nicholas Carr’s “The Shallows: What the Internet Is Doing to Our Brains” came out just a few weeks ago—and it is easy to feel skeptical of worrywarts agonizing about Americans “wrestling” with too many choices and “coping” with the effects of too much Internet use.

There is simply too much good that comes of innovation for that sort of Luddite hand-wringing. The farmer a century ago who pulled himself off the straw mattress at 4 a.m. to till the earth so his family wouldn’t starve led a fairly straightforward, undistracted existence, but he was almost certainly miserable most of the time. And he probably regarded the arrival of radio as a sort of miracle. In discussions of this type I tend to rely on the wisdom of P.J. O’Rourke: “Civilization is an enormous improvement on the lack thereof.”

But even a jaded reader is likely to be won over by “Hamlet’s BlackBerry.” It convincingly argues that we’ve ceded too much of our existence to what Mr. Powers calls Digital Maximalism. Less scold and more philosopher, Mr. Powers certainly bemoans the spread of technology in our lives, but he also offers a compelling discussion of our dependence on contraptions and of the ways in which we might free ourselves from them. I buy it. I need quiet time.

To accept “Hamlet’s BlackBerry” is to accept that we are super busy. “It’s staggering,” writes Mr. Powers, “how many balls we keep in the air each day and how few we drop. We’re so busy, sometimes it seems as though busyness itself is the point.” Though I don’t find all that ball-juggling as staggering as the author does, and I don’t know anyone who acts as if chaos is the point of it all, it would be foolish not to concede that our lives have become far more complex than ever before.

What can be done? What should be done? Mr. Powers’s answer is, in essence: Just say no. Try to cultivate a quieter or at least more focused life. The most persuasive and entertaining parts of “Hamlet’s BlackBerry” are found in Mr. Powers’s efforts to practice what he preaches. (Most of us, it should be noted, do not have the option of moving from a dense Washington, D.C., suburb to an idyllic Cape Cod town to grapple with the demons of gadgetry addiction.) His skeptical wife and kids agree that if they’re allowed to use their laptops during the week, they will turn the computers off on the weekend. Mr. Powers discovers that friends and relatives quickly adapt to the family’s digital disconnect (they call it the “Internet Sabbath”). The family spends more time face-to-face instead of Facebooking.

Mr. Powers proposes that we take into account the “need to connect outward, as well as the opposite need for time and space apart.” It is a powerful desire, the balanced life. Most of us yearn for it. Neither technology nor connectivity is injurious unless we allow them to consume us. Mr. Powers argues that letting life turn into a blizzard of snapshots—that’s what all those screenviews amount to, after all—isn’t enough. We would be happier freeing ourselves for genuine, unfiltered experience and then reflecting on it, not tweeting about it. The busy person will pause here to nod in sympathy.

I’m not sure that many of us have found that spacious place where our minds can wander free of technological intrusions, of beeps and buttons and emails and tweets, but “Hamlet’s BlackBerry” makes the case that we can—or should—find it. Recently, while watching some hypnotically dreadful movie, I instinctively reached for my BlackBerry to fetch some worthless biographical information about a third-rate actress that would do no more than clog my brain still further.

Then I remembered something in Mr. Powers’s book—which takes its title from a scene in “Hamlet” when the prince refers to an Elizabethan technical advance: specially coated paper or parchment that could be wiped clean. A book that included heavy, blank, erasable pages made from such paper—an almanac, for example—was called a table. “Yea, from the table of my memory / I’ll wipe away all trivial fond records,” Hamlet says. Or, as Mr. Powers paraphrases: “‘Don’t worry,’ Hamlet’s nifty device whispered, ‘you don’t have to know everything. Just the few things that matter.’”


Live Forever

Oct. 2014, National Review

I want to live forever. Or, if that’s impractical, as long as science can keep me operational. Now, obviously, this means elevating my game—more salubrious foods, calisthenics, steering clear of second-hand smoke and what-have-you. But if my efforts fall short—and I’m inclined to believe that at some point they might—I expect technology to pick up the slack. If this entails replacing my limbs with bionic parts, so be it. If it necessitates pumping me full of experimental pharmaceuticals or plugging me into contraptions that keep vital organs functioning properly, go for it. Nanotechnology? Whatever that is, I’m all in. And, if all else fails, please upload my consciousness into a freshly grown clone—though, if it’s not too much trouble, let’s make this one more athletic.

In his now-infamous Atlantic essay “Why I Hope to Die at 75,” Ezekiel Emanuel, 57, subtly disparages people like me as “American immortals.” I take no offense. Emanuel, after all, is the director of something called the Clinical Bioethics Department at the U.S. National Institutes of Health. He also finds time to run the Department of Medical Ethics and Health Policy at the University of Pennsylvania.

Or, in other words, there are people blessed with dazzling intellects who strive to unlock the secrets of the universe or devote their careers to making life more tolerable for the weak, sick, and elderly. And then there are people who crunch numbers to concoct arbitrary human expiration dates.

Old age, says Emanuel, leaves us faltering, declining, feeble, ineffectual, pathetic, and uncreative. Without even a single Ph.D. to my name, I’ve arrived at a similar conclusion. Growing old sucks. It can be depressing for the individual. A heart-wrenching burden for many families. And, also, better than most alternatives. This is why we humans have embarked on a sweeping and successful project to lengthen the Third Act—one of our most meaningful and moral undertakings, actually. This disturbs Emanuel, who claims that though proles live longer these days, they do not live more fulfilling lives. And while this might be true (though I doubt it), the most problematic part of Emanuel’s contention is his failure to answer the most vital question raised by his proposition: What kind of life is worth living?

Why am I alive? Maybe it’s an evolutionary need to be a father or maybe it’s an intellectual need to mock people who are by every calculable metric a lot smarter than I am. I don’t pretend to have the answer—probably because everyone’s answer is unique. What I think I do know, however, is how not to quantify life.

Life, for example, is not about being a cog in the collective. This is the basic rationalization Emanuel offers for his deadline—complete with a chart that plots the purpose of human existence. If you’re a productive person with high creative potential, your “first contribution” (interning at a nonprofit, perhaps) will be made in your mid 20s. Your “best” contribution (running for office or working for the Department of Zzzzzz) will be made in your late 30s. And your “last” contribution (authoring a memoir celebrating a life in public service) will be made in your early 60s. After that, well, what’s the point, right?

There are outliers, of course—Abraham didn’t father Isaac until he was 100, and Ronald Reagan wasn’t elected president until he was nearly 70—but we should concede that research proves the older you are the more likely it is that you’re engaged in piddling digressions such as visiting your grandchildren or binge-watching Murder, She Wrote. The chances of your authoring a white paper on a carbon tax or engaging in undertakings deemed beneficial by technocrats are rather low. Thank God.

Emanuel also advances the ugly idea that an uncomfortable life is not a life worth living. Half of Americans over 80 will be saddled with some functional limitations, he points out. A third of Americans over 85 will suffer from Alzheimer’s. Hips will hurt. Memories will fade. This is often tragic. But don’t millions of Americans live their lives with physical and mental limitations? Is their earthly existence worth the same as that of a 76-year-old—nothing? Emanuel says his proposition is a personal one, but if he believes his life—one we imagine he values more than most—isn’t worth extending past 75, what about others who fail to meet his criteria? This question goes unanswered.

Emanuel denies his piece is a stealth proposal to “save resources, ration health care, or address public-policy issues arising from the increases in life expectancy.”

The stench is there, though. For decades an ugly Malthusian compulsion has infected the Left, leading it to think we should measure the value of life by its impact on the environment or its productivity. The implication is stupefying, anti-humanist, and immoral.

Emanuel preemptively claims that there will be spiritual reasons for people to reject his pseudoscientific trolling. Well, even skeptics who believe that existence is happenstance, that life serves no grand purpose, and that there is no afterlife to look forward to should be insulted. I’m reminded of an exchange in one of the most underrated Woody Allen films, Love and Death, in which the character Sonya asks: “But, if there is no God, then life has no meaning. Why go on living? Why not just commit suicide?” Woody Allen’s doppelgänger, Boris, retorts, “Well, let’s not get hysterical. I could be wrong. I’d hate to blow my brains out and then read in the paper that they found something.”

There’s no need to cash out on Pascal’s wager too early, especially when we don’t know what sort of technological developments are on the horizon. My selfish hope is that we make tremendous strides in this department in, say, the next 30 years. If I don’t become a supercentenarian, it’ll be the fault of society. Mostly of people like Ezekiel Emanuel.
