Sometimes a reader’s response spurs you to say more clearly and completely what you intended. My last column took up the issue of picky eaters, and quoted from a Times article, “When the Picky Eater is a Grownup.” I applied some skeptical irony to the therapeutic terms brought to bear on the topic by the article and the experts it cited. 

My colleague Rita Ferrone took me to task. “Food disorders do really exist,” she wrote in response. “Talk to anyone who has suffered from bulimia or anorexia. This isn't another form of ‘the worried well.’ These are harmful conditions, and the person who is anorexic or bulimic can't just snap out of it without help.” Rita judged it “unreasonable” of me “to pooh-pooh those adults who just can't cope with a normal diet, and seek out help.”

Fair enough, though I did expressly point out that I wasn’t minimizing the awfulness of serious eating disorders and the heartbreak they bring to those who suffer them. My pooh-poohing was not aimed at these persons -- or at any person, traumatized by adverse experience, whose anxiety about food (or anything else) truly overwhelms him or her. Anyone lacking sympathy for those in deep distress is stonehearted.

But that’s not what I meant by “picky eater,” and I don’t think it’s what the Times piece primarily meant either. The very phrase, “picky eater,” suggests choice – a fastidiousness, an excess of selectivity of the kind conveyed in the article via such designations as “smooth eater.” By focusing on these designations, and on the support groups and therapeutic interventions available to the picky eater, I was attempting to point up the trend in our culture toward adopting diagnostic and therapeutic understandings of a whole range of phenomena. Have we gone too far in this direction? Some may say we haven’t gone far enough.

A confession. I see a lot of movies, and over the years I have discovered in myself an aversion to hearing people eating while the film is going on. The sound of the loud popcorn muncher in the row behind me, well, it drives me a little bit around the bend. It provokes mild anxiety, annoyance, and the burning desire to turn around and ask, “Do you intend to stay buried in that FEED BAG for the next two hours?”

I don’t do that, of course. I restrain myself. Sometimes I move to a different seat. I cope, in other words. Humor is one means – telling stories to friends, for instance, that highlight the rueful situations my cinema food phobia lands me in.

Some months back I chanced across an article about precisely this phobia of mine. It seems that it’s – well, not exactly common, but it’s out there. The article reported that “sufferers” were banding together to share their experiences, pool their consumer power, and demand that theater owners provide a special section where they could sit together and no eating would be allowed.

I wouldn’t want to join such a group. First, I’d have no particular expectation of commonality with the people in the food-free section beyond this particular quirk. Nor would I want to be ghettoized this way vis-à-vis the larger theater population. Nor, finally, would I want this predilection of mine reified, as we used to say in college, into a significant defining category of my self, one requiring therapeutic intervention.  I would rather understand it as a personal peccadillo, best handled by custom... and manners. I would rather manage it than structure it into my life – and other people’s lives – that way.

So where to draw the line between what we view as a facet of simple human variety and what we diagnose as a disorder? Rita notes that the Times article “doesn't claim that every type of pickiness is a severe problem,” but that only some are. “This doesn't mean we are pathologizing everything,” she observes. “It's like saying that just because some people are treated for clinical depression, one is saying that everyone who feels sad ought to see a psychiatrist.”

But is nobody saying that? Just a few years ago, the DSM-5, the recent update of the manual governing psychiatric diagnosis, notoriously eliminated the so-called bereavement exclusion in the diagnosis of Major Depressive Disorder. An article in the New England Journal of Medicine warned that the change would spur diagnoses of major depression in persons with normal bereavement – following, say, the death of a spouse – after only two weeks of mild depressive symptoms.

Unfortunately, the effect of the change would be to medicalize normal grief and erroneously label healthy people with a psychiatric diagnosis. It would also be a boon to the pharmaceutical industry, because it would encourage unnecessary treatment with antidepressants and antipsychotics, both of which are increasingly used to treat depression and anxiety.

What I was trying to get at in my picky-eater post was precisely this trend toward medicalizing normal life problems. Even life phases. Like all those ads aimed at men of my age cohort, asking us whether we have experienced a drop-off in our sex lives, reduced levels of energy and vitality, increased aches and pains and fatigue. “If so,” the ad informs us, “you may be suffering from a condition known as low T.” Really? Remember when we used to call that condition “middle age”?

It’s worth recalling that the concept of medicalization, and the critique it leveled, were popularized by progressive sociologists and psychiatrists in the ’60s and ’70s, renegades like Thomas Szasz, who saw it as an aggressive expansion of institutional medical authority into personal and private life. Szasz wrote that “the therapeutic state swallows up everything human,” noting how life phenomena as diverse as draft malingering, bachelorhood, divorce, unwanted pregnancy, insurance fraud, kleptomania, and grief were declared diseases at one time or another. All this sucks up resources. Brandeis sociologist Peter Conrad studied a dozen conditions considered “medicalized” by physician organizations, including anxiety disorders, body image, male pattern baldness, normal sadness, obesity, sleep disorders, and substance-related disorders. In 2005, medical spending on these conditions accounted for nearly $80 billion. “We spend more on these medicalized conditions than on cancer, heart disease, or public health,” Conrad noted.

The ever-broader application of therapeutic approaches to life problems has shaped the ruckus over “trigger warnings” and “safe spaces” in campus speech debates. Again, I have zero problem (who could?) with accommodations made for traumatized individuals who experience, or fear experiencing, a recrudescence of anxiety provoked by a reminder of that trauma. How might a Holocaust survivor feel, say, about a screening of Triumph of the Will in a history class? That is a very real “triggering.” But identity-politics practitioners have picked up the term and imported it into cultural and political discourse. University culture traditionally has modeled, taught, and inculcated powerful ways of engaging speech and ideas you oppose. Condemn them. Protest them. Argue back against them. But do we really want to exclude them, and do so by wielding metaphors of therapy?

The complexities of this issue are brought to the fore in a fascinating article from the progressive website Truthout.com on “The Dangers of Medicalizing Racism,” describing a North Carolina church program called Racists Anonymous. The program’s website says that meetings are organized around AA’s twelve-step format, and that the first step for participants is to “come to admit that I am powerless over my addiction to racism in ways I am unable to recognize fully, let alone manage.” Or, as the Truthout article glosses it, “My name is Trevor, and I’m a racist.”

Does this kind of approach work? Is it a good thing for us to view a hate-filled racist as ill – in the grip of a disorder, a disease -- and thus as someone needing, and worthy of, our sympathy? Is it a good thing for him to understand himself that way?

These are not rhetorical questions. The case for answering “yes” to them rests in large part on our evolving understanding of trauma, broadly construed, and the notion that such attitudes as racism constitute reactive behavior encoded, as it were, as a response to childhood trauma. Neurons that fire together, wire together, is the mantra; and the therapeutic view holds that you have to be retrained, cognitively, in order to change. AA, in this view, amounts to a kind of lay cognitive therapy; and the reeducation of a racist is, in fact, an act of healing.

Persuaded? Darryl Walker, the author of the Truthout article, is not. “While this approach may appear innovative,” he writes,

it is simply the latest attempt to medicalize racism. Advocates for the medicalization of racism have suggested drug therapy and behavioral modification. Although medicalization is tempting, it is important to highlight the false presuppositions and disastrous consequences of such a theoretical shift. This approach erases the structural underpinnings of racism and silences larger calls for racial justice.

All this is a long way, to be sure, from the problem of the picky eater, but I think it’s still recognizably the same terrain. My own instinct is that therapeutic understandings bring substantial benefits, but going too far has its price. It reduces agency. It farms out our acts of sympathy and correction to paid professionals. And it slaps a label on an individual that in some regard, big or small, subsumes his or her humanity under a medical category. Oh, and sometimes it’s just plain silly. 

As the French say, Tout comprendre, c’est tout pardonner – to understand everything is to forgive everything. Whether it’s the picky eater’s rejection of vegetables, or betrayals in friendship and marriage, or the invidious belligerence of a racist, we all have to decide how far we should go by way of ascribing to a kind of sociobiological determinism the multifarious ways people are who they are, and the stubborn/baffling/destructive things they sometimes do.

Who should control the turf of our understandings of self? The priest? The novelist? The judge? Or the doctor?


Rand Richards Cooper is a contributing editor to Commonweal. His fiction has appeared in Harper’s, GQ, Esquire, the Atlantic, and many other magazines, as well as in Best American Short Stories. His novel, The Last to Go, was produced for television by ABC, and he has been a writer-in-residence at Amherst and Emerson colleges. 
