Eat, Pray, Kill: The Basic Brutality of Eating

By Beatrice Marovich

“Bloody beetroots”: image courtesy flickr user kudla, via a Creative Commons license


[Beatrice is a PhD candidate in the Graduate Division of Religion at Drew University in Madison, NJ. She also works as a writer, editor, and communications consultant, specializing in ideas at the crowded intersection of theology, philosophy, faith and public/political life in North America.]

Some humans are deeply passionate about their meat. They love it; they gnash their teeth for it. In her 2006 spiritual travelogue Eat, Pray, Love, Elizabeth Gilbert confessed a kind of affinity with the sensual Tuscan culture of meat. Shop windows there, she writes, are loaded with sausages “stuffed like ladies’ legs into provocative stockings” or the “lusty buttocks” of ham. The net effect, she suggests, is the emanation of a “you know you want it” kind of sensuality. Make no mistake. Meat—the flesh of non-human animals—is a force of desire in human life.

But is there an ethical argument in favor of flesh consumption? That is, can a meat-eating human find solid moral ground for her more carnivorous appetites? Is there a soul-cure for the stomachache that comes from eating the body of another sentient creature? Are these questions that the vast storehouses of religious traditions can help us navigate?

In a culture where plates are piled high with the spoils of profligate factory farming, it would seem that the growing surge of vegans and vegetarians has a claim to the moral high ground. One might even make the argument that religious vegetarianism is one of the few things that makes modern religions look good. But not everyone is satisfied with this solution. “Ethically speaking, vegetables get all the glory,” Ariel Kaminer lamented in the New York Times, playing the role of the paper’s esteemed Ethicist. And so, in an attempt to buck this trend, the paper launched an essay contest in March of this year: in search of the ethical argument for meat.

Essays were judged by a star-studded panel that included vocal vegetarians like Peter Singer and Jonathan Safran Foer as well as more cautiously omnivorous foodies such as Michael Pollan and Mark Bittman. Controversy ensued over the fact that the panel was composed entirely of men. But, gender issues be damned, results were published in late April. Six essays made the cut. The final stage was to give Times readers four days to vote on their personal favorite. Almost 40 percent of voters favored the argument for in vitro meat. “Aside from accidental roadkill or the fish washed up dead on the shore, it is perhaps the only ethical meat,” essayist Ingrid Newkirk baldly proffered.

If Peas Can Talk…

The argument that stirred me most, however, was one of the lower-scoring essays—earning a mere 10 percent of voter approval. Interestingly, it wasn’t really an argument in favor of meat at all, so much as it was an attempt to dramatize the moral stakes of the practice.

“We would be foolish to deny that there are strong moral considerations against eating meat,” philosophy professor Bob Fischer begins. Eating meat is clearly, from an ethical perspective, “wrong” on several counts. But morality is an ideal, he notes, something we aim for, and fall short of. This makes the moral world “tragic,” as he puts it. Moral work is a tragedy, played out on a cosmic stage. Rather than wallow in remorse, he sees this as reason to be suspicious of “any proposal that would steer us through these complexities too quickly.”

When it comes to the consumption of meat, in other words, our human hands have long been dirty. This isn’t a reason to stop striving for the good. But a moral proposal that promises to wash our filthy fingers spotlessly clean—in seconds flat—is suspect. Because they will still be dirty. The pressing moral question of meat becomes: given that human hands are obviously soiled, what can be done with these polluted tools?

The easy answer, most often, is: go vegetarian. If it feels wrong to eat meat, then stop eating it. Why waste time, really? Just go vegan. Start cleaning your hands by refusing to eat your fellow creatures. The ethical argument for meat, in other words, is an impossibility. Ending flesh consumption is one step in the right direction, toward a kinder future. Some might argue, however, that this argument from empathy rests on a slippery slope. Vegetarians will surely protest. But philosopher Michael Marder, writing recently for the Times, points to research on pea plant communication as evidence of a kind of plant subjectivity. The title of his column poses the incendiary question: “If peas can talk, should we eat them?”

There are, perhaps, some practitioners of the Jain tradition who would give a resounding “no.” Strict ascetic practices in Jainism disavow not only the consumption of meat, but the practice of farming—because of the damage that agricultural tools do to the earth. The consumption of root vegetables may be prohibited (as you would be yanking the vegetable to its death), as well as the consumption of a living pea shoot, which can (as Marder suggests) talk.

These practices find their basis in ahimsa—the Sanskrit term that describes a posture of nonviolence toward all living creatures. The power of ahimsa can be genealogically traced into the vegetarian strains and variants of Hinduism, Jainism, and Buddhism. Is it when we turn to the wisdom of religious traditions that we finally find the spiritual purity we’re looking for? The sort that can clean our dirty hands from the inside out, starting with our nasty and brutish souls?

A Screaming Silence

My own thinking around religion and animals, particularly around the conundrum of eating them, was complexified at a recent conference put on by the Graduate Student’s Association in Columbia’s Religion Department. The consumption of animal flesh was not the primary subject matter of “Pray, Eat, Kill: Relating to Animals Across Religious Traditions,” but it was perhaps the most absorbing. It was also the subject of Wendy Doniger’s keynote address. The legendary scholar of myth and religion dipped back into ancient texts, citing myriad strange injunctions regarding the consumption of food in The Laws of Manu. What she finds in these codes is not only an attempt to deal with the old, and apparently always agonizing, moral pain of eating animal flesh. She also spoke of references, in these ancient texts, to the “screaming silence” of vegetables.

Doniger finds, in other words, a long history of reflection on the basic brutality of eating, rooted in a reflection on this concept of ahimsa. But, interestingly, what she finds is that this sympathy and compassion for animals did not always lead to the condemnation of eating animal flesh.

The fact is, religious ethics are practices that are crafted in conversation with culture and geography. There have been times when the meaning of ahimsa, or practices of animal compassion, has taken a backseat to necessity. Geoff Barstow, for example, spoke of the 18th-century Tibetan Buddhist Jigme Lingpa, who displayed an extreme form of compassion for animals (addressing them as his mother). He believed that meat was a poison that bore a heavy karmic burden. But he stopped just short of commending vegetarianism. Meat was, as Barstow put it, a kind of “necessary evil.” Was this in recognition of the fact that there simply aren’t a lot of vegetables to be had in the mountainous regions of Tibet?

Is the purity (or the arid ethical high ground) we might be aiming for a myth itself? Is it possible to both consume and remain morally chaste? Doniger suggests that, perhaps, the most common and lasting effect we can see—as reverent humans attempt to deal with the moral ambivalence of eating meat—is that they make lists. They attempt to rationalize this ambivalence, to find a way of controlling its power. The Laws of Manu are filled with long lists of things you can and cannot eat (mushrooms, solitary animals), things you can and cannot do with animals (sacrifice is good, unlawful slaughter is bad).

Such lists are not unique to the Hindu tradition. Indeed, we see both simple and complex dietary regulations in a host of traditions and cultures. Even here in the U.S., we have “secular” regulations that prevent us from eating dogs. Many of us follow our own little personal hodgepodge of injunctions that (we believe) contributes to a more sustainable form of life, or a healthy planet.

In a larger sense, the thicket of little rules and regulations seems absurd. The “real” question, it seems, is whether or not to eat animals at all—whether to have all or nothing, flesh or no flesh. But such universal injunctions seem problematic to me. Human history is littered with smaller lists, smaller injunctions, created in ethical conversation with a particular context.

When we look back at the stages set by history, via religion, I think we will see this moral drama—the encounter of human and non-human animal—played out in many different ways. In the messy, violent ambivalence this encounter generates, and the stopgap measures put in place to resolve it, we see thousands of small (often contradictory, often bizarre) solutions. We might read thousands of lists! But this is not a sign of our human failure. Rather, I think we can see it as an encouragement to keep making those small lists.

Morality is a messy business—why should we expect its rules to be singular, or simple?

There Is No Biological Reason to Eat Three Meals a Day — So Why Do We Do it?


For most of history, meals were very variable.
September 23, 2011

We grew up believing in three meals a day.

When we skip meals, eat extra meals or subvert paradigms — spaghetti breakfasts, pancake suppers — we feel naughty, edgy and criminal. “Three meals a day” resonates like a Bible phrase.

But it’s a cultural construct.

People around the world, even in the West, have not always eaten three squares. The three-meals model is a fairly recent convention, which is now being eclipsed as, like everything else, eating becomes a highly personalized matter of choice. What and when and how frequently we eat is driven less and less by the choices of our families, coworkers and others, and more and more by impulse, personal taste and favorite nutrition memes, and marketing schemes such as Taco Bell’s promotion of late-night eating known as “Fourthmeal: the Meal Between Dinner & Breakfast.” Selecting how and when we eat is like loading our iPods.

A torrent of new studies explores the health effects of eating three squares. Their findings are far from conclusive. A US Department of Agriculture study found that eating just one large meal a day versus three normal-sized meals lowers weight and body fat but raises blood pressure; three meals per day lowers blood pressure. A National Institute on Aging study found that eating one large meal a day rather than three raises insulin resistance and glucose intolerance: two key features of type-2 diabetes.

A University of Maastricht study found that eating at least four small meals daily reduces obesity risk by 45 percent. This Dutch study also found that people who skip breakfast are five times as likely to become obese as regular breakfasters. Yet a University of Ottawa study found that eating many small meals doesn’t promote weight loss. So did a French National Center for Scientific Research study, which trashed grazing: “Epidemiological studies which have suggested that nibbling is associated with leanness are extremely vulnerable to methodological errors,” its authors warn.

A UC Berkeley study found that “alternate-day fasting” — feasting one day, fasting the next, ad infinitum — might decrease the risk of heart disease and cancer.

Researching the effects of meal frequency is notoriously tricky, because it involves so many variables: nutritional content, time of day, exercise, genetics. So the scientific jury is still out.

“There is no biological reason for eating three meals a day,” says Yale University history professor Paul Freedman, editor of Food: The History of Taste (University of California Press, 2007).

The number of meals eaten per day, along with the standard hour and fare for each, “are cultural patterns no different from how close you stand when talking to people or what you do with your body as you speak. Human beings are comfortable with patterns because they’re predictable. We’ve become comfortable with the idea of three meals. On the other hand, our schedules and our desires are subverting that idea more and more every day,” Freedman says.

For most of history, meals were very variable. A medieval northern European peasant “would start his morning with ale or bread or both, then bring some sort of food out into the fields and have a large meal sometime in the afternoon,” Freedman says. “He might have what he called ‘dinner’ at 2 in the afternoon or 6 in the evening, or later” — depending on his work, the season and other factors.

“He wouldn’t have a large evening meal. He would just grab something small and quick. Dinner back then tended not to be as distinct as it has become in the last two centuries.”

And it tended to be eaten in daylight — not because eating earlier was considered healthier, but because cooking, consuming and cleaning up are difficult in the dark or by firelight.

“People who were not rich tried to get all their meals eaten before dark. After electricity was discovered, initially only the rich could afford it,” Freedman says. “From that point onward, one mark of being rich became how late you ate. Eating way after dark because you could afford electric lights was a mark of high status, urbanity and class.”

Eating late — or at random times, or more or less than thrice daily — also reflects one’s distance from the two main forces that standardized three squares in America: conventional work schedules and traditional family life.

Throughout most of the 20th century, most workers could eat only at specific times.

“When that factory whistle blew at five o’clock, it was time to go home and be fed. But now all kinds of Americans are eating later and later because they work longer hours than they used to, or because their hours are now more flexible. We are very much losing the three-meals-a-day model, thanks to grazing and thanks to different members of a household having different schedules, and to the fact that the kids might not want to eat what their parents are eating.”

The idea of children being allowed to choose their own meals and mealtimes would have been shocking a few decades ago, when “Eat what’s on your plate” and “Eat your peas or no dessert” were family dinner-table mantras. But the family dinner table is verging on obsolescence. Which came first: the dissolution of the standard nuclear family or the dissolution of three meals a day?

“American parents have a particular kind of guilt about the disappearance of family meals,” Freedman says. Perhaps for good reason: A recent University of Minnesota study found that habitual shared family meals improve nutrition, academic performance and interpersonal skills and reduce the risk of eating disorders.

Electronic devices are also undermining the three-meals model. They’re at once entertainment centers, workspaces and almost-human companions. Their portability and nonstop availability let us eat whenever we like without having to stop working, without having to be bored, and without having to feel that we are eating alone.

“The disappearance of family meals antedates the invention of hand-held electronic devices,” Freedman says. “It was not initiated by them, but it is exacerbated by them. These days, even if everyone’s sitting around a table together, it’s not clear that they’re all paying attention.”

The three-meals model is also being fought by the food industry.

“The food industry wants you to buy more food,” and so it urges us to eat as much and as often as possible. It’s an easy sell, “because Americans have always liked snacks.”

A snack boom began in the mid-20th century and hasn’t stopped. Thriving through a wrecked economy, the global snack industry is predicted to be worth $330 billion by 2015. In the US alone, retail sales of packaged snacks increased from $56 billion to $64 billion between 2006 and 2010, and are expected to reach $77 billion by 2015.

The blurred borderline between snacks and meals has changed everything.

“The long-term effect is that any time of day has become a time to eat. The decline of three meals a day and the rise of snacks are related, although I wouldn’t say there’s a direct causal relationship,” Freedman says.

Another food-industry strategy is the creation of food niches, based on age, ethnicity, gender, lifestyle and locale. A few decades ago, everyone ate the same foods.

“But now there’s kid food, there’s teenager food and there’s grownup food, so some parents end up buying three times as much food” as their own parents did.

“They’re being manipulated into it, guilted into thinking: I’m so busy all week and I have so little quality time with my kids that the least I can do for them is let them eat as they like rather than making a stand and insisting that we all eat the same thing together.”