The greatest failure in America is the failure to stay young. It is a failure of imagination, the inability to grasp the alternatives offered by surgery, cosmetology, and pharmacology. It is a failure of will, the indiscipline that results in flagging energies, flabby bodies, and clogged arteries.
It is a failure of financial planning, the incapacity to amass the resources needed to deploy the full panoply of anti-aging techniques and technologies. Most basic of all, it is a failure of genetic foresight, the prenatal indolence that passively accepts a poisoned lineage of physical and mental infirmities, moral laxness, and hereditary balding.
The only room for old men and women in this country is in separate bathtubs atop a cliff, holding hands as they watch the sunset and wait for the antidote to flaccidity to kick in so they can frolic in the same tub; or, in the case of a failure to achieve liftoff, throwing themselves in the sea, in a watery version of suttee.
For me, the nightfall of old age is particularly upsetting. I tried hard to seize and hold the day. I was born to healthy, middle-class parents in a good neighborhood. Except for college, the office Christmas party, and that weekend in Las Vegas, I drank moderately. I exercised regularly and completed several marathons. I had regular checkups and took care of my teeth. I’ve enjoyed a reasonably successful career, a happy marriage and a retirement undimmed by fear of living in a cardboard box and subsisting on the kindness of strangers.
Some changes were only to be expected. At thirty, I faced up to male pattern baldness. At forty, I purchased my first pair of reading glasses. At fifty, I added Metamucil to my orange juice. At sixty, I started blood-pressure medication and did my best to eschew meat and order whatever fish was on the menu.
Despite hard work, sound planning, lifestyle adjustments, and unusually well-behaved Irish genes, I find myself, to paraphrase the poet Yeats, “where all the ladders” end, “in the foul rag-and-bone shop” of encroaching decrepitude.
One day I had hearing as good as a rabbit’s. The next I suffered “sudden onset hearing loss.” In a flash, I went deaf in my left ear. At cocktail parties, I can no longer distinguish conversation from background noise (not that it matters much). Going out to dinner requires several minutes of configuring the seating to compensate for my auditory deficiency.
I developed epilepsy. I was sitting down for a television interview when, bang, I suffered a grand mal seizure that left me unconscious (so I’m told), writhing on the floor. I’ve had several since, the last of which resulted in a head injury that required several stitches. As a result, I can no longer drive, ride a bike, swim alone or–not that I had ever had the desire–swing on a trapeze.
After 50 years of running, my knees resemble the coil springs on a rusted ’56 Chevy. Two weeks ago, something snapped in my upper arm while doing my morning pushups. I can’t lift my right arm above my shoulder. Last week, while jogging, I wrenched my back so badly I can’t walk right. I recently had surgery for thyroid cancer. My medicine cabinet resembles the pickup window at the local pharmacy.
My powers of recall are showing signs of wear and tear. I open cabinets and drawers and instantly forget what I’m looking for. The ability to attach names to the faces of friends and acquaintances is becoming one of life’s small triumphs.
“The wages of sin,” wrote St. Paul, “is death.” Either he forgot to mention or deliberately left out that so are the wages of virtue. We’re all inching or hurtling toward the egress, and we Baby Boomers are elbowing our way to the head of the line. For us, keeping the Grim Reaper at bay looms as an increasingly expensive proposition.
According to the Medicare Newsgroup, an independent source for coverage of Medicare-related issues, end-of-life care continues to be characterized by highly aggressive medical intervention and runaway costs. Medicare spending in 2011 totaled more than $550 billion. Of that, $178 billion, or nearly a third, was spent on patients’ last six months of life.
It’s true you can’t take it with you. It’s also true that we members of the over-sixty-five set will suck up a disproportionate share of the country’s medical resources in order to make incremental additions to life spans already longer than those enjoyed by 99% of our ancestors.
There are plenty of seniors with the energy and strength to lead productive lives for years, maybe decades to come. But, in the words of Daniel Callahan, president emeritus of The Hastings Center, “no matter how (many) medical treatments we get, it’s never good enough because people eventually die … We’re not in a winnable war against death.”
The inevitability of the final curtain doesn’t make it easier to accept. I’m as reluctant and fearful as anyone else to face the end. But, sooner or later, it’s all right to think about making room instead of taking it up. A degree of resignation and acceptance isn’t a bad thing.
We can claw and cry for a day or two more, and spend whatever it takes. We can rage against the dying of the light and resent it as a violation of an imagined right to live forever. Or we can enjoy what we’re still capable of enjoying and exit, if not laughing, then with a smile of gratitude for the miracle of existence we’ve been privileged to share.
Happy Bastille Day! Vive la France! Vive la révolution!
Vive la République!
Vive Brigitte Bardot!
Most of all Brigitte!
Our first encounter was outside the Circle movie theater in the Bronx. Teetering on the brink of puberty, I glanced innocently at the poster advertising the feature that was playing: And God Created Woman.
There, lying on her stomach, was this mind-bending, impossibly beautiful girl, the naked arch of her back exposed, tousled blond hair framing seductively pouty lips. And those eyes! Those come-hither eyes so blue, so bold, so shamelessly inviting!
Above her picture was her name: Brigitte Bardot! I’d never heard of her before. We stared at each other. I stood for what seemed an hour (but was probably only a minute or two). “Brigitte,” I whispered, “I love you.” Her eyes spoke back to me: “I love you too, mon Pierre.”
I stepped up to the box office in expectation of buying a ticket only to find that the gate to paradise was barred. The sign in the window was printed in big black letters: POSITIVELY NO ONE UNDER 18 ADMITTED.
Right there, I decided that since I couldn’t see the movie, I’d save up for a trip to Paris. I wrote Brigitte care of her studio. I told her I was coming. If she was already living with a lover–which seemed likely since she was French–and didn’t want to break his heart (not yet), she should write back and tell me the street corner where she’d be waiting. We’d find a cozy, inexpensive hotel nearby.
In my recurring dream, she loitered there expectantly, the lamplight encircling her, her trench coat pulled tight, a small bag with her night things (a very, very small bag) slung over her shoulder, a Gauloise dangling from those lips, sigh, those lips.
I never made the trip. She never wrote back. Not that I know of. But sometimes, I think that she did get my letter, that she did write back, and that her letter–so brief, so sweet, so Brigitte–was lost. Only a few lines, it went like this:
“Quand je pense à toi, mon âme s’envole. Mon cœur ne peut pas contenir sa joie! Au coin de la Rue des Dames, sous le lampadaire, mon seul refuge contre la nuit sans étoiles, je t’attends. Et je t’attendrai. Jusqu’à ce que tu viennes. Viens vite, mon amour.”
(When I think of you, my soul soars. My heart cannot contain its delight. On the corner of the Rue des Dames, beneath the streetlight, my only refuge from the lonely, starless night, I wait. And will wait. Until you come. Come quickly, my love!)
O Brigitte! My Brigitte, I’m on my way!
Quote of the Day from Mike Tuberty’s blog, Boat Against the Current …
Elizabeth Hardwick, on Making a Living as a Writer:
“Making a living is nothing; the great difficulty is making a point, making a difference—with words.” —Elizabeth Hardwick, “Grub Street: New York,” The New York Review of Books, February 1, 1963
I greatly admire Elizabeth Hardwick. (And thanks to Mike Tuberty for providing the quote above.) But as someone who’s made a living as a writer for the past 30 years, I strongly dissent from the assertion that making a living as a writer (or a teacher, or a sanitation worker, or a nurse, or a salesperson—you name it) is “nothing.” From my experience, the only people for whom making a living is nothing are trust-fund babies (“trustafarians”).
Writing is work. Like any other job, it’s a matter of showing up, of living with highs and lows—of successes punctuated by terrible disappointments—of insecurity and doubts and fears, and above all else of persisting. I’ve always wondered where the romantic notion about writing comes from—the notion that it’s some sort of idyll of the imagination that’s easier than other forms of work.
Nobody has ever put it better than W.B. Yeats in his poem “Adam’s Curse”:
We sat together at one summer’s end,
That beautiful mild woman, your close friend,
And you and I, and talked of poetry.
I said, ‘A line will take us hours maybe;
Yet if it does not seem a moment’s thought,
Our stitching and unstitching has been naught.
Better go down upon your marrow-bones
And scrub a kitchen pavement, or break stones
Like an old pauper, in all kinds of weather;
For to articulate sweet sounds together
Is to work harder than all these, and yet
Be thought an idler by the noisy set
Of bankers, schoolmasters, and clergymen
The martyrs call the world.’
Along with air and water, food is the common denominator of human survival. Throughout history, the quest for daily sustenance has often been precarious. Food shortages caused by crop failures or extreme weather were (and are) common enough. But beginning with the Industrial Revolution in the mid-eighteenth century, as millions left the land for the cities and populations exploded, thinkers disagreed about how best to feed people in an economy based on manufacturing rather than agriculture. Should it be left to the free market? Or should governments take control? What criteria should be used to decide who gets fed—and in what amount—and who doesn’t? Are some more deserving of being fed than others?
Weighing in on these questions, Adam Smith was sanguine. In 1776 he published “The Wealth of Nations,” in which he lauded the free market and the profit motive as drivers of economic progress. “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner,” he wrote, “but from their regard to their own interest.” The “invisible hand” of competition would harness private ambition to the public good. Smith’s near-contemporary Thomas Malthus was far more pessimistic. Malthus’s influential tract “An Essay on the Principle of Population,” first published in 1798 and revised several times afterward, turned the invisible hand into an iron fist: Unless kept in check, he maintained, human reproduction would outrun the best efforts to increase the food supply and would lead inevitably to famine and mass death.
Charles Darwin and Karl Marx sided more with Smith than with Malthus. Although Malthus’s assertion of the indifference and profligacy with which nature spawned and destroyed life helped Darwin formulate his theory of “natural selection,” in which only the fittest survived, Darwin believed that famines no longer played a critical part in human evolution. In his seminal book “On the Origin of Species” (1859), he described “famines and other such accidents” as occurrences “to which savages are so liable.” In “The Descent of Man” (1871), he expanded on that liability:
“With savages the difficulty of obtaining subsistence occasionally limits their number in a much more direct manner than with civilized people, for all tribes periodically suffer from severe famines. At such times savages are forced to devour much bad food, and their health can hardly fail to be injured.”
Darwin’s comment offers no hint that even as he worked in his home in Kent writing “On the Origin of Species,” the greatest civilian catastrophe in nineteenth-century Europe was unfolding just a day’s journey away, in Ireland—the Irish famine, which triggered waves of mass death and emigration. The failure on Darwin’s part to mention the Irish famine might have reflected his belief that it was a historical aberration, or perhaps he wished to steer clear of the political and nationalistic passions it stirred.
Karl Marx, on the other hand, was too busy hailing the coming triumph of the urban proletariat to pay attention to the collapse of the antiquated and doomed social structure of rural Ireland. Marx didn’t touch on events in Ireland in “The Communist Manifesto,” published in 1848 at the height of the famine. Instead he celebrated the bourgeoisie as the gravedigger of the old order: “It has created enormous cities, has greatly increased the urban population as compared with the rural, and has thus rescued a considerable part of the population from the idiocy of rural life.” In his masterwork, “Das Kapital” (1867)—he sent an inscribed copy to Darwin—he wrote dismissively that “the Irish famine of 1846 killed more than 1,000,000 people, but it killed poor devils only.”
In “Famine: A Short History” (2009), Irish economist Cormac Ó Gráda questions whether famines ever served as a Malthusian check on population. “In the past,” Ó Gráda contends, “the demographic impact of famines tended to be relatively short-lived.” Disease provided the Grim Reaper a more reliable scythe, especially among infants and the aged. It’s undeniable, however, that famines played a critical role in the struggle for global supremacy that unfolded from the middle of the nineteenth century into the second part of the twentieth. Along with John Kelly’s eminently readable history of the Irish famine, “The Graves Are Walking: The Great Famine and the Saga of the Irish People,” three recent books—Timothy Snyder’s “Bloodlands: Europe Between Hitler and Stalin,” Lizzie Collingham’s “The Taste of War: World War II and the Battle for Food,” and “Tombstone: The Great Chinese Famine 1958–1962” by Yang Jisheng—provide instructive reminders of the degree to which food supply has been used as a tool of social engineering and a weapon of war.
As these books make clear, between 1845 and 1961—a span of little more than a century—the number of deaths from hunger and its effects exceeded the total in all of preceding human history. The ratios of deaths to population in the Irish famine (1845–51) and the Chinese famine (1958–61) represent record rates of mortality. The central problem in most modern famines was never an absolute lack of food. At issue was distribution. Contra Malthus, the volume of mortality wasn’t simply a case of too many mouths to feed; rather, to one degree or another, economic theories and government bureaucracies were the culprits. This was no mere innocent bureaucratic bungling, as John Kelly’s book makes clear. On the contrary, these catastrophes were either used or conceived to bring about the modernization of underlying socioeconomic structures.
How did this happen? The short answer is that hunger shook hands with administrative bureaucracy, economic theory, and political ideology. As Cormac Ó Gráda reminds us, the United Kingdom of the 1840s possessed “the wealthiest economy in the world.” Its navy and merchant marine ruled and regulated world trade. It dominated markets across the globe, sometimes—as with the Opium Wars with China—prying them open at gunpoint. Its manufacturing prowess was unchallenged. For all these reasons, Britain in the mid-nineteenth century could accurately be described as the first nation to make the full transition into modernity. The Irish famine was part of this transition. With the arrival of a devastating potato blight in the autumn of 1845, Sir Robert Peel, the Tory prime minister, took steps to prevent a catastrophic increase in mortality. But Peel also intended to use the crisis to break the tenacious grip of Ireland’s small farmers and laborers on their paltry holdings and turn them into wage earners employed on large, efficient farms—or factory workers in the industrial centers of the British Isles.
Peel may have hoped this could be done with a minimum of distress to the several million people at the bottom of the Irish economic pyramid, but by the summer of 1846 he was out of office—and Lord John Russell, his Whig successor, was a disciple of the Manchester School, which held that government should abstain from interference with the laws of supply and demand. This faith was reinforced by the reigning orthodoxies of Protestant Evangelicalism and Providentialism, which rested on the confidence that God sent disasters like the potato blight as punishment for human transgressions and as an opportunity for imposing the kind of moral reform that would bring Ireland into conformity with the superior values of Anglo-Saxon society. As the London Times editorialized in the autumn of 1846, “An island, a social state, a race is to be changed. The surface of the land, its divisions, its culture…its law, its language, and the heart of a people who for two thousand years have remained unalterable within the compass of those mighty changes which have given us European civilization, are all to be created anew.”
Before the work of re-creation came the job of razing what was in place. Sir Charles Trevelyan, an eminent Victorian who served as assistant secretary of the Treasury, welcomed the blight as a heaven-sent chance to “cure” the Irish of chronic dependency. In 1847, Parliament abandoned any pretense of assisting the Irish—shutting down soup kitchens and other relief efforts—and acted to facilitate clearing the land of as many tenants as possible. For anyone remotely acquainted with the situation in Ireland, the consequences were obvious. “But unlike the morally blinkered, who saw only hunger, misery, and death in the ruined potato fields,” writes John Kelly sardonically, “Mr. Trevelyan saw the restless hand of God at work.”
In his 1860 jeremiad “The Last Conquest of Ireland (Perhaps),” Irish nationalist John Mitchel charged that “the Almighty, indeed, sent the potato blight, but the English created the Famine.” In indicting the British government for “deliberate murder” —the word “genocide” wouldn’t be coined for another eight decades—he articulated a sentiment shared by many Irish and Irish-Americans, both then and today. Kelly’s judgment on this question—that while “the intent” of British relief policy “may not have been genocidal…the effects were”—is equivocal. But perhaps it’s as close to the truth as we can get. Though contempt for the Irish colored everything they did, Trevelyan and company neither caused the blight nor set out to send a million people to their deaths. Yet they concocted a policy of malign neglect and active interference designed to use a food shortage to reshape Irish society. Whether the Irish wasted away from hunger and disease or fled abroad—in excess of 2 million emigrated in a single decade—didn’t matter. The ideological end of modernizing “an island, a social state, a race” justified the means.
More than half a century later, in the wake of World War I, hunger once again became a tool of peacetime social engineering on a massive scale. In sync with the “scientific certainties” of Marxism, the Bolshevik faction under Lenin that took control of Russia in 1917 believed in iron laws of economics as devoutly as did the acolytes of the Manchester School. Yet the fruit that had fallen into their lap wasn’t, as Marx had predicted, an industrialized society planted and ripened by the bourgeoisie, but rather the ramshackle, backward, heavily agricultural Czarist Empire, where socialism would have to be sown and grown, not reaped through revolution.
Timothy Snyder portrays Lenin as shrewdly tempering ruthlessness with realism, conducting a “political holding action” that gave a degree of autonomy to the various republics and allowed private ownership. But after Lenin’s death in 1924, his choice as general secretary of the Communist Party, the crafty and conscienceless Joseph Stalin, put aside his predecessor’s caution and pursued an overnight transformation of the new Soviet state. Begun in 1928, Stalin’s first Five Year Plan was a breakneck push into urban-industrial modernity premised on returning peasants to serfdom. Their crops would feed the cities and provide exports to generate the hard currency to buy foreign machinery. The wealthier peasants were tagged kulaks (“tight-fisted”), an elastic label stretched to include anyone who resisted surrendering his holdings, however meager, to the state. Tens of thousands were shot; 1.7 million were deported to the Gulag.
The epicenter of this action was Ukraine, which, in Snyder’s description, became “a giant starvation camp, with watchtowers, sealed borders, pointless and painful labor, and endless and predictable death” (see “Europe’s Darkest Hour,” Commonweal, February 22, 2011). The result was the famine of 1930–33—“the greatest artificial famine in the history of the world”: more than 5 million died in what Snyder deems an act of deliberate genocide directed against the Ukrainian people. Writing in the London Review of Books (November 4, 2010), historian Richard J. Evans argued that Stalin’s starvation policy didn’t actually single out Ukrainians but was directed against kulaks—many of them Russian. Yet Snyder is indisputably correct when he emphasizes the non-Malthusian essence of Stalin’s famine, which “took place in times of peace, and was related more or less distantly to an ideologically informed vision of modernization.”
During World War II, Adolf Hitler pursued a policy—justified as a requirement of German national survival and as a right conferred by racial superiority—of making war in order to seize new national living space (lebensraum). The key lay in the East, from which, as Lizzie Collingham puts it, Hitler imagined Germany could carve out “its own version of the American west.” Collingham’s enlightening book reveals the extent to which food production, distribution, and consumption were critical to the conduct and outcome of the war. Often ignored or relegated to the war’s backstory, food, as Collingham tells it, was a prime motive in the ambitions of the aggressors and a strategic priority among the major combatants.
In 1941, a critical year in the war, Hitler launched Operation Barbarossa, in the hope of scoring a lightning victory over the U.S.S.R. Like Stalin, Hitler focused on a swift and radical transformation of the Soviet countryside, particularly Ukraine. Herbert Backe, head of the innocuous-sounding Reich Ministry for Food and Agriculture, drafted the Nazi blueprint. A classic “desk criminal” (Schreibtischtäter) who never served on the front or set foot in a death camp, Backe laid out a grand scheme for a postwar resettlement—“The Hunger Plan”—that envisioned “a European California,” its “idyllic new towns and ideal agricultural communities” built on the graves of 30 million Slavs methodically starved to death, with another 70 million shipped off to the Soviet Arctic zone to labor and die in a gulag now under German management.
In the event—and at the price of horrendous losses—the Red Army stopped the Nazi onslaught, and the Hunger Plan was never put into full operation. Nevertheless, Hitler used hunger against his opponents wherever he could. Patients in the Reich’s mental hospitals were put on a diet designed to kill in three months. Of the 3 million Soviet POWs who died in captivity, Timothy Snyder estimates that 2.6 million perished from hunger. Several million Soviet civilians starved, 1 million in the siege of Leningrad alone. And many of the 6 million Jews who perished, both in and out of the death camps, died from hunger.
On the Western front of the war, Hitler had hoped that a relentless campaign of U-Boat attacks could damage Britain’s supply lines so badly it would be forced to make peace. And indeed, at their peak, U-Boats sank about 10 percent of food shipments to Britain. Thanks to the astounding prodigality of U.S. supplies and shipping, however, the U-Boat attacks never came close to starving Britain into submission. “Throughout the worst months of the Battle of the Atlantic,” Collingham concludes, “British civilians were never confronted with the problem of hunger, let alone the specter of starvation.”
For its part, Britain ran its wartime food policies according to what Collingham describes as “an unspoken food hierarchy” that relegated the needs of its colonial subjects to the bottom. As she reports, Prime Minister Winston Churchill and his War Cabinet decided “that India would be the part of the empire where the greatest civilian sacrifices would have to be made.” When told that the food situation in India had become critical, the War Cabinet’s reaction, in Collingham’s judgment, was “irresponsible and brutal.” Echoing Trevelyan’s verdict on the Irish a century before, Churchill “claimed that Indians had brought these problems on themselves by breeding like rabbits and must pay the price of their own improvidence.” The Bengal famine that raged between 1943 and ’44 killed approximately 3 million people. Confronted with the facts of what was happening, Churchill asked “if food was so scarce in India, why had Gandhi not yet died?” That famine, ironically, was carved forever into the childhood consciousness of Amartya Sen, then a nine-year-old boy in West Bengal. Sen grew up to become a Nobel Prize–winning economist whose seminal work on famine has revealed how far the phenomenon rests not on actual shortages of food, but on social inequalities and on politicizations of the food-provision mechanisms that invariably work against the poor.
As for the other major combatant nations in World War II, imperial Japan didn’t plan for systematically starving those under its sway. In Collingham’s view, however, the so-called Greater East Asia Co-Prosperity Sphere and the planned settlement of a million Japanese farmers in Manchuria shared the same rationale as the German drive for lebensraum; expansionism and the exploitation of conquered peoples and territories were seen as the sine qua non of being a player on the world stage. Collingham estimates the toll inflicted by the Japanese invasion of China to be “at least 15 million civilians, 85 percent of them peasants, and virtually all of them the victims of deprivation and starvation.” The suffering in China was paralleled by that in Indo-China, where Japan’s “ruthless requisitioning of rice” led to the Tonkin famine, in which 1 to 2 million Vietnamese died of hunger, with new research suggesting “that the scale of the horror was far greater.”
Despite early success at plundering the empire they’d conquered, the Japanese themselves soon felt the effects of the counterattack mounted by the far more powerful United States. Where German U-Boats failed to sever Britain’s supply lines, the U.S. submarine campaign shredded Japan’s maritime supply lanes. By 1944, Japan’s shipping capacity had been reduced by 60 percent, and the situation quickly worsened, threatening ultimately to become militarily decisive. As Collingham makes clear, citing Napoleon’s famous adage that “an army travels on its stomach,” Japan’s military crawled to defeat on a nearly empty belly, with “60 percent, or more than 1 million, of the total 1.74 million Japanese military deaths between 1941 and 1945…caused by starvation and diseases associated with malnutrition.”
These food-related deaths were the result of deliberate American military policies. Beginning in March 1945, the United States undertook Operation Starvation, dedicating a special force of B-29s under General Curtis LeMay to seed the waters around the home islands with mines. Japanese shipping was paralyzed. Hunger was rampant, famine inevitable. Only the dropping of the atom bombs spared the Japanese from having to choose, in the end, between starvation and submission.
Two great powers emerged out of World War II: the Soviet Union and the United States. Historians continue to debate the origins of the Cold War that followed. How much was due to Stalin’s intransigence and belligerency? How much to blind anti-Communism on the part of American leaders? What’s clear is that among a significant portion of the anticolonial leadership in the less-developed world, choosing the Marxist-Leninist model of imposing industrialization and modernization through central planning and one-party control seemed more viable than following the capitalist road. The victory of the Chinese Communists in 1949 put the world’s most populous nation under the rule of Mao Zedong, a doctrinaire Marxist-Leninist who set out in as short a time as possible to make the People’s Republic the equal of the two superpowers—an ambition embodied in the cruelly named “Great Leap Forward.”
As chronicled by Yang Jisheng, a long-time reporter for China’s official news agency, the Great Leap Forward pulled the country into an abyss of mass starvation and death. Yang recognizes his own complicity in the cover-up that followed. He didn’t question the Communist Party’s version of events—in which his own father perished—until, disillusioned by the 1989 Tiananmen Square massacre, he set out to unearth the truth behind the famine of 1958–61. The resulting book, “Tombstone,” is a highly detailed, two-volume account (the English version has been edited into a single volume) intended by Yang as a memorial to his father—a “tombstone in my heart,” he writes—and to the millions of other victims.
Yang demolishes the notion that bad weather, tight global grain markets, or the withdrawal of Soviet advisers contributed to the deaths of 30 million people, and lays the blame squarely on Mao. A megalomaniacal tyrant who envisioned his rule as a marriage, in his own words, of “Marx with [the ancient emperor] Qin Shi-huang,” Mao used the Great Leap Forward to gather the peasantry into military-style communes, turning the Chinese countryside into a gigantic barracks. Civil society was abolished. The family was done away with. Every aspect of life and work was regimented by the state. The people were to be created anew.
Ideological rigidity and economic fantasy produced collective insanity. When local officials met and exceeded Beijing’s quotas by requisitioning every ounce of grain, Beijing simply set new and higher quotas. Communal kitchens, inefficient to begin with, became hopelessly undersupplied. An utterly unrealistic plan for spurring local steel production led to communes melting down whatever was at hand—cooking implements, ploughs, temple bells, etc. When the true effects of the catastrophe grew evident, Mao denounced “right-deviationist thinking” among naysayers and subversives, and unleashed a wave of violent repression.
Yang’s chronicle of the suffering that flowed from Mao’s orders insistently recounts the mind-numbing particulars of how many died, where, and how. “The labor reform team of the Zhongba administrative district,” “Tombstone” tells us, “included an eleven-year-old girl named Chen Yuxiu, who was forced to work for five straight days and nights. She collapsed, bleeding from the nose and mouth, and ultimately died.” In the details of suffering, all famines are, finally, alike. Mao’s Chinese victims underwent the same gruesome physical ravages John Kelly describes among the Irish: “the eyelids inflame; the angular lines around the mouth deepen into cavities; the swollen thyroid gland becomes tumor-sized; fields of white fungus cover the tongue, blistering mouth sores develop, the skin acquires the texture of parchment; teeth decay and fall out, gums ooze pus, and a long silky growth of hair covers the face.”
The suffering continues among the survivors in weakened bones, damaged hearts, haunted memories, and multi-generational psychological effects. Studies done after the Second World War indicate that children subjected to malnutrition and starvation in the womb were born with a predisposition to schizophrenia and psychotic depression. The repercussions, reports Lizzie Collingham, “are still echoing down through the generations, into the present day.”
Whether adherents of Marxism, the Manchester School, or National Socialism, in both war and peace those in charge of modern famines agreed that it was the victims who were at fault. Irish peasants were lazy and superstitious; Ukrainian kulaks, greedy and reactionary; Slavs and Jews, filthy Untermenschen; Bengalese, chronic overbreeders. In the eyes of the Japanese, Chinese peasants were incorrigible and primitive; in Mao’s view, they were “regressionists” who lacked “adequate psychological preparation for socialist revolution.” Progress, however defined, depended on removing the human impediments that stood in its way.
This article was published as the cover story in Commonweal (May 16th, 2014)
Twenty years ago this month, I published my first novel, “Banished Children of Eve.” When the idea for the book first came to me, I conceived of it as a work of nonfiction, not a novel. I had put on temporary hold (alas, it turned out to be permanent) my pursuit of a Ph.D. in history and was working in Albany as a gubernatorial speechwriter. Lapsed historian though I was, I hadn’t lost my interest in the past.
In the course of researching a speech on housing policy, I stumbled across the first report made by the state legislature on conditions in New York City. Dated 1855, it was a Dickensian catalogue of poverty, disease and appalling overcrowding in the immigrant wards on or near the city’s waterfront. As I dug deeper, it became obvious that the vast majority of those living amid these wretched and unsanitary conditions were Irish immigrants and their children who’d fled the Great Famine and its aftermath.
Though I was soon finished with the speech, I had just begun the exhaustive process of research into the epic effects that the Famine immigration had on American urban life in general and on the shaping of New York in particular. Overnight, New York went from being an important Atlantic entrepôt to what it remains to this day: an immensely energetic, sometimes conflicted, always dynamic immigrant city of global proportions.
The deeper I dug, the more I was struck by how every aspect of the city was changed by the sudden arrival of a tsunami of traumatized peasants fleeing the worst civilian catastrophe in Western Europe between the Thirty Years’ War and World War One. I was equally impressed by the amnesia that seemed to erase the scope and sweep of these changes not just from the minds of most New Yorkers but from the very consciousness of these immigrants’ descendants, myself included.
As I devoured newspaper accounts and historical records, I gave real thought for the first time to the fact that my own great-grandparents, Michael and Margaret Manning, were buried among these words and statistics. Beyond the fact that they arrived in or around 1847, I knew only that they probably came from Kilkenny. It seems they might have been illiterate, and it’s even possible that the name Manning had been changed from Mangin due to an error in transcription.
I started out wanting to write a social history that described in exhaustive detail the flight of the Famine Irish to New York–a million of them entered the port between 1845 and ’55–and what awaited them once they arrived and struggled to start new lives. The year of research that I allotted myself stretched into three and then four. The more I learned, the more I felt there was more to know.
The historical details were endlessly fascinating. And yet, I grew increasingly frustrated by what was beyond my learning and what I could never know. The unrecorded everyday experiences of these immigrants, their quotidian fears and expectations, their fondest memories and deepest hopes were lost. They were faceless and voiceless. The density and complexity of their passions and pain were reduced to a single line in a census or death certificate.
Eventually, I gave up on history. If I was going to reach these people in their individuality and particularity, if I was going to enter their vanished world, I could only do it through an act of the imagination. I decided to attempt a novel.
I started by imagining a story built around the catastrophic Draft Riots of 1863, the worst urban disturbance in American history. It took three years of writing before I finally got to the riots. The characters–African Americans as well as Irish and native Yankees–took control of the plot. They led me down the labyrinthine ways of their individual existences, each in his or her own way a banished child of Eve, all of them moving through this vale of tears to the music of Stephen Foster, whose life and songs are the book’s leitmotiv.
In the twenty years that “Banished Children” has remained in print, it has opened more doors, taken me more places and introduced me to more people than I could have ever possibly imagined. I rapidly discovered that the great silence that followed the Great Famine wasn’t a unique part of my family’s legacy but woven into the fabric of the Irish-American experience.
As I traveled with the book, I met an amazing array of artists and writers–Irish and otherwise. They are involved in unearthing, exploring and celebrating the rich and hidden histories of immigrants, slaves and working people whose labor, sacrifices, songs, stories and aspirations, though often given scant attention in official accounts, have enriched our country beyond all measure.
The night before “Banished Children” came out, I met Tom Flanagan at the Madison Avenue Pub for a celebratory drink. As well as a master novelist–his “Year of the French” is, in my opinion, among the greatest historical novels ever written–Tom was a friend and mentor. Tom toasted the future. “Don’t be surprised,” he said, “at how far your banished children will travel and, if you’re lucky, at all the friends they’ll bring home.”
Tom was a prophet as well as a teacher.
(This essay was published in the 3/14/14 edition of the Irish Voice)
In my experience, most novelists have tried and failed at one profession or another before they turned to fiction writing. I failed at several. High school teacher. Court officer. Wall Street messenger. Historian. Alas, the list is long and sorrowful.
When I first took up writing, I aspired to be a poet, not a novelist, but I failed at that too. Maybe that’s why I have such admiration for poets. I know how hard it is to succeed at producing a single worthwhile poem, never mind to do it year after year.
Except for an occasional foray undertaken as a private exercise and not an attempt to redeem my former failure, I no longer write poetry. But I continue to read the work of poets I admire, the famous (Yeats, Auden, Heaney, et al.) and the not so famous (Angela Alaimo O’Donnell is a favorite).
Recently, I’ve found myself making repeat visits to Daniel Thomas Moran’s most recent book of poems, A Shed for Wood (Salmon Poetry, 2013). Moran has made his living as a dentist, a trade marked by ruthless practicality and a prosaic focus on the material and mechanical–drill bits, needles, pliers, braces, bridges and the growing armory of hi-tech devices to prevent, remove and replace the ravages of routine and inevitable decay.
In essence, dentistry has always seemed to be the polar opposite of poetry. Certainly, there have been medical doctors who’ve excelled at poetry. The American poet William Carlos Williams comes immediately to mind. But dentists? In my prejudiced view, dentists have always been to doctors what plumbers are to architects, mechanics rather than artists, their expertise necessary and useful but lacking the holistic vision and wider understanding that we expect (if rarely encounter) among physicians.
Moran has forced me to re-examine that prejudice. His poetry is grounded in everyday realities as common and unromantic as canines and molars. But like the master dentist he is (Moran has been a private practitioner as well as a professor of dentistry at Boston University), he constantly probes, exposes, drills deep, undeterred by surfaces.
For me, Moran’s verse combines elements of my favorite triumvirate of American poets–Emily Dickinson, Walt Whitman and Robert Frost. It is earthy, unpretentious, accessible, agnostic, sometimes comic, often serious, frequently both, rooted in the ordinary–mayflies, horseshoe crabs, sparrows, tumbled stones and treetops–yet capable of delivering a jolt of understanding as sharp and sudden as when a dental drill strikes an unanesthetized nerve.
I’ve been keeping A Shed for Wood beside my bed. I read a few poems each night. I mull their insights and their meanings. Moran and I differ in our worldviews: he, a stalwart unbeliever; I, an incurable adherent of the creed. But the wisdom in his poems transcends such boundaries. On my way to sleep, I embrace the poet’s invitation to go “Where we can be with our aloneness / at rest with its bottomless still / and inhale the life which inhabits us.”
Moran is a favorite of several prominent writers, including the late Samuel Menashe, a poet of the first rank and the first to be honored with the Poetry Foundation’s “Neglected Masters Award.” Yet despite this, and despite the fact that he’s been accorded a number of honors–including a stint as the poet laureate of New York’s Suffolk County–Moran’s work, in my view, has never come close to receiving the attention it deserves.
Moran now lives with his wife Karen in the New Hampshire woods. I’m not sure if he still practices dentistry, but as A Shed for Wood makes clear, he continues to practice poetry at the highest level, turning out poems that serve as a source of wonder, enjoyment, enlightenment, and laughter.
You lovers of words, do yourself a favor: Neglect him no longer.
A Shed for Wood is available on Amazon.
My introduction to the triune came early. Each morning as my classmates and I made the sign of the cross, my first-grade nun stressed that the Trinity–one God in three separate and distinct persons, Father, Son and Holy Ghost–was essential to our faith and, ergo, to our salvation. Since my six-year-old brain couldn’t make much sense of it, I was happy to be told the three-person God was a mystery beyond human understanding and had almost driven mad the theologians who’d tried to solve it.
Still, it stuck. Three in one, one in three. The holy trifecta. In the large stained glass window on the south wall of our Bronx parish church, St. Patrick held up a shamrock. One stem, three petals: They glowed a single emerald green as the sun lofted behind them. For that moment at least, the riddle of the Trinity ceased to bewilder.
Over the years, as I wandered amid the thickets of secularity, I learned that, as well as a marker of religious dogma, three brought to whatever it was associated with a special aura, whether exciting (Triple Crown), silly (Three Stooges), erotic (ménage à trois), scary (Third Reich), exceptional (triple play), or sad (strike three). Just by being three, ordinary things gained a special cachet.
When I set out to become a writer of books, I imagined one would suffice. A historian manqué, just shy of a Ph.D., I first stumbled into speech writing. I decided to try it for a year, save enough to go back to school, finish the dissertation, and turn it into a book. “The best-laid schemes o’ mice an’ men,” as Scottish poet Robert Burns put it, “gang aft agley.” I ended up scribbling for two New York governors and five chairmen of Time Inc./Time Warner across a span of three decades.
On the plus side, my job involved indoor work and required no manual labor. It paid the mortgage and tuitions, and included a defined benefit plan; on the minus, it was frequently stressful, sometimes grinding and always anonymous. Occasionally a speechwriter or two has slipped from behind the curtain and gained fame crafting words for mouths other than his/her own. But as I saw it, once you take the king’s shilling, you do the king’s bidding, and whatever praise or blame ensues is the sovereign’s alone.
As time went on, I felt a growing need to put my name on words I could publicly claim as mine. I got to my office two hours early in order to attempt a novel. Having grown used to churning out large chunks of copy in short amounts of time, I calculated I’d have a finished manuscript in a year or two. Robert Burns proved right again. Ten years later, I left the delivery room cradling my long-gestating mind child, Banished Children of Eve, a six-hundred-page saga of Civil War New York.
The first agent I submitted it to was dismissive. I hadn’t written one novel, she wrote, but “sausaged three in one.” I was stung. Yet the more I thought about it, the more I realized its truth. My novel was the story of Irish famine immigrants, the frightening, fecund mongrel world of mid-19th-century New York, and the impact of the Civil War. These were the three petals. Minstrel-songster Stephen Foster was stem and sausage skin. His music is the book’s leitmotiv. There are worse things to be accused of, I decided, than being a Trinitarian. I stuck with three in one, and that’s how it was published.
I drew a great deal of satisfaction from at last having my name on writing all my own, so much so that I decided one wasn’t enough. I had other stories I wanted to write. Faced by commercial constraints as well as those of my own mortality, I knew the next had to be shorter. Unfortunately, hard as I tried, I couldn’t get the hang of the short form, which required the precision of the pointillist. I preferred the Jackson Pollock school, buckets of paint splashed across expansive canvases.
With the second novel, I decided to reverse the first: In place of three packed in one, one would be divided in three. The stem I started with was Fintan Dunne, Irish-American ex-cop and private eye, a veteran of World Wars I and II, whose formal education ended in the Catholic Protectory, an orphanage cum reformatory in the Bronx. In hardboiled style, Fin is a man who, if he ever had any illusions about human nature, had them kicked out of him so long ago he can’t remember what they were.
Fin is what the writer William Kennedy calls a “cynical humanist.” Distrustful of all authority, skeptical of most causes, uninterested in heroics, he is reluctant to get involved. Whatever the case, he knows from the outset that there are no perfect endings, no spotless souls, and that some mysteries are better left unsolved. Still, despite his understanding of the futility of good intentions and the hopeless fallibility of everyone–including himself–Fin can’t help but try to see that some modicum of justice is done.
I followed Fin as he fought with eugenicists and fifth columnists (Hour of the Cat), wrestled with the still-unsolved case of New York’s most famous missing jurist (The Man Who Never Returned), and burrowed into the Cold War’s intricate machinations and betrayals (Dry Bones). I’ve seen the city and the world through his eyes as he experienced two world wars, the Great Depression and the gloom-and-boom of the Eisenhower era, the rollercoaster years W.H. Auden accurately labeled “The Age of Anxiety.”
I’m grateful for our three-legged journey. Fintan has been great company every step of the way. Now that we’ve finished our last caper and said our goodbyes, I’m hopeful that I’ve told his story the way he wanted it told, and that the three tales together–separate and distinct yet parts of the same whole–capture him in a jaded emerald glow.