The Hunger Games: Who Gets to Eat & Who Decides

May 8, 2014

Along with air and water, food is the common denominator of human survival. Throughout history, the quest for daily sustenance has often been precarious. Food shortages caused by crop failures or extreme weather were (and are) common enough. But beginning with the Industrial Revolution in the mid-eighteenth century, as millions left the land for the cities and populations exploded, thinkers disagreed about how best to feed people in an economy based on manufacturing rather than agriculture. Should it be left to the free market? Or should governments take control? What criteria should be used to decide who gets fed—and in what amount—and who doesn’t? Are some more deserving of being fed than others?

Weighing in on these questions, Adam Smith was sanguine. In 1776 he published “The Wealth of Nations,” in which he lauded the free market and the profit motive as drivers of economic progress. “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner,” he wrote, “but from their regard to their own interest.” The “invisible hand” of competition would harness private ambition to the public good. Smith’s near-contemporary Thomas Malthus was far more pessimistic. Malthus’s influential tract “An Essay on the Principle of Population,” first published in 1798 and revised several times afterward, turned the invisible hand into an iron fist: Unless kept in check, he maintained, human reproduction would outrun the best efforts to increase the food supply and would lead inevitably to famine and mass death.

Charles Darwin and Karl Marx sided more with Smith than with Malthus. Although Malthus’s assertion of the indifference and profligacy with which nature spawned and destroyed life helped Darwin formulate his theory of “natural selection,” in which only the fittest survived, Darwin believed that famines no longer played a critical part in human evolution. In his seminal book “On the Origin of Species” (1859), he described “famines and other such accidents” as occurrences “to which savages are so liable.” In “The Descent of Man” (1871), he expanded on that liability:

“With savages the difficulty of obtaining subsistence occasionally limits their number in a much more direct manner than with civilized people, for all tribes periodically suffer from severe famines. At such times savages are forced to devour much bad food, and their health can hardly fail to be injured.”

Darwin’s comment offers no hint that even as he worked in his home in Kent writing “On the Origin of Species,” the greatest civilian catastrophe in nineteenth-century Europe was unfolding just a day’s journey away, in Ireland—the Irish famine, which triggered waves of mass death and emigration. The failure on Darwin’s part to mention the Irish famine might have reflected his belief that it was a historical aberration, or perhaps he wished to steer clear of the political and nationalistic passions it stirred.

Karl Marx, on the other hand, was too busy hailing the coming triumph of the urban proletariat to pay attention to the collapse of the antiquated and doomed social structure of rural Ireland. Marx didn’t touch on events in Ireland in “The Communist Manifesto,” published in 1848 at the height of the famine. Instead he celebrated the bourgeoisie as the gravedigger of the old order: “It has created enormous cities, has greatly increased the urban population as compared with the rural, and has thus rescued a considerable part of the population from the idiocy of rural life.” In his masterwork, “Das Kapital” (1867)—he sent an inscribed copy to Darwin—he wrote dismissively that “the Irish famine of 1846 killed more than 1,000,000 people, but it killed poor devils only.”

In “Famine: A Short History” (2009), Irish economist Cormac Ó Gráda questions whether famines ever served as a Malthusian check on population. “In the past,” Ó Gráda contends, “the demographic impact of famines tended to be relatively short-lived.” Disease provided the Grim Reaper a more reliable scythe, especially among infants and the aged. It’s undeniable, however, that famines played a critical role in the struggle for global supremacy that unfolded from the middle of the nineteenth century into the second part of the twentieth. Along with John Kelly’s eminently readable history of the Irish famine, “The Graves Are Walking: The Great Famine and the Saga of the Irish People,” three recent books—Timothy Snyder’s “Bloodlands: Europe Between Hitler and Stalin,” Lizzie Collingham’s “The Taste of War: World War II and the Battle for Food,” and “Tombstone: The Great Chinese Famine 1958–1962” by Yang Jisheng—provide instructive reminders of the degree to which food supply has been used as a tool of social engineering and a weapon of war.

As these books make clear, between 1845 and 1961—a span of little more than a century—the number of deaths from hunger and its effects exceeded the total in all of preceding human history. The ratios of deaths to population in the Irish famine (1845–51) and the Chinese famine (1958–61) represent record rates of mortality. The central problem in most modern famines was never an absolute lack of food. At issue was distribution. Contra Malthus, the volume of mortality wasn’t simply a case of too many mouths to feed; rather, to one degree or another, economic theories and government bureaucracies were the culprits. This was no mere innocent bureaucratic bungling, as John Kelly’s book makes clear. On the contrary, these catastrophes were either used or conceived to bring about the modernization of underlying socioeconomic structures.

How did this happen? The short answer is that hunger shook hands with administrative bureaucracy, economic theory, and political ideology. As Cormac Ó Gráda reminds us, the United Kingdom of the 1840s possessed “the wealthiest economy in the world.” Its navy and merchant marine ruled and regulated world trade. It dominated markets across the globe, sometimes—as with the Opium Wars with China—prying them open at gunpoint. Its manufacturing prowess was unchallenged. For all these reasons, Britain in the mid-nineteenth century could accurately be described as the first nation to make the full transition into modernity. The Irish famine was part of this transition. With the arrival of a devastating potato blight in the autumn of 1845, Sir Robert Peel, the Tory prime minister, took steps to prevent a catastrophic increase in mortality. But Peel also intended to use the crisis to break the tenacious grip of Ireland’s small farmers and laborers on their paltry holdings and turn them into wage earners employed on large, efficient farms—or factory workers in the industrial centers of the British Isles.

Peel may have hoped this could be done with a minimum of distress to the several million people at the bottom of the Irish economic pyramid, but by the summer of 1846 he was out of office—and Lord John Russell, his Whig successor, was a disciple of the Manchester School, which held that government should abstain from interference with the laws of supply and demand. This faith was reinforced by the reigning orthodoxies of Protestant Evangelicalism and Providentialism, which rested on the conviction that God sent disasters like the potato blight as punishment for human transgressions and as an opportunity for imposing the kind of moral reform that would bring Ireland into conformity with the superior values of Anglo-Saxon society. As the London Times editorialized in the autumn of 1846, “An island, a social state, a race is to be changed. The surface of the land, its divisions, its culture…its law, its language, and the heart of a people who for two thousand years have remained unalterable within the compass of those mighty changes which have given us European civilization, are all to be created anew.”

Before the work of re-creation came the job of razing what was in place. Sir Charles Trevelyan, an eminent Victorian who served as assistant secretary of the Treasury, welcomed the blight as a heaven-sent chance to “cure” the Irish of chronic dependency. In 1847, Parliament abandoned any pretense of assisting the Irish—shutting down soup kitchens and other relief efforts—and acted to facilitate clearing the land of as many tenants as possible. For anyone remotely acquainted with the situation in Ireland, the consequences were obvious. “But unlike the morally blinkered, who saw only hunger, misery, and death in the ruined potato fields,” writes John Kelly sardonically, “Mr. Trevelyan saw the restless hand of God at work.”

In his 1860 jeremiad “The Last Conquest of Ireland (Perhaps),” Irish nationalist John Mitchel charged that “the Almighty, indeed, sent the potato blight, but the English created the Famine.” In indicting the British government for “deliberate murder” —the word “genocide” wouldn’t be coined for another eight decades—he articulated a sentiment shared by many Irish and Irish-Americans, both then and today. Kelly’s judgment on this question—that while “the intent” of British relief policy “may not have been genocidal…the effects were”—is equivocal. But perhaps it’s as close to the truth as we can get. Though contempt for the Irish colored everything they did, Trevelyan and company neither caused the blight nor set out to send a million people to their deaths. Yet they concocted a policy of malign neglect and active interference designed to use a food shortage to reshape Irish society. Whether the Irish wasted away from hunger and disease or fled abroad—in excess of 2 million emigrated in a single decade—didn’t matter. The ideological end of modernizing “an island, a social state, a race” justified the means.

More than half a century later, in the wake of World War I, hunger once again became a tool of peacetime social engineering on a massive scale. In sync with the “scientific certainties” of Marxism, the Bolshevik faction under Lenin that took control of Russia in 1917 believed in iron laws of economics as devoutly as did the acolytes of the Manchester School. Yet the fruit that had fallen into their lap wasn’t, as Marx had predicted, an industrialized society planted and ripened by the bourgeoisie, but rather the ramshackle, backward, heavily agricultural Czarist Empire, where socialism would have to be sown and grown, not reaped through revolution.

Timothy Snyder portrays Lenin as shrewdly tempering ruthlessness with realism, conducting a “political holding action” that gave a degree of autonomy to the various republics and allowed private ownership. But after Lenin’s death in 1924, his choice as general secretary of the Communist Party, the crafty and conscienceless Joseph Stalin, put aside his predecessor’s caution and pursued an overnight transformation of the new Soviet state. Begun in 1928, Stalin’s first Five Year Plan was a breakneck push into urban-industrial modernity premised on returning peasants to serfdom. Their crops would feed the cities and provide exports to generate the hard currency to buy foreign machinery. The wealthier peasants were tagged kulaks (“tight-fisted”), an elastic label stretched to include anyone who resisted surrendering his holdings, however meager, to the state. Tens of thousands were shot; 1.7 million were deported to the Gulag.

The epicenter of this action was Ukraine, which, in Snyder’s description, became “a giant starvation camp, with watchtowers, sealed borders, pointless and painful labor, and endless and predictable death” (see “Europe’s Darkest Hour,” Commonweal, February 22, 2011). The result was the famine of 1930–33—“the greatest artificial famine in the history of the world”: more than 5 million died in what Snyder deems an act of deliberate genocide directed against the Ukrainian people. Writing in the London Review of Books (November 4, 2010), historian Richard J. Evans argued that Stalin’s starvation policy didn’t actually single out Ukrainians but was directed against kulaks—many of them Russian. Yet Snyder is indisputably correct when he emphasizes the non-Malthusian essence of Stalin’s famine, which “took place in times of peace, and was related more or less distantly to an ideologically informed vision of modernization.”

During World War II, Adolf Hitler pursued a policy—justified as a requirement of German national survival and as a right conferred by racial superiority—of making war in order to seize new national living space (Lebensraum). The key lay in the East, from which, as Lizzie Collingham puts it, Hitler imagined Germany could carve out “its own version of the American west.” Collingham’s enlightening book reveals the extent to which food production, distribution, and consumption were critical to the conduct and outcome of the war. Often ignored or relegated to the war’s backstory, food, as Collingham tells it, was a prime motive in the ambitions of the aggressors and a strategic priority among the major combatants.

In 1941, a critical year in the war, Hitler launched Operation Barbarossa, in the hope of scoring a lightning victory over the U.S.S.R. Like Stalin, Hitler focused on a swift and radical transformation of the Soviet countryside, particularly Ukraine. Herbert Backe, head of the innocuous-sounding Reich Ministry for Food and Agriculture, drafted the Nazi blueprint. A classic “desk criminal” (Schreibtischtäter) who never served on the front or set foot in a death camp, Backe laid out a grand scheme for a postwar resettlement—“The Hunger Plan”—that envisioned “a European California,” its “idyllic new towns and ideal agricultural communities” built on the graves of 30 million Slavs methodically starved to death, with another 70 million shipped off to the Soviet Arctic zone to labor and die in a gulag now under German management.

In the event—and at the price of horrendous losses—the Red Army stopped the Nazi onslaught, and the Hunger Plan was never put into full operation. Nevertheless, Hitler used hunger against his opponents wherever he could. Patients in the Reich’s mental hospitals were put on a diet designed to kill in three months. Of the 3 million Soviet POWs who died in captivity, Timothy Snyder estimates that 2.6 million perished from hunger. Several million Soviet civilians starved, 1 million in the siege of Leningrad alone. And many of the 6 million Jews who perished, both in and out of the death camps, died from hunger.

On the Western front of the war, Hitler had hoped that a relentless campaign of U-Boat attacks could damage Britain’s supply lines so badly it would be forced to make peace. And indeed, at their peak, U-Boats sank about 10 percent of food shipments to Britain. Thanks to the astounding prodigality of U.S. supplies and shipping, however, the U-Boat attacks never came close to sinking Britain itself. “Throughout the worst months of the Battle of the Atlantic,” Collingham concludes, “British civilians were never confronted with the problem of hunger, let alone the specter of starvation.”

For its part, Britain ran its wartime food policies according to what Collingham describes as “an unspoken food hierarchy” that relegated the needs of its colonial subjects to the bottom. As she reports, Prime Minister Winston Churchill and his War Cabinet decided “that India would be the part of the empire where the greatest civilian sacrifices would have to be made.” When told that the food situation in India had become critical, the War Cabinet’s reaction, in Collingham’s judgment, was “irresponsible and brutal.” Echoing Trevelyan’s verdict on the Irish a century before, Churchill “claimed that Indians had brought these problems on themselves by breeding like rabbits and must pay the price of their own improvidence.” The Bengal famine that raged between 1943 and ’44 killed approximately 3 million people. Confronted with the facts of what was happening, Churchill asked “if food was so scarce in India, why had Gandhi not yet died?” That famine, ironically, was carved forever into the childhood consciousness of Amartya Sen, then a nine-year-old boy in West Bengal. Sen grew up to become a Nobel Prize–winning economist whose seminal work on famine has shown that the phenomenon rests not on actual shortages of food but on social inequalities and the politicization of food distribution, which invariably works against the poor.

As for the other major combatant nations in World War II, imperial Japan didn’t plan for systematically starving those under its sway. In Collingham’s view, however, the so-called Greater East Asia Co-Prosperity Sphere and the planned settlement of a million Japanese farmers in Manchuria shared the same rationale as the German drive for Lebensraum; expansionism and the exploitation of conquered peoples and territories were seen as the sine qua non of being a player on the world stage. Collingham estimates the toll inflicted by the Japanese invasion of China to be “at least 15 million civilians, 85 percent of them peasants, and virtually all of them the victims of deprivation and starvation.” The suffering in China was paralleled by that in Indo-China, where Japan’s “ruthless requisitioning of rice” led to the Tonkin famine, in which 1 to 2 million Vietnamese died of hunger, with new research suggesting “that the scale of the horror was far greater.”

Despite early success at plundering the empire they’d conquered, the Japanese themselves soon felt the effects of the counterattack mounted by the far more powerful United States. Where German U-Boats failed to sever Britain’s supply lines, the U.S. submarine campaign shredded Japan’s maritime supply lanes. By 1944, Japan’s shipping capacity had been reduced by 60 percent, and the situation quickly worsened, threatening ultimately to become militarily decisive. As Collingham makes clear, citing Napoleon’s famous adage that “an army travels on its stomach,” Japan’s military crawled to defeat on a nearly empty belly, with “60 percent, or more than 1 million, of the total 1.74 million Japanese military deaths between 1941 and 1945…caused by starvation and diseases associated with malnutrition.”

These food-related deaths were the result of deliberate American military policy. Beginning in March 1945, the United States undertook Operation Starvation, dedicating a special force of B-29s under General Curtis LeMay to seed the waters around the home islands with mines. Japanese shipping was paralyzed. Hunger was rampant, famine inevitable. Only the dropping of the atom bombs spared the Japanese from having to choose, in the end, between starvation and submission.

Two great powers emerged out of World War II: the Soviet Union and the United States. Historians continue to debate the origins of the Cold War that followed. How much was due to Stalin’s intransigence and belligerency? How much to blind anti-Communism on the part of American leaders? What’s clear is that among a significant portion of the anticolonial leadership in the less-developed world, choosing the Marxist-Leninist model of imposing industrialization and modernization through central planning and one-party control seemed more viable than following the capitalist road. The victory of the Chinese Communists in 1949 put the world’s most populous nation under the rule of Mao Zedong, a doctrinaire Marxist-Leninist who set out in as short a time as possible to make the People’s Republic the equal of the two superpowers—an ambition embodied in the cruelly named “Great Leap Forward.”

As chronicled by Yang Jisheng, a long-time reporter for China’s official news agency, the Great Leap Forward pulled the country into an abyss of mass starvation and death. Yang recognizes his own complicity in the cover-up that followed. He didn’t question the Communist Party’s version of events—in which his own father perished—until, disillusioned by the 1989 Tiananmen Square massacre, he set out to unearth the truth behind the famine of 1958–61. The resulting book, “Tombstone,” is a highly detailed, two-volume account (the English version has been edited into a single volume) intended by Yang as a memorial to his father—a “tombstone in my heart,” he writes—and to the millions of other victims.

Yang demolishes the notion that bad weather, tight global grain markets, or the withdrawal of Soviet advisers contributed to the deaths of 30 million people, and lays the blame squarely on Mao. A megalomaniacal tyrant who envisioned his rule as a marriage, in his own words, of “Marx with [the ancient emperor] Qin Shi-huang,” Mao used the Great Leap Forward to gather the peasantry into military-style communes, turning the Chinese countryside into a gigantic barracks. Civil society was abolished. The family was done away with. Every aspect of life and work was regimented by the state. The people were to be created anew.

Ideological rigidity and economic fantasy produced collective insanity. When Beijing issued quotas, local officials met and exceeded them by requisitioning every ounce of grain—prompting Beijing to set new and higher quotas. Communal kitchens, inefficient to begin with, became hopelessly undersupplied. An utterly unrealistic plan for spurring local steel production led to communes melting down whatever was at hand—cooking implements, ploughs, temple bells, etc. When the true effects of the catastrophe grew evident, Mao denounced “right-deviationist thinking” among naysayers and subversives, and unleashed a wave of violent repression.

Yang’s chronicle of the suffering that flowed from Mao’s orders insistently recounts the mind-numbing particulars of how many died, where, and how. “The labor reform team of the Zhongba administrative district,” “Tombstone” tells us, “included an eleven-year-old girl named Chen Yuxiu, who was forced to work for five straight days and nights. She collapsed, bleeding from the nose and mouth, and ultimately died.” In the details of suffering, all famines are, finally, alike. Mao’s Chinese victims underwent the same gruesome physical ravages John Kelly describes among the Irish: “the eyelids inflame; the angular lines around the mouth deepen into cavities; the swollen thyroid gland becomes tumor-sized; fields of white fungus cover the tongue, blistering mouth sores develop, the skin acquires the texture of parchment; teeth decay and fall out, gums ooze pus, and a long silky growth of hair covers the face.”

The suffering continues among the survivors in weakened bones, damaged hearts, haunted memories, and multi-generational psychological effects. Studies done after the Second World War indicate that children subjected to malnutrition and starvation in the womb were born with a predisposition to schizophrenia and psychotic depression. The repercussions, reports Lizzie Collingham, “are still echoing down through the generations, into the present day.”

Whether adherents of Marxism, the Manchester School, or National Socialism, in both war and peace those in charge of modern famines agreed that it was the victims who were at fault. Irish peasants were lazy and superstitious; Ukrainian kulaks, greedy and reactionary; Slavs and Jews, filthy untermenschen; Bengalis, chronic overbreeders. In the eyes of the Japanese, Chinese peasants were incorrigible and primitive; in Mao’s view, they were “regressionists” who lacked “adequate psychological preparation for socialist revolution.” Progress, however defined, depended on removing the human impediments that stood in its way.

This article was published as the cover story in Commonweal (May 16th, 2014)




The “Banished Children of Eve” at 20

March 20, 2014

Twenty years ago this month, I published my first novel, “Banished Children of Eve.” When the idea for the book first came to me, I conceived of it as a work of nonfiction, not a novel. I had put on temporary hold (alas, it turned out to be permanent) my pursuit of a Ph.D. in history and was working in Albany as a gubernatorial speechwriter. Lapsed historian though I was, I hadn’t lost my interest in the past.

In the course of researching a speech on housing policy, I stumbled across the first report made by the state legislature on conditions in New York City. Dated 1855, it was a Dickensian catalogue of poverty, disease and appalling overcrowding in the immigrant wards on or near the city’s waterfront. As I dug deeper, it became obvious that the vast majority of those living amid these wretched and unsanitary conditions were Irish immigrants and their children who’d fled the Great Famine and its aftermath.

Though I was soon finished with the speech, I had just begun the exhaustive process of research into the epic effects that the Famine immigration had on American urban life in general and on the shaping of New York in particular. Overnight, New York went from being an important Atlantic entrepôt to what it remains to this day: an immensely energetic, sometimes conflicted, always dynamic immigrant city of global proportions.

The deeper I dug, the more I was struck by how every aspect of the city was changed by the sudden arrival of a tsunami of traumatized peasants fleeing the worst civilian catastrophe in Western Europe between the Thirty Years’ War and World War One. I was equally impressed by the amnesia that seemed to erase the scope and sweep of these changes not just from the minds of most New Yorkers but from the very consciousness of these immigrants’ descendants, myself included.

As I devoured newspaper accounts and historical records, I gave real thought for the first time to the fact that my own great-grandparents, Michael and Margaret Manning, were buried among these words and statistics. Beyond the fact that they arrived in or around 1847, I knew only that they probably came from Kilkenny. It seems they might have been illiterate, and it’s even possible that the name Manning had been changed from Mangin due to an error in transcription.

I started out wanting to write a social history that described in exhaustive detail the flight of the Famine Irish to New York–a million of them entered the port between 1845 and ’55–and what awaited them once they arrived and struggled to start new lives. The year of research that I allotted myself stretched into three and then four. The more I learned, the more I felt there was more to know.

The historical details were endlessly fascinating. And yet, I grew increasingly frustrated by what was beyond my learning and what I could never know. The unrecorded everyday experiences of these immigrants, their quotidian fears and expectations, their fondest memories and deepest hopes were lost. They were faceless and voiceless. The density and complexity of their passions and pain were reduced to a single line in a census or death certificate.

Eventually, I gave up on history. If I was going to reach these people in their individuality and particularity, if I was going to enter their vanished world, I could only do it through an act of the imagination. I decided to attempt a novel.

I started by imagining a story built around the catastrophic Draft Riots of 1863, the worst urban disturbance in American history. It took three years of writing before I finally got to the riots. The characters–African Americans as well as Irish and native Yankees–took control of the plot. They led me down the labyrinthine ways of their individual existences, each in his or her own way a banished child of Eve, all of them moving through this vale of tears to the music of Stephen Foster, whose life and songs are the book’s leitmotiv.

In the twenty years that “Banished Children” has remained in print, it has opened more doors, taken me more places and introduced me to more people than I could have ever possibly imagined. I rapidly discovered that the great silence that followed the Great Famine wasn’t a unique part of my family’s legacy but woven into the fabric of the Irish-American experience.

As I traveled with the book, I met an amazing array of artists and writers–Irish and otherwise. They are involved in unearthing, exploring and celebrating the rich and hidden histories of immigrants, slaves and working people whose labor, sacrifices, songs, stories and aspirations, though often given scant attention in official accounts, have enriched our country beyond all measure.

The night before “Banished Children” came out, I met Tom Flanagan at the Madison Avenue Pub for a celebratory drink. As well as being a master novelist–his “Year of the French” is, in my opinion, among the greatest historical novels ever written–Tom was a friend and mentor. He toasted the future. “Don’t be surprised,” he said, “at how far your banished children will travel and, if you’re lucky, at all the friends they’ll bring home.”

Tom was a prophet as well as a teacher.

(This essay was published in the 3/14/14 edition of the Irish Voice)



January 13, 2014

In my experience, most novelists have tried and failed at one profession or another before they turned to fiction writing. I failed at several. High school teacher. Court officer. Wall Street messenger. Historian. Alas, the list is long and sorrowful.

When I first took up writing, I aspired to be a poet not a novelist, but I failed at that too. Maybe that’s why I have such admiration for poets. I know how hard it is to succeed at producing a single worthwhile poem, never mind to do it year after year.

Except for an occasional foray undertaken as a private exercise and not an attempt to redeem my former failure, I no longer write poetry. But I continue to read the work of poets I admire, the famous (Yeats, Auden, Heaney, et al.) and the not so famous (Angela Alaimo O’Donnell is a favorite).

Recently, I’ve found myself making repeated visits to Daniel Thomas Moran’s most recent book of poems, A Shed for Wood (Salmon Poetry, 2013). Moran has made his living as a dentist, a trade marked by ruthless practicality and a prosaic focus on the material and mechanical–drill bits, needles, pliers, braces, bridges and the growing armory of hi-tech devices to prevent, remove and replace the ravages of routine and inevitable decay.

In essence, dentistry has always seemed to be the polar opposite of poetry. Certainly, there have been medical doctors who’ve excelled at poetry. The American poet William Carlos Williams comes immediately to mind. But dentists? In my prejudiced view, dentists have always been to doctors what plumbers are to architects, mechanics rather than artists, their expertise necessary and useful but lacking the holistic vision and wider understanding that we expect (if rarely encounter) among physicians.

Moran has forced me to re-examine that prejudice. His poetry is grounded in everyday realities as common and unromantic as canines and molars. But like the master dentist he is (Moran has been a private practitioner as well as a professor of dentistry at Boston University), he constantly probes, exposes, drills deep, undeterred by surfaces.

For me, Moran’s verse combines elements of my favorite triumvirate of American poets–Emily Dickinson, Walt Whitman and Robert Frost. It is earthy, unpretentious, accessible, agnostic, sometimes comic, often serious, frequently both, rooted in the ordinary–mayflies, horseshoe crabs, sparrows, tumbled stones and treetops–yet capable of delivering a jolt of understanding as sharp and sudden as when a dental drill strikes an unanesthetized nerve.

I’ve been keeping A Shed for Wood beside my bed. I read a few poems each night. I mull their insights and their meanings. Moran and I differ in our worldviews: he, a stalwart unbeliever; I, an incurable adherent of the creed. But the wisdom in his poems transcends such boundaries. On my way to sleep, I embrace the poet’s invitation to go “Where we can be with our aloneness / at rest with its bottomless still / and inhale the life which inhabits us.”

Moran is a favorite of several prominent writers, including the late Samuel Menashe, a poet of the first rank and the first to be honored with the Poetry Foundation’s “Neglected Masters Award.” Yet despite this, and despite the fact that he’s been accorded a number of honors–including a stint as the poet laureate of New York’s Suffolk County–Moran’s work, in my view, has never come close to receiving the attention it deserves.

Moran now lives with his wife Karen in the New Hampshire woods. I’m not sure if he still practices dentistry, but as A Shed for Wood makes clear, he continues to practice poetry at the highest level, turning out poems that serve as a source of wonder, enjoyment, enlightenment, and laughter.

You lovers of words, do yourself a favor: Neglect him no longer.

A Shed for Wood is available on Amazon.


The Thrill of the Trilogy

November 4, 2013

My introduction to the triune came early. Each morning as my classmates and I made the sign of the cross, my first-grade nun stressed that the Trinity–one God in three separate and distinct persons, Father, Son and Holy Ghost–was essential to our faith and, ergo, to our salvation. Since my six-year-old brain couldn’t make much sense of it, I was happy to be told the three-person God was a mystery beyond human understanding and had almost driven mad the theologians who’d tried to solve it.
Still, it stuck. Three in one, one in three. The holy trifecta. In the large stained glass window on the south wall of our Bronx parish church, St. Patrick held up a shamrock. One stem, three petals: They glowed a single emerald green as the sun lofted behind them. For that moment at least, the riddle of the Trinity ceased to bewilder.
Over the years, as I wandered amid the thickets of secularity, I learned that, as well as a marker of religious dogma, three brought to whatever it was associated with a special aura, whether exciting (Triple Crown), silly (Three Stooges), erotic (ménage à trois), scary (Third Reich), exceptional (triple play), or sad (strike three). Just by being three, ordinary things gained a special cachet.
When I set out to become a writer of books, I imagined one would suffice. A historian manqué, just shy of a Ph.D., I first stumbled into speech writing. I decided to try it for a year, save enough to go back to school, finish the dissertation, and turn it into a book. “The best-laid schemes o’ mice an’ men,” 
as Scottish poet Robert Burns put it, “gang aft agley.” I ended up scribbling for two New York governors and five chairmen of Time Inc./Time Warner across a span of three decades.
On the plus side, my job involved indoor work and required no manual labor. It paid the mortgage and tuitions, and included a defined benefit plan; on the minus, it was frequently stressful, sometimes grinding and always anonymous. Occasionally a speechwriter or two has slipped from behind the curtain and gained fame crafting words for mouths other than his/her own. But as I saw it, once you take the king’s shilling, you do the king’s bidding, and whatever praise or blame ensues is the sovereign’s alone.
As time went on, I felt a growing need to put my name on words I could publicly claim as mine. I got to my office two hours early in order to attempt a novel. Having grown used to churning out large chunks of copy in short amounts of time, I calculated I’d have a finished manuscript in a year or two. Robert Burns proved right again. Ten years later, I left the delivery room cradling my long-gestating mind child, Banished Children of Eve, a six-hundred-page saga of Civil War New York.
The first agent I submitted it to was dismissive. I hadn’t written one novel, she wrote, but “sausaged three in one.” I was stung. Yet the more I thought about it, the more I realized its truth. My novel was the story of Irish famine immigrants, the frightening, fecund mongrel world of mid-19th-century New York, and the impact of the Civil War. These were the three petals. Minstrel-songster Stephen Foster was stem and sausage skin. His music is the book’s leitmotiv. There are worse things to be accused of, I decided, than being a Trinitarian. I stuck with three in one, and that’s how it was published.
I drew a great deal of satisfaction from at last having my name on writing all my own, so much so that I decided one wasn’t enough. I had other stories I wanted to write. Faced by commercial constraints as well as those of my own mortality, I knew the next had to be shorter. Unfortunately, hard as I tried, I couldn’t get the hang of the short form, which required the precision of the pointillist. I preferred the Jackson Pollock school, buckets of paint splashed across expansive canvases.
With the second novel, I decided to reverse the first: In place of three packed in one, one would be divided in three. The stem I started with was Fintan Dunne, Irish-American ex-cop and private eye, a veteran of World Wars I and II, whose formal education ended in the Catholic Protectory, an orphanage cum reformatory in the Bronx. In hardboiled style, Fin is a man who, if he ever had any illusions about human nature, had them kicked out of him so long ago he can’t remember what they were.
Fin is what the writer William Kennedy calls a “cynical humanist.” Distrustful of all authority, skeptical of most causes, uninterested in heroics, he is reluctant to get involved. Whatever the case, he knows from the outset that there are no perfect endings, no spotless souls, and that some mysteries are better left unsolved. Still, despite his understanding of the futility of good intentions and the hopeless fallibility of everyone–including himself–Fin can’t help but try to see that some modicum of justice is done.
I followed Fin as he fought with eugenicists and fifth columnists (Hour of the Cat), wrestled with the still-unsolved case of New York’s most-famous missing jurist (The Man Who Never Returned), and burrowed into the Cold War’s intricate machinations and betrayals (Dry Bones). I’ve seen the city and the world through his eyes as he experienced two world wars, the Great Depression and the gloom-and-boom of the Eisenhower era, the rollercoaster years W.H. Auden accurately labeled “The Age of Anxiety.”
I’m grateful for our three-legged journey. Fintan has been great company every step of the way. Now that we’ve finished our last caper and said our goodbyes, I’m hopeful that I’ve told his story the way he wanted it told, and that the three tales together–separate and distinct yet parts of the same whole–capture him in a jaded emerald glow.


A View of My Own

October 19, 2013

For the first twenty-five years of my writing life, I wrote in my office, at my desk, five days a week, in the early morning. I almost never wrote at home or on weekends. When I got to work, I’d keep my door closed and, except on dark winter mornings, leave the lights off so no one would know I was there.

I mused about someday having a nook, a corner, a cave–some space–in which I could go and just do my writing, with no time limits or distractions. But anytime I was tempted to feel sorry for myself, I recalled an account I’d read by Alexander Solzhenitsyn of his time in the Gulag.

Denied pen or paper, and facing severe punishment if he were caught writing anything–never mind a fictional account of life in the camps–he used the burnt tips of matches and toilet paper to write in whatever moments of solitude he could snatch for himself. That’s how he completed the manuscript of “One Day in the Life of Ivan Denisovich.”

Each morning, Solzhenitsyn remembered, as he and his fellow prisoners mustered in the freezing, Siberian dawn to be counted and dispatched to do heavy labor, the loudspeaker blared patriotic songs or official propaganda.

One day, however, it played a radio program from Moscow on “The Writer’s Life.” The first thing a writer should do, the announcer intoned, was to secure a comfortable place that was his alone. His desk should be uncluttered and the research and books he needed carefully catalogued and shelved. Quiet was crucial, although it was permissible to have classical music playing softly in the background.

“Now,” the announcer said, “you will be ready to begin the work of writing.”

We know what happened to Solzhenitsyn. He survived the camps. His manuscript was circulated in private and eventually published. He went on to write a series of epic novels and “The Gulag Archipelago,” an exhaustive and influential account of Stalin’s far-flung network of slave-labor camps. He was awarded the Nobel Prize for Literature.

But what, I wonder, happened to all those writers listening to the same broadcast he had, those who found their quiet, secure place, the uncluttered desk, classical music playing in the background?

I now write at home, in Hastings-on-Hudson, in an office I had built on the top floor. (The contractor was Kevin Groves, who’s also the main contractor for the ongoing restoration of the Tenement Museum, on Orchard Street, not far from where my great-grandparents and grandparents lived.)

It’s a dream space, with lots of bookshelves, a couch, an easy chair, a capacious desk, and, mirabile dictu, a view of the Hudson River and the Palisades.

I’m delighted and grateful to have this room of my own and the view of the Hudson Valley. But my writing hasn’t improved. I don’t write any faster or any better. I’m not sure what the lesson in all this is. Certainly, I wouldn’t want to find myself in Solzhenitsyn’s predicament, or back at my old desk in Time Warner.

Still, I think it’s true for anyone serious about being a writer: if you can’t write in the place you want, then write in the place you are.



October 13, 2013

I’ve been brooding more than usual lately because I’m sailing in that Dead Sea of having a book about to come out and facing the daunting prospect of starting another. (“Some things in life get easier,” William Kennedy once told me, “but novel writing isn’t one of them.”) The more I think about it, the more I want to take a nap.

How, I wonder, did writing ever acquire an aura of romance and adventure? It’s lonely, isolating, exasperating, a commitment of several years with no sure payoff (indeed, maybe a rejection) at the end. Yet for me the only thing more painful than writing is not writing.

“The Raymond Chandler Papers: Selected Letters and Nonfiction, 1909-1959,” which was bestowed on me through the blessed benevolence of Joe Goodrich and Honor Molloy, is helping me navigate these troubled waters.

To a teacher in New Jersey who wrote him in 1946 asking for advice to give his pupils, Chandler sent this terse reply: “The people whom God intended to be writers find their own answers, and those who have to ask are impossible to help. They are merely people who think they want to be writers.”

Chandler is the enemy of illusions. He constantly stresses the hard work involved in writing. The writer’s job, as he makes clear, is to show up: “The important thing is that there be a space of time, say four hours a day at least, when a professional writer doesn’t do anything else but write. He doesn’t have to write, and if he doesn’t feel like it, he shouldn’t try … But he is not to do any other positive thing, not read, write letters, glance at magazines, or write checks. Write or nothing … Two simple rules, a. you don’t have to write; b. you can’t do anything else. The rest comes of itself.”

I had just read this when I stumbled on the late Elmore Leonard’s “10 Rules of Writing.” Leonard was a terrific writer, and I agree with his rules (mostly). Yet if you follow them strictly, the danger is you’ll end up sounding like Leonard, and if you can’t develop a distinctive voice–a signature style–why try to be a writer in the first place?

I was once on a panel with a writing teacher who pontificated on his “10 Pillars (no mere rules for this professor) of Good Writing.” I listened with silent skepticism to the first few–“good writers write in complete sentences” (tell that to Joyce, Faulkner, et al.); “good writers know their audience” (writers can never be sure of who their audience is/will be. À la Socrates, all they can know is themselves). I completely tuned out when he admonished that “good writers always start with an outline of what they want to say.”

I first ran into that rule in high school, and in 30-plus years as a speechwriter/novelist I’ve found it to be less a pillar than a brick wall. Devising and adhering to an outline is like trying to diagram a sentence before you write it. Writing, whether fiction or nonfiction, is storytelling. A story grows out of itself, reveals itself in the telling, unfolds truths and nuances that are invisible until, in his or her wandering and wondering, the writer discovers, unearths, stumbles upon them. Writing is exploration–finding, losing, re-finding your way–and not mere map reading.

When it came my turn to speak, I said that I didn’t know much about pillars but as a devotee of naps, I’d suggest four pillows for writers to sleep on: 1.) Write badly. Don’t let the editor in your head take over until you first get down on paper some version of what you want to say. All good writing is re-writing. 2.) Write on schedule. Have a set time when you show up at your desk/laptop to write. 3.) Be a fanatic. Never give up. When you reach what seems a dead end, brood on it. Yes, a mixed metaphor: brooding on dead ends. Yet in my experience, if you persist the egg will hatch and the dead end prove to be a pathway. The writing will reveal what you need to keep writing. 4.) If these rules don’t work for you, invent your own.

This morning I showed up dutifully at my desk. I tried to begin the new novel. I fiddled around, stared out the window, sighed, penned a sentence or two, crossed them out, petted the dog, fretted that maybe I don’t have another novel in me or–if I do–lack the stamina and drive to get it out, then I went for a jog and took a nap.

In the end, to paraphrase a writer who broke all the strictures, to thine own rules be true. A writer must rule over his/her own material, not be ruled by someone else’s rules.

For my part, I’ve found no rules or commandments or laws worth a damn other than this: do what you can today and get up tomorrow and try all over again.



October 9, 2013

Today, I met a friend for lunch (a wonderfully generous, entirely successful friend). He invited me to the 21 Club, one of the city’s true hoity-toity watering holes and prominently featured in the 1957 classic “Sweet Smell of Success,” with Burt Lancaster as Broadway gossip columnist J. J. Hunsecker, and Tony Curtis as press agent Sidney Falco, and a script by Clifford Odets. (If you haven’t seen it, go to Netflix immediately.) Anyway, I’m very familiar with the area, having worked for 20-odd years (some years odder than others) at the Time & Life Building on 6th and 51st and the Time Warner Building at 75 Rock. At least I thought I was familiar until I reached 52nd and 6th and looked up at the street sign. Holy heart failure, Batman, this is what it says (and I’m not making this up): “Avenue of the Americas (a bullshit name if there ever was one!), 6th Av., WC Handy Place, WCBS Way/Proud Sponsor Of Bike New York, and Cousin Bruce Way.” On the east side of the street, there’s an additional name: “Swing Way.” My first thought was where the hell am I? My second: In a name-off gang bang like this, how did Donald Trump miss out on pasting up his name? The “Avenue of the Americas” moniker, of course, is one of the oldest tricks in the con game that defines Manhattan real estate: Tear down the El, drive out the working poor, and call 6th “Avenue of the Americas,” and 9th “Columbus,” and 4th “Park,” yada yada. But, really, come on, folks, even in New York there have to be limits, no? I mean six street signs on one lamppost? One intersection with six names: “Avenue of the Americas, 6th Av., WC Handy Place, WCBS Way/Proud Sponsor Of Bike New York, Cousin Bruce Way, and Swing Way”? (And, if you didn’t grow up in the city in the 60s, who the hell knows who Cousin Bruce is/was?) I mean, put yourself in the shoes of a tourist from Tasmania or Topeka who’s just arrived in New York and is standing on 6th Avenue (or, more accurately, The Boulevard of U.S. Intervention in Latin and Central America in the Interests of Supporting Dictatorships Friendly to American Corporations) trying to figure out where the hell he/she is? Note to the next mayor: Let’s cut the bullshit and get back to basics. Jeez Louise, enough already, this town is confused enough as it is without sticking six names on one place.