My non-fiction family memoir concerns an evolving planetary catastrophe. In the year 2000, chemist Paul Crutzen and biologist Eugene Stoermer began to popularize the neologism ‘Anthropocene’ to describe a human-induced geological event in the modern era. Greenhouse gas concentrations increased with human activity in the latter part of the Stone Age, but the rapid acceleration of consumption of fossil fuels as a source of energy has now sparked an environmental crisis. To address it, we must look back in time. Steam power and internal combustion technologies changed the land and oceans in a manner that dwarfed all previous domestic uses of fuels and mineral resources. The loosely defined geological Anthropocene denotes a profound transformation of society and the environment that portends either a new age of liberation and enlightenment, or the dissolution of civilization.
Climate heating caused by greenhouse-gas pollution is our immediate concern. This memoir traces my family’s connections to the history of harnessing fossil fuels. My great-grandfather Andrew was a coal-burning railroad engineer. Grandfather Frank was one of the first employees of General Motors. My uncles promoted the automotive and passenger airline industries during those revolutions in transportation. My parents chronicled the political and social strife that arose in the course of this worldwide transformation. Now, the condition of terra firma is my existential inheritance as a planetary scientist.
  • The Eastern Oregon desert, photographed the day of the solar eclipse of August 21, 2017.

    Summer Lake and the adjoining Paisley caves in Eastern Oregon were inhabited 14,000 years ago, then flooded by Lake Chewaucan at the end of the ice ages. This mud flat is a type locale of the Anthropocene event as humans began a global transformation.

    Foreword


    To surmount the impending climate catastrophe, we must understand its history. In the year 2000, chemist Paul Crutzen and biologist Eugene Stoermer began to popularize the neologism ‘Anthropocene’ to describe a human-induced geological event. The growth in the atmospheric concentrations of several greenhouse gases may have begun with human activity in the Neolithic, the latter part of the Stone Age, but global climate heating is now clearly associated with the massive consumption of fossil fuels as a source of energy. There are many technical details, which the reader is free to skip. Steam power and internal combustion technologies changed the land and oceans in a manner that dwarfed all previous domestic uses of fuels and mineral resources. The loosely defined geological Anthropocene coincides with a profound transformation of society and the environment that portends either a new age of liberation and enlightenment, or the dissolution of civilization.

    Climate heating caused by greenhouse-gas pollution is our immediate concern. This memoir addresses it through my family’s connections to the history of harnessing fossil fuels. My grandfather and great-grandfather worked on the railroads, and Grandfather Frank was one of the first employees of General Motors. My uncles launched and promoted the automotive and passenger airline industries during that revolution in transportation. My parents chronicled the political and social strife that arose in the course of this worldwide transformation. Now the condition of terra firma is my personal inheritance as a planetary scientist, but I am not the first to look at it from a family perspective.


    The context of human history is primeval. John McPhee coined the phrase “deep time” to describe our planet’s geological evolution in his collection Annals of the Former World (1998). The descriptions of nature in the fictionalized diaries of Wallace Stegner’s Angle of Repose (1971), and the non-fictional, measured geological prose of Marcia Bjornerud’s Timefulness: How thinking like a geologist can help save the world (2018), also take the reader through deep time. The climate scientist David Goodrich crisscrossed the continent in epic bicycle journeys to examine the ancient origins of fossil fuels in A Voyage Across an Ancient Ocean (2020) and their ecological consequences in A Hole in the Wind (2017). The journalist John Vaillant in Fire Weather (2023) described the onset of epic wildfires, such as one that took place, ironically, in the far northern Alberta bitumen oil extraction districts, as the hallmark of the “Petrocene” era.

    The memoir begins with my parents meeting at a lecture by the daughter of Marie Curie, discoverer of two radioactive chemical elements. When my parents went to school there were 92 known elements (my grandparents had only 64). In the decades since, 26 more have been discovered or created. Many of the elements beyond uranium were discovered in fallout from the explosion of thermonuclear bombs. The fictional minerals and elements in popular culture with mystical properties—kryptonite, dilithium, vibranium, unobtanium—cannot compare with the miracle of chemistry on Earth and in our Solar System, the elements that gave rise to life. Modal and isotopic measurements of these elements show that the land we live on and the air we breathe are limited resources. In this knowledge lies our ability to determine our future.

    As David Simon and Ed Burns said when they ended The Wire, their HBO series set in Baltimore, “if people haven’t got it by the end of the 5th season, they never will.” This is a story about the rise and fall of the current social system. It involves planetary evolution, social evolution, religion, love, cruelty, severe weather, catastrophic climate change, old photographs, and fashion. Hopefully you will get it.

Introduction

In the early spring of 1940, Doris McGlone, a university coed in Ann Arbor, Michigan, was attending a lecture by Ève Denise Curie. Ève had recently published a biography of her Polish-French mother Marie Sklodowska Curie, who with her husband Pierre had isolated the radioactive elements polonium and radium, launching a scientific revolution that ushered in the modern world. Seated next to Doris in the balcony, a man leaned over and whispered, “Is that a Schiaparelli gown she’s wearing?” That was the opening line of William Neumann, the man who was to become her husband and, seven years later, my father.

Doris didn’t know the answer, but she was flattered that this fellow thought she was the kind who would know. War clouds were gathering over France and Ms. Curie was lecturing on “French Women and the War”. Mme. Elsa Schiaparelli, the clothes designer, had just fled Paris and was likewise touring the U.S. As she was planning her tour, Ève had scandalized Schiaparelli by assembling no special wardrobe. Schiaparelli told her “it would be ridiculous for you to appear in pre-War costumes!” So, the publicity-wise couturier created gowns for her—among other things, a black oilskin coat lined with fluffy lamb’s wool and equipped with huge pockets—just the thing for a Paris air raid, or lecturing in Michigan. Unlike her mother, who usually wore simple, black dresses, the musician and wartime journalist Ève “always cared about smart clothes, wore high-heeled shoes and make-up, and loved shining at parties.”

Doris and Bill began dating, as she put it, “at the dangerous time of lilacs in spring.” Radioactive decay allows us to date the appearance of the woody, flowering plants from which the fragrant lilacs descended, and also made possible scientific dating of the human species and its effect on the land.

Spring 1940 was but one season. Human history spans many millennia, and hopefully many more generations. Geologists debate whether to formalize the modern period as the Anthropocene, an event in the current geological epoch, the Holocene. The salient and undeniably human aspects of the Anthropocene are the now-leveled mountains that once covered vast coal formations, the pollution of the atmosphere and oceans caused by the burning of hundreds of millions of years of fossilized plants at an unprecedented rate, the radioactive traces of the Atomic Age, and the Space Age satellites that have explored the entire Solar System. The growth of urban populations and the rapid increases in water and energy consumption by modern transportation and industrial agriculture are visibly changing our planet’s climate. These changes lead to a greater prevalence of hurricanes, floods, and heat waves with their brutal effects. To chalk them up simply to human activity obscures, of course, the complexity of interaction between nature and the social organization of our now-dominant species.

Rapid climate changes and species extinctions, perhaps ultimately our own, certainly deserve a scientific name. Petroleum-based microplastics, abandoned man-made satellites, and traces of “forever chemicals” are markers of this event that will be seen long in the future. To create a new unit of geological time may represent a conceited view of our importance in the history of the Earth, but with such conceit perhaps we shall recognize what needs to be done before we too vanish.

Doris’ father Frank McGlone kept journals, beginning in 1886, and through good fortune many of the yearly volumes were passed down to his daughter. They told of his farming and railroading life in the rural Midwest and his intermittent work in industry during Mark Twain’s Gilded Age, the post-Civil-War era of materialistic excess and political corruption accompanying the growth of industry and westward expansion. They chronicle a few years Frank spent as a lay minister for the Moody Bible Institute, and conclude with his employment in the newly-formed General Motors colossus. Frank and his sons built the automobiles that made Flint, Michigan the Vehicle City, made the paints that covered the cars, and helped start the passenger airline industry in Chicago.

Frank’s daughter Doris was too involved in teaching and travelling to write a memoir. She kept only some of her many letters. After retirement, she was interviewed into a tape recorder about her childhood and family, high school years, college, engagement and marriage. As the youngest child of her family, she left those reminiscences, together with her father’s journals, to her children as a final lesson on the blackboard.

This memoir ties my family history to that of the Anthropocene crisis; to the discoveries regarding nature made by pioneering women and men; to the evolution of life on the planet; and to the prospects for our continued survival if we are brave enough to confront the crisis.

Deep Time: How Did We Get Here?

The Curie family’s research and an absolute geological time scale

Feel free to skip the details. Deep time ranges from 13.8 billion years ago (13.8 Ga) to the modern age of recorded history. Our memoir lies within the last hundred millionth of cosmic time. Radiometric dating, genomics, DNA analysis, paleoclimatology, and the contributions of women in the physical sciences connect us with our ancestry.


In biology as in societies, cooperation between individuals develops out of necessity to surmount environmental challenges. We will never fully know our ancestors, but as scientific knowledge has grown, so has our access to the past. Starting almost four billion years ago, some simple forms of life deposited a slimy and scratchy biogenic material that was fossilized as stromatolites in what would become rock. Colonies of bacteria or archaea, many of which are known as extremophiles (lovers of hostile environments such as hot springs or animal intestines), continue to extract carbon from the atmosphere and form microbial mats that become fossilized through the ages. The evolution of the multicellular eukaryotes (cells with nuclei) from which we are descended, and the oxygenation of the atmosphere starting 2.33 billion years ago (2.33 Ga) that made it possible for us to breathe and harness fire, are matters of ongoing study.

The precursors of those fragrant lilac bushes and the insects that pollinated them emerged several hundred million years ago, long before humans walked the Earth. Climatic swings from a snowball planet to tropical hothouse, punctuated by catastrophic events and mass extinctions, caused natural selection to proceed for billions of years. Modern hominids, beings who resemble us, emerged in the last 400,000 years, merely 0.01% of geologic time. Recorded history, which began during the Bronze Age ~4,200 years before present, is only a millionth of the time of Earth’s evolution, and Doris’s family tree can be traced back to only a tenth of that. To use a football field analogy of the Earth’s existence, history is but a grain of sand, and our lifespans are the size of the tiniest bacterium. We must digress into astrophysics, paleobotany and archeology to cover the other 99.99999% of that time.
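The football-field analogy can be checked with a bit of arithmetic: scale Earth's 4.5-billion-year history onto 100 yards and see how large each interval appears. A back-of-the-envelope sketch (the 80-year lifespan is an assumed round number):

```python
# Scale Earth's history onto a 100-yard football field and compute the
# length corresponding to hominid emergence, recorded history, and a lifetime.

EARTH_AGE_YEARS = 4.5e9
FIELD_LENGTH_M = 91.44  # 100 yards in meters

def scaled_length_m(interval_years: float) -> float:
    """Length on the field corresponding to a time interval in years."""
    return FIELD_LENGTH_M * interval_years / EARTH_AGE_YEARS

print(f"Modern hominids:  {scaled_length_m(400_000) * 1e3:.1f} mm")
print(f"Recorded history: {scaled_length_m(4_200) * 1e6:.0f} microns")  # sand-grain sized
print(f"One lifetime:     {scaled_length_m(80) * 1e6:.1f} microns")     # bacterium-sized
```

Recorded history indeed comes out around 85 microns, a fine grain of sand, and a single lifetime under 2 microns, the size of a bacterium.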

Time: a really neat concept, but hard to pin down. Time is what clocks measure. In quantum mechanics time is reversible, in thermodynamics it has a forward direction, and in cosmology it has something to do with the expansion of the universe, currently accelerating for unknown reasons, starting with an extremely hot, dense, and imponderable soup about 13.8 billion years ago pejoratively called “the Big Bang”. The astronomer Fred Hoyle coined this term in 1949, but such a singular event was originally described as a consequence of Einstein’s general relativity by a Belgian priest, Georges Lemaître, in 1927. Unanswered questions abound in cosmology, laughably asked as ‘what went bang, where did it happen, what was it before it went bang,’ etc., but the only way to really grapple with such questions, as Nobel prizewinning astrophysicist and cosmologist John Mather says, is to build instruments and measure things. And so, after measuring the cosmic microwave background, the relic radio signal of the hot early universe, his James Webb Space Telescope with its honeycomb array of mirrors is now seeing stars and planetary systems like ours popping out of clouds of dust, generations after the first primordial stars existed and exploded over 13 billion years ago.

Cecilia Payne, a scholarship student from the U.K., hung out nights at the Harvard observatory. She had told Sir Arthur Eddington of her aspiration to be an astrophysicist. He gave her a list of books to read, to which she replied that she had read them all, and was tackling the astrophysical journals in the library. She showed spectroscopically in 1925 that the bulk of the visible universe that resulted from this early event is composed of hydrogen and smaller amounts of helium, neither gaseous element being sufficient to support life. Her thesis examiners required her to state that her results were “almost certainly not real”, since the astronomical establishment had already decided that the composition of the Sun and stars was similar to Earth’s. It was four and a half years before the conservative Harvard faculty accepted her discovery, and another three decades before Cecilia became a professor of astronomy. Payne-Gaposchkin’s portrait now hangs in Harvard’s faculty room.

Cecilia Payne-Gaposchkin, Harvard Portrait Collection

Thirty years later, building on her wartime research and fighting sexism in the sciences all the while, Eleanor Margaret Burbidge and coauthors showed that astrophysical reactions in the beautifully glowing stars of the night create the carbon in our proteins and the iron in our blood. Planetary scientist Carl Sagan described these “metals” as “star stuff”, the precursors of planet Earth. These gaseous clouds and grains of dust that formed our solar system only exist because of the explosions of nearby stars in our galaxy, some more than twice as old as the Sun.

The aforementioned discoveries reflect the increasing role of expatriate women like Marie Curie (and her Nobel-prizewinning daughter Irène) in the physical sciences during the Anthropocene. The educational advancement of women in the 19th and 20th centuries was undoubtedly aided by the advances in productivity entailed by the industrial revolution and its commodity mode of production, and the ease of travel achieved by railroads and steamships. After Cecilia Payne’s characterization of the bulk of the matter that makes up the visible universe, the work of one of the Burbidges’ collaborators, astronomer Vera Rubin, also initially barred from admission to graduate schools and observatories on account of gender, led to the groundbreaking discovery that only a small fraction of Newton’s gravitational mass in the universe is observable as stars and dust. Mysterious “dark matter” and “dark energy” comprise the rest of the universe.

Planets have formed throughout most of the universe’s 13.8-billion-year history, leaving open the possibility for the existence of ancient life in the Galaxy, but the Earth-sized planets of old, metal-poor stars such as Kepler-444 are unlikely to have our rich, fertile soil. Besides needing the organic compounds of hydrogen, carbon, nitrogen, and oxygen, major nutrients of phosphorus, potassium, sulfur, calcium and magnesium are vital, usually together with the silicon that provides the scaffolding for minerals and rocks. Traces of boron, chlorine, copper, iron, manganese, molybdenum, and zinc are also needed by plant life. Chromium, cobalt, strontium, nickel and vanadium are considered beneficial in trace amounts. While ubiquitous sodium is only required by some plants, the nervous systems of animals would not be possible without it, and some animals require traces of fluorine, iodine, bromine and even arsenic and selenium for health. Of the lighter elements created by early stellar nucleosynthesis, which comprise the first four rows of the periodic table, only the inert gases and six other elements, beryllium, aluminum, scandium, titanium, gallium, and germanium, are apparently not needed by any living thing. Of the heavier elements in rows five and six of the table, beyond the molybdenum and iodine already mentioned, only tin and lanthanum are considered essential micronutrients; none of the “precious” metals in these rows are needed for life. But where would fashion be without the glitter of silver and gold, platinum and rhodium?

In the most recent 0.1% of Earth history, since the end of the Miocene, “deep time” proxies suggest a climate that was much warmer than today. Trees grew in Antarctica, while grasslands expanded in former forests. Large-skulled, jutting-chin primates, known as hominids, differentiated and evolved through untold generations. The species we call Homo sapiens migrated out of Africa across several continents and eventually replaced the toothy, long-faced Denisovans and the technologically advanced Neanderthals of Eurasia. Early humans survived over the course of repeated cycles of glaciation and subsidence, sea level transgression and regression. Land bridges formed during these glacial periods, allowing migration to Pacific islands, Australia, and over the Bering Strait to the Americas.

As primates descended to grasslands from trees in response to Miocene climatic changes, arguably driven by plate tectonics and volcanic outpourings of greenhouse gases, they learned to walk upright. This freed their hands to use tools, allowed their skulls to expand, and larynxes to evolve. As their anatomy developed, so did the variety of sounds they used to communicate, precursors of monosyllabic speech and complex language. The use of fire and animal hides for clothing to ward off the chill dates back at least a million years. An omnivorous diet enhanced by cooking allowed more energetic activity than simple gathering of food. Hunting with spears hardened in flames, overcoming hunger and the ever-challenging forces of nature, these apes developed social organization, empathy, and emotions for millions of years before recognizably human skeletons evolved. They mourned and ritually buried their dead.

Fashion and Footwear


Ms. Schiaparelli’s devotion to fashion had its roots in climate variations. Clothing became widespread by 170,000 years ago, as evidenced by new types of body lice adapted to life in caves warmed by fire. During the ice ages (the Pleistocene epoch), an affliction known as “cold feet” was more than just a loss of nerve and refusal to proceed. The cobbler’s trade emerged. By 40,000 years ago the bones of the feet had evolved to adapt to wearing shoes and to allow a more confident step. No other species wears Birkenstocks. 

Evidence of sophisticated use of chert and bone-based tools suggests that humans had migrated as far as the North China plains and Australia by 40,000 years ago, their travel helped by lower ice-age sea levels. The furry skins of animals that cave people wore were blackened by soot to preserve them after being scraped clean. Among the flakes of chert tools are found layers of ocher. Presumably the red and yellow ocher-colored hides, as opposed to charcoal-tanned furs, were widely heralded as the new black.

The domination of Homo sapiens over Neanderthals and Denisovans in the Middle Paleolithic, 300,000–50,000 years Before Present (BP), is not Stanley Kubrick’s 2001 awakening story in the desert. It may have been a friendly domination, because “sharing DNA” has given humans fragments of genomes with origins in several hominid species. Such intermingling spanned continents and imparted medically important traits that promoted survival across repeated ice ages. Estimates of Denisovan DNA in Iceland of 3.3% and upwards of 5% in some Melanesian populations are evidence of widespread interbreeding between human species and, doubtless, of cultural and technological exchanges. Archaic technologies and human figurative art were developed by now-extinct hominids well before 40,000 years ago, and were later perfected in the Upper Paleolithic, presumably by Homo sapiens. Some of these fraternal species must have coexisted with anatomically modern humans, given the burial records found in a few well-preserved cave sites. While humans and Neanderthals may have coexisted peacefully, it is also possible that the early humans stole their technology, clothes and jewelry and then ate them.

Signs of modern behavior emerged at the end of this Middle Paleolithic period—fishing as a livelihood, stone blades and spears, kitchens, needles, thread, and above all, fashionable furs. Art forms and ritual burials attest to the evolving human traits of abstract thinking and love. Musical flutes, evidenced by appropriately spaced holes drilled in the bones of cave bears, are found where these hominid species lived in Europe. Our ancestors survived the most recent ice ages that began roughly 33,000 years ago when glaciers covered much of North America and Europe, and their numbers grew, but we will never hear their songs nor feel their emotions.

Radiometric Dating


Radioactivity was discovered almost accidentally by Henri Becquerel, who later shared the Nobel Prize with Marie and Pierre Curie for their characterization of radioactive elements. That discovery set off the chain of events leading to the fateful lecture in Michigan where my parents Doris and Bill met, and it is one of the many scientific revelations of the Anthropocene. And that made my life possible.

The Curies’ discovery of chemical elements that spontaneously give off energetic rays resulted in Ernest Rutherford’s exponential decay law, whereby a constant decay rate (half-life) characterizes nuclear phenomena. That law in turn gave rise to a surge of discoveries regarding ancient history and the age of the Earth. Bertram Boltwood (1906) found that the ratio of radiogenic lead to uranium provides a method of dating rocks. After many decades, the study of radioactive decay chains addressed the vexing inconsistencies between the rather brief Biblical time scale, Newton’s and Kelvin’s somewhat longer estimates of cooling times, and the evidence found by geologists and evolutionists that the Earth was vastly older than allowed by scripture or Lord Kelvin’s geophysics. Slowly decaying primordial elements such as uranium and thorium revealed the true age to be four and a half billion years, give or take a few million.

The most important techniques for dating materials came from the atomic physics laboratories in the decades following Ève’s pre-war lecture in Michigan. Cosmic rays create unstable nuclides such as carbon-14 (14C) and 10Be, with half-lives ranging from 5,700 years to millions of years. These radioisotopes are deposited in rocks exposed at the surface and absorbed by living things. Their decay provides enduring chronometers for pre-historic times and absolute ties between stratigraphic records. Indeed, post-WW2 archaeology and climatology underwent a complete renaissance when these durable (but less precise) isotopic models extended chronologies backwards in time from the precisely dated record of growth rings of trees. Isotope systematics provide the ages of sections of oceanic drill cores, such as the ones on display at the Lamont-Doherty campus of Columbia University, and thereby date prehistoric climate events, such as those on the European continent, repeatedly covered by ice sheets. (Insert image of drill core)
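Rutherford's exponential decay law can be put to work in a few lines: given the fraction of 14C remaining in a sample, the age follows directly from the 5,700-year half-life cited above. A minimal sketch (the function name and sample fractions are illustrative):

```python
import math

C14_HALF_LIFE_YEARS = 5_700  # half-life of carbon-14

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years, inverting the decay law N(t) = N0 * 2**(-t / T_half)."""
    return -C14_HALF_LIFE_YEARS * math.log2(fraction_remaining)

# One half-life leaves exactly half the original 14C:
print(radiocarbon_age(0.5))          # 5700.0
# With only 1% remaining (over 99% decayed), we approach the method's limit:
print(round(radiocarbon_age(0.01)))  # roughly 38,000 years
```

The second case shows why radiocarbon dates beyond about 40,000 years grow fuzzy: almost all of the original 14C has already decayed, leaving little signal to measure.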

Growth rings in living and ancient pieces of wood can now be stitched together to track climate history for 13,910 years. Along with 14C and other radioactive elements, there are stable heavy isotopes of hydrogen and oxygen found in organic molecules and oceanic water. Fractionation of isotopes is a feature of the water cycle. Lighter isotopes evaporate more readily from the ocean and heavier ones are lost preferentially by precipitation. The rise and fall of sea level as temperatures change alters the concentration of these isotopes in glacial ice. These subtle variations in concentration are recorded in lake sediments and ice cores as well as tree rings, and allow us to infer prehistoric temperatures.

Temperature in Greenland over the last 20,000 years, reconstructed from ice cores. The Last Glacial Maximum (LGM) was a prolonged cold period during the Pleistocene epoch that followed the Eemian warm period more than 120,000 years ago. Sea levels rose intermittently as ice sheets broke off, creating sediment layers of continental debris (Heinrich Events). Many such episodes culminated in the transition to the current Holocene epoch (red arrow). Agriculture and animal husbandry began in a period of quickly increasing temperature at the end of the Younger Dryas, followed by the long, steady warm period of the Holocene for the last 8,000 years during which civilization expanded. Platt, D. E. et al. (26 May 2019)

Part II Ice Ages, Climate Swings, and Human Migration


Social systems evolving with climate change. Fossil-fueled railroads develop along with the discovery of greenhouse gas warming. The crucial role of the atmosphere in the Earth Energy Imbalance.

Toward the end of the Pliocene epoch, the Earth was cooler than it had been for millions of years. Continental drift joined the previously isolated North and South American continents at the Isthmus of Panama and enclosed the Mediterranean Sea, changing ocean circulation patterns, making seasons more pronounced and drying the forests in which our precursor species lived. The onset of glaciation in the Northern Hemisphere marked the advent of the Pleistocene epoch, during which it is thought that our genus Homo emerged and walked upright. Climate alternated from glacial to temperate in 41,000- and 100,000-year cycles, tied in complex ways to plate tectonics, ocean circulation, and Earth’s orbital variables.

In the middle of one of these glacial periods, about 50,000 years ago, anatomically modern humans, Homo sapiens with a skeletal structure indistinguishable from present-day individuals, arrived in Europe, Australia, and elsewhere, and coexisted with Neanderthals and other hominins. Recent genomic studies suggest that interbreeding took place for millennia. Within 10,000 years, those other body types disappear from the archaeological record. Even with technological advances in sample preparation and analysis, the dates assigned to the limited evidence preserved are too fuzzy to resolve how rapidly this took place after more than 99% of the radiocarbon has decayed. As a final gift from the Curie family, the radioactive phosphorus atoms attached to human DNA molecules have given rise to the emerging discipline of evolutionary genomics, whereby correlation of genetic sequences provides statistical chronologies much further back in time.

Pleistocene

The Last Glacial Maximum (LGM) lasted until about 15,000 years ago (15 ka). Glaciers had slowly begun to recede in the Northern Hemisphere starting about 20 ka, uncovering Boston and Manhattan, New York. Melting of the West Antarctic Ice Sheet started between 14 and 15 ka, and within a few centuries of warming, major changes occurred in sea level. Greenland ice cores show an abrupt warming that marks the final stage of the Pleistocene “ice ages”. The epoch closed with what is known as the Bølling–Allerød interstadial, an abrupt warm and moist period that ran from 14,690 to 12,890 years before the present.

That Pleistocene warm period, similar to the tropical paradise in which we now live, was brief. A millennium or so later, a possible comet or meteor impact triggered what is known as the Younger Dryas climate shift. Temperatures fell, and glaciers again covered the British Isles. Many large mammals disappeared, unable to adapt to predators and changing habitats.

Holocene Transition

Another millennium passed before the ice age glaciers finally receded at the start of the Holocene Epoch, about 10 ka. Modern humans had visited the Americas well before this transition. They tracked the receding glaciers and retreating coastlines, having made use of the low sea levels to cross the Bering Strait along the Beringian Bridge. At times they ventured along the lush mountains of the North American Cordillera into what are now dry deserts. There is evidence of stoneworks in Mexico older than 30 ka, and human footprints discovered at White Sands, New Mexico, were left along the shores of Lake Otero at 22 ka. As the sea levels rose, the warmed, moisture-rich air quenched the dust storms that had clouded the atmosphere, making more of the Earth habitable.

Lake Bonneville in Utah overflowed and flooded the Snake River Plain in Idaho. The glacial Lake Missoula floods scoured the Columbia River basin. Lake Lahontan covered 8,500 square miles of Nevada, and a smaller land-locked Lake Chewaucan in southern Oregon filled to 125 m (400’) higher than its present desiccated remnant, Summer Lake. The Paisley Caves along its shoreline contain coprolites (fossilized turds) dated to more than 14,000 years ago. According to the DNA recovered from their poop, these humans subsisted on waterfowl, fish, and large mammals. They cooked and ate the flesh of now-extinct species of camels, horses, and bison on a small rock-lined hearth, surrounded by canines domesticated from wolves, not the present-day species but their close cousins, now extinct.

Silhouette of the author, 2017, at Summer Lake, Oregon (Lake Chewaucan). The Paisley caves in the adjoining Winter Range were inhabited by humans 14,000 years ago during the Ice Ages.
Humans leaving footprints



Beginnings of the Anthropocene

Primitive beings burning wood for a million years may have deforested large expanses of land but had only a tiny effect on the amount of carbon dioxide (CO2) in the atmosphere. Burial of organic matter underwater, and erosion of continents whose minerals combine with and sequester CO2, caused modest fluctuations in its concentration. For the last million years, atmospheric CO2 concentration varied between roughly 180–290 parts per million (ppm) as the Earth cycled through glacial cold and interglacial warm periods. Only in 1958 (as described in the Space Age chapter) did precise measurements of atmospheric CO2 begin and show how much humans can affect the atmosphere.

These early humans, however, had a major ecological impact on Eurasia and the Americas. Archaeology of Clovis-age settlements (11,500–11,000 years BP) shows that humans had transitioned from gathering nuts and slaughtering the now-extinct mastodons to herding sheep, practicing seasonal agriculture with domesticated grains, and making pottery for storage. Their tools evolved, and nomadic societies became settled communities. With granaries came verminous rodents, and cats proved convenient for controlling them.

Neolithic civilizations, marked by accumulation of grain and trade, required more efficient methods of transport than the backs of animals. They sealed their boats with naturally occurring asphalt bitumen. Ships could only navigate open waters and portions of rivers, until the development of pound locks in the 10th century in China and the 15th century in Europe. Despite the dangers of the sea and variable conditions of inland canal transport, floating vessels have always consumed much less energy than overland transport.

A seminal event was the invention of the wheel about 5,400 years ago, perhaps in the Indus valley in Harappan times, possibly for an oxcart in Poland or a wheelbarrow in Germany. The animal-drawn cart wheels left characteristic tracks that became roads. Oxen and horses could be relied on to draw carriages in the desired direction, later along wooden planks covered with metal sheaths to reduce friction. For heavier loads, steel wheels eventually replaced wooden ones. Five millennia later a steel flange was added to the wheel to keep transport units on track. Rail transport then became possible without animal guidance. The rubber pneumatic tires developed by the Scotsmen Thomson and Dunlop cushioned the ride of bicycles and motor cars, while asphalt and concrete paving led to the Autobahn, Interstate Highways, and drive-in restaurants with their mini-skirted carhops on roller skates.

Bronze Age

Indigenous people of the Great Lakes region mined copper in reduced (metallic) form 9,500 years ago to make implements. The people of the Old Copper Culture, finding stone tools superior to the soft pure metal, eventually abandoned copper implements. Fashion triumphed over function about 3,000 years ago; after that, early Native Americans used copper mostly for smaller, less utilitarian items of adornment, such as beads and bracelets.

In Cyprus (whose Greek name Kýpros was given to the metal copper) the practice of smelting copper ore with charcoal and umber to make armor began ca. 2760 BCE (Before Common Era). Transportation enabled the trade necessary to distribute minerals, and Neolithic people alloyed the softer native copper with other metals to make bronze tools, tablets and statues. Properties varied with the early mixtures of ores, used from 3500 BCE within the Vinča culture in southeast Europe, but alloying with tin proved superior to the toxic arsenic previously added to harden copper. Tin is rarely found in high concentrations and had to be imported from what is now Germany, and from Cornwall in the British Isles. Possibly my Teutonic or Celtic ancestors were diggers of minerals.

Little coherent record survives of the Bronze Age; its story is mostly told by surviving relics of precious metal. Urban social structures developed around the mining of ore, the possession of tools, and transport. Some of the glyphs of early forms of writing are still in use, e.g., the horns of an ox became the protrusions of the Aleph, or Alpha, or letter A.

Bronze weapons could quickly settle disputes over territory and food. Their possession often entailed the formation of armies and destructive warfare: the sieges, sackings, and razing of cities, with prisoners taken, that continue to this day. More powerful weapons such as the longbow made metal armor less effective in battle. The invention of gunpowder and rocketry in China altered the battlefield entirely. Chemical explosives rendered most forms of armor obsolete, but highly polished and jeweled helmets remain fashionable amongst nobility.

Slavery and the Rise of Philosophy

Along with metallurgy came increasing concentrations of resources and power. In due course, slavery emerged as a form of wealth accumulation. For example, chattel slavery was already written into law by 2100 BCE in Sumer, when King Ur-Nammu codified the long-standing Mesopotamian practice of human enslavement (ownership of labor power in its entirety, without compensation), which predates written records, along with laws concerning private ownership of land that made slave labor advantageous. The enhanced productivity of agriculture was furthered by improved tools, irrigation, and the use of domesticated animals such as horses, as well as slaves. In this era the great Stonehenge monument was erected in England, as were the great pyramids of Egypt. Unpaid conscription could not have provided both the immense skills and the million man-years required to build the Great Pyramid of Giza. The skilled laborers were honored by being buried alongside the tomb of their king.

Weights and measures, and then money, first in the form of representative shekels and later as minted bronze coin exchanged for goods and labor, emerged around this time, possibly to support these massive projects. The origins of such payments are lost to antiquity, but the evolution of currency as a representation of amounts of goods and labor eventually transformed human society. It took another few millennia to make human slaves a basic commodity of western colonialism, e.g., the trans-Atlantic trade that arose with the 16th-century settlement of plantations in the Americas by the Conquistadors and their gold-and-silver economies.

Communalism and ethnocentricity may date to pre-historic times, but it took the slave trade and Eurocentric thinkers to crystallize the identification of skin color with a bogus notion of racial superiority. The English colonization of the Americas began a holocaust that killed and enslaved tens of millions of Africans, whose misfortune was to be recognizably dark-skinned. Benjamin Franklin, the colonist and signer of the Declaration of Independence, looked down on the “Blacks and Tawneys”, although he also included Germans, Spaniards, Italians, French, Russians and Swedes in this category. The infamous Dred Scott v. Sandford ruling of 1857, written by Chief Justice Taney, held that black-skinned people were property, and that when the Constitution was framed, those with an African inheritance were regarded as “beings of an inferior order … and so far inferior, that they had no rights which the white man was bound to respect; and that the negro might justly and lawfully be reduced to slavery for his benefit.”

Two centuries ago, the state of Virginia formally outlawed the importation of slaves. Nearly a century later, Brazil became the last country in the New World to abolish the slave trade. Another century passed before the United Nations Educational, Scientific, and Cultural Organization (UNESCO) repudiated racism and affirmed the equal rights of all humans as one species.

Long before the 1978 Declaration, indentured servitude and the wages system had emerged. This market-driven, unstable enslavement precipitated the migration of many of our ancestors to a land of greater individual opportunity (for Europeans, at any rate).

Along with slavery and many centuries of wars between kingdoms and city-states arose a form of introspection now called philosophy. Religious scriptures collectively known as the Vedas and the Epics were recorded about 4,000 years ago, although precise chronological milestones are lacking. From ca. 500 BCE, Confucius, Mencius, Socrates, Plato, and Aristotle recorded their dialogues concerning the mind. Their ideas were not always popular with the priesthood, and those philosophers sometimes suffered accordingly. The important results of their explorations were the recognition of logic in its various forms, and the idealistic concept of a distinctly human nature, beyond the mortal being that we share with lower species.

Greek philosophers who were attuned to the prevailing order provided a justification for enslavement, saying that some [Athenian men at least] are by nature free, but the conquered tribes were “born slaves”. The debate over the goodness or badness of this “human nature” was the basis for much of the evolution of religion, as opposed to the animistic beliefs imputed to prehistoric cultures. This vague idea continues to be invoked to justify the exploitation of labor and the unsuitability of socialism.

I will skip over the subject of churches lest I too suffer the poison cup or the burning stake. Suffice to say, my paternal grandfather was raised in the Catholic church, while my mother’s antecedents were raised in those oft-warring sects of Christianity, euphemistically called “the Protestant Reformation”. Neither the Catholics nor Protestants were especially tolerant of dissent. Meanwhile, Islam coexisted with Judaism and became dominant in Persia, Syria, Egypt, North Africa and Spain, where during the medieval Dark Ages an Arab culture flourished and preserved the knowledge of the ancient Greek philosophers for a time. 

There are now sporadic signs of established religions tackling the pressing issues of the day. In 2023, in response to protests by First Nations people in Canada, the Vatican formally repudiated the Discovery Doctrine enunciated in papal bulls from 1452 that justified the previous century of colonization of Africa and West Asia. The doctrines issued over the next half-century authorized colonial powers such as France, Spain and Portugal to seize lands and enslave people in Africa and the “New World,” as long as the people on those lands were not Christians. The repudiation by Pope Francis came five centuries too late to affect the course of European colonization. The “doctrine of discovery” still persists in this century, even among liberal justices of the Supreme Court of the US. Ruth Bader Ginsburg, writing for the majority in 2005, cited this doctrine in holding that the indigenous Oneida Nation had only a right of occupation, not ownership, of ancestral land in New York.

A Warm Period, the Little Ice Age, and European Exploration

In northern Europe and the British Isles, the land of fairies and trolls, Beowulf and other sagas, a brief tenth-century northern-hemisphere warm period arising from Atlantic tropical cyclone activity is said to have enabled the first short-lived Viking settlements in Newfoundland. Cross-Atlantic searches for new resources resumed during the 15th century, as the climate cooled somewhat through the Little Ice Age.

The Ancestral Puebloan, Mississippian, and Woodland cultures built huge cities and 30-m-high symbolic mounds depicting animals during the Warm Period. Their mixed agrarian and hunting societies vanished before the European invasion, possibly from prolonged drought or exhaustion of game, and their cities became dust. Scandinavians settled Iceland, holding a parliament by consensus at Thingvellir in 930 CE. While the ancient Greeks had coined a word for democracy, its hold on society had until then been fleeting. Centuries of destruction brought about by the Roman Empire, and by medieval noblemen who knew how to use only the sword and not the pen, produced a theocratic culture in Europe, the Dark Ages, that hindered advances in civilization. In the course of reforming the church, King James I imposed many practices alien to the Presbyterian Scots, justified by the assertion of the “divine right of kings”. Over the next few centuries the English barons of the realm of the Magna Carta made attempts to limit the rights of royalty, but the British monarchy hasn’t died out to this day, nor has its outlandishly jeweled headgear.

Freedom of religion and separation of church and state took a while to be established in England, driving nonconformists to emigrate. Soon the rich commercial prospects of the New World also attracted members of the freethinking religious sects. While the Church of England required clergy to wear the white surplice and clerical cap, Puritan clergymen preferred black academic attire, and the Quakers wore plain, practical clothing. The religious persecution that drove my early ancestors to leave the British Isles was clearly expressed in a difference over fashion.

The Iron Age

Long before the migration of Europeans to the Americas, metallurgy in Europe, Asia and Africa advanced to the higher-temperature production of iron. Previously only a few artifacts of meteoric iron existed. The collapse of the Bronze Age at the end of the second millennium BCE is variously attributed to climatic pressures, hostile migrations by sea, or the development of improved weaponry. Iron and steel implements easily overpowered stone, wooden or bronze armaments. Iron production required burning large amounts of charcoal fuel: a single furnace for smelting iron ore in the American colonies could consume an acre of forest each day.

When Englishman Abraham Darby (1678–1717) invented coke smelting in 1709, large scale production of pig iron developed in regions where coal could be extracted from the ground. Power machinery ran the furnace bellows and pumped water from coal mines starting in the mid-18th century. With rapid improvements in efficiency, combustion of carboniferous material in air took production out of the home and allowed factories to be built closer to mines and supplies of raw materials.

The Guillotine and the Discovery of the Greenhouse Effect

The emission of carbon caused by the increased burning of coal during the Industrial Revolution was not ignored by 19th-century scientists. The chemical nature of air was grasped at the end of the 18th century by the dissenting English clergyman Joseph Priestley (1733–1804), who is best known today for his discovery of oxygen as a major constituent of the atmosphere. This highly reactive substance combines with carbon to produce heat, giving off CO2 in the process, which is in turn consumed by plants and absorbed by the oceans. Besides advancing our knowledge of the air we breathe, Priestley also created the fizzy carbonated water that fills the grocery shelves today. His Unitarian views on Christian theology, his advocacy of the separation of church and state, and his support of the French Revolution earned him the wrath of an incendiary mob that, with the aid of oxygen, burned down his home and laboratory.

Priestley’s persecution, while a tragedy for those in his circle, did not shake his belief in the erroneous phlogiston theory of fire. The French nobleman Antoine-Laurent Lavoisier used precision balances and calorimetry to show that combustion proceeds by consuming oxygen, not by releasing stored phlogiston, as Priestley and many others had long insisted. Sadly, the revolutionary mob marched Lavoisier to the guillotine as a symbol of the monarchy before realizing his scientific greatness.

The unknown Scottish chemist Elizabeth Fulhame documented the process of photoreduction (as used in photography) and put the phlogiston theory to its final rest in 1794, six months after the tragic execution of Lavoisier. Mrs. Fulhame’s book, An Essay On Combustion with a View to a New Art of Dying[sic] and Painting, wherein the Phlogistic and Antiphlogistic Hypotheses are Proved Erroneous, was written as she pursued “the art of making cloths of gold and silver and other metals” as worn by the late king of Spain.


Priestley and his family moved to America, to a more hospitable intellectual climate in Pennsylvania. Along the Ghost Town Trail, the charcoal collieries that denuded the forests soon gave way to anthracite coal mines, and the town of Sunbury and the American Chemical Society still celebrate the Reverend’s life.

Scientific understanding of the greenhouse effect followed the 18th-century political upheavals. The French physicist Joseph Fourier determined that the Earth’s incoming solar radiation was insufficient to maintain temperatures above freezing, and suggested in the 1820s that Earth’s atmosphere might act as an insulator.

Eunice Newton Foote, an American suffragist and experimental physicist, demonstrated the thermal effects of CO2 and water vapor in sunlight, proposing in 1856 that they play a role in climate. Around this time, the Irish physicist John Tyndall was systematically measuring the properties of gases. He understood their importance in warming the Earth and, three years later, published a note with the Royal Society explaining how they trapped radiant heat, what we now call infrared radiation. Svante Arrhenius (1896) calculated the heat-absorbing effects of the products of fuel combustion, soon called the atmospheric greenhouse effect. Guy S. Callendar (1938) showed that his model predictions matched the measured temperature rise over 50 years: as the global concentration of CO2 varied, so did temperature, although other factors temporarily masked its influence. Few were convinced of the importance of Callendar’s model in view of its modest estimates of warming and the vastness of the atmosphere, to whose CO2 concentration, about 310 ppm at the time, humanity had contributed merely 10%. Although Callendar lacked digital computers and had only limited experimental data to drive his simplified models, his results were consistent with modern estimates.

Global climate changes during the Medieval Warm Period and Little Ice Age. The discovery of the greenhouse effect was followed by Callendar’s 1938 model and observation of rising temperature that matched the increase in CO2 concentration. The data are provided by the Pages2K Consortium, with uncertainties at the 95% level of confidence within the shaded region. Graphic by Ed Hawkins (2020)

The breathable atmosphere is a paper-thin layer comprising less than a tenth of a percent of Earth’s radius. Its carbon dioxide, water vapor and methane are the greenhouse gases that keep the average global surface temperature of the Earth habitable; in their absence it would be about -18°C (0°F), well below the freezing point of water. Production of methane and nitrous oxide by agriculture, and of fluorinated hydrocarbon gases by industry, comprises about 10% of the greenhouse-gas global heating potential, and this contribution is accelerating. Added to this are the complex interplays of geological and biological fluxes of volcanic gases, clouds, particulate matter in the atmosphere, and microorganisms in the hydrosphere. Together with planetary inclination and orbital evolution, they produce a chaotic record of climate, frozen into glaciers or buried in lake bottoms during temperate intervals.
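For the technically inclined reader, the -18°C figure follows from a simple radiative balance: without greenhouse gases, the sunlight a planet absorbs must equal its blackbody emission. A minimal sketch in Python, using standard textbook values for the solar constant and Earth’s albedo (the specific constants here are my assumptions, not figures from this memoir):

```python
# Radiative balance for an Earth with no greenhouse gases:
#   (S/4) * (1 - albedo) = sigma * T^4
# Solving for T gives the planet's "effective temperature".
S = 1361.0       # solar constant at Earth's orbit, W/m^2 (assumed textbook value)
albedo = 0.30    # fraction of sunlight reflected back to space (assumed)
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T_kelvin = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(round(T_kelvin - 273.15))  # about -19 °C, near the -18 °C cited above
```

The degree or so of difference from the quoted -18°C reflects the choice of albedo; the point stands that without the greenhouse effect the surface would be frozen.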

Industry, Atmosphere, and Earth Energy Imbalance

Industrial growth in the 19th century, driven by the use of fossil fuels, changed the composition of the air yet again and began changing the world climate. This was not just because of new methods for dyeing cloth. Ironmaking consumed many forests during the 19th century; soon industry turned to the buried fossil forests of the Carboniferous period, a 60-million-year era of prolific plant growth on the Pangaea supercontinent.

The Intergovernmental Panel on Climate Change (IPCC) uses the year 1750 as a pre-industrial baseline, aligned with earlier proxies buried in ice, to reference the early gradual changes in atmospheric CO2 and other gases. Following Guy Callendar’s careful assessment of historical temperature records, European scientists use the 1850–1900 instrumental records as the baseline for the more slowly responding increases in global temperature.

The rise in temperature is not caused by using energy. Until the industrial age, per capita human energy consumption was negligible compared to natural phenomena. Transportation, cooking and heating accounted for less than 200 watts (joules per second), about that of a large incandescent light bulb, or 6 GJ/y. (A gigajoule represents a kilowatt of power expended for a million seconds, roughly 12 days.) For reference, a typical human uses about 60 watts just to maintain basal metabolism. Exercise can raise this to a sustained 180 W, similar to the effort required to ascend a tall building in an hour. A superb athlete, or the much larger four-legged beast, the canonical horse, outputs 746 W, one horsepower. Yet our two-legged transport is so efficient that near-naked human hunters were able to chase animals to exhaustion, long before the invention of bicycles, baby strollers, Land Rovers and Spandex running suits.

In the industrial era, the widespread 18th-century use of water power, and later of coal-generated steam, powered the manufacture of textiles and other commodities. There were many more garments to wash and dry. Per capita energy use rose tenfold to 25 GJ/y by 1900 and thirtyfold to 77 GJ/y by 2023, while world population itself rose eightfold from the start of the 19th century to today. Consumption per capita in the US is roughly four times the world average. The huge increase in energy use has mainly been supported by the combustion of fossil fuel, but not everywhere: Iceland, where renewable geothermal and hydropower make aluminum smelting and Bitcoin mining profitable, consumes twice the energy per capita of the US.
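The wattage and GJ/y figures above are two ways of expressing the same quantity, and the conversion is simple arithmetic. A small sketch (the function names are mine, for illustration):

```python
# Convert between continuous power (watts) and annual energy (GJ per year).
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds

def watts_to_gj_per_year(watts):
    # watts * seconds = joules; divide by 1e9 to express in gigajoules
    return watts * SECONDS_PER_YEAR / 1e9

def gj_per_year_to_watts(gj_per_year):
    return gj_per_year * 1e9 / SECONDS_PER_YEAR

# Pre-industrial consumption of ~200 W continuous:
print(round(watts_to_gj_per_year(200), 1))  # 6.3 GJ/y, matching the ~6 GJ/y above
# The 2023 world average of 77 GJ/y, expressed as continuous power:
print(round(gj_per_year_to_watts(77)))      # about 2440 W per person
```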


Memorial to Welsh and Slavic men who died in the coal mines.
Glass and granite sculpture, 2006, of miners emerging from Mine No. 6, Vintondale, PA, along the Ghost Town Trail.

Energy itself is not the problem; it is the waste products from its production. Humans now consume about 0.4 zettajoules (ZJ) of energy annually. Nature provides nearly ten thousand times that amount via the Sun: the Earth receives 3,863 ZJ of solar energy each year, creating a habitable planet in which the tiniest fraction is absorbed photochemically to provide our food, lumber, and suntans. The rest of this energy was once returned to space as outgoing longwave radiation, keeping the polar bears cool and the tropics warm. But currently about 0.4% of the absorbed solar radiation is retained. This Earth Energy Imbalance is mainly caused by the increased greenhouse effect. Or as Vice President Al Gore wrote in a preface in 2021, “every day, we continue to release 162 million tons of heat-trapping global warming pollution into the atmosphere, as if it were an open sewer.”
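The disproportion between human energy use and the solar budget, and the rough size of the imbalance, can be checked in round numbers. The 30% albedo below is my assumption for estimating the absorbed fraction; the other figures come from the text:

```python
human_use = 0.4        # ZJ per year consumed by humanity (from the text)
solar_received = 3863  # ZJ per year of solar energy received by Earth (from the text)
albedo = 0.30          # assumed fraction reflected; the rest is absorbed

# The Sun delivers nearly ten thousand times what humanity consumes:
print(round(solar_received / human_use))  # nearly ten thousand

# About 0.4% of the absorbed sunlight is now retained, the Earth Energy Imbalance:
absorbed = solar_received * (1 - albedo)
print(round(absorbed * 0.004))            # about 11 ZJ/y of excess heat
```

Even a tenth-of-a-percent imbalance in a budget this large dwarfs humanity’s entire direct energy consumption, which is the point of the paragraph above.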

It is a common misconception that the present climate crisis is caused by too many humans using too much energy. All the power expended by industry, farms, homes, shopping malls and SUVs is a minuscule fraction of the energy budget of the Earth. It is the mode of production of that power, the greenhouse gases generated by fossil fuel extraction and consumption, along with deforestation and unsustainable agriculture, that alters the radiative energy balance. This imbalance is turning up the thermostat: it heats the atmosphere and oceans, melts the glaciers, poisons the water, and leads to a crisis.

Part III European Colonization and Western Expansion

The Great Lakes region where my forebears settled was once covered by the Laurentide ice sheets. The Lake Border Moraine system, left at the terminus of the last ice age, extends southward to Champaign, IL. The freshwater lake system is believed to have originated in topography left by the Cape Verde tectonic hotspot plume that crossed North America from 300–200 Ma (Tao et al., 2025).

My forebears were colonists, not conquerors but victims of circumstances. In the British Isles, appropriation of the atmosphere, forests, water and minerals by a wealthy few had reached a point where those whose livelihoods depended on the bounty of nature could not survive. The common ground where everyone toiled for their sustenance had been enclosed. Huge tracts of land were denuded of trees to raise sheep, smelt iron or grow cash crops. The air and rivers were treated as sewers. With the soil exhausted, the timber converted into ships, and their armories stocked, the wealthy sought to enrich themselves overseas, where the dispossessed populations could also be resettled. Spanish crowns had claimed hegemony over the New World and enriched themselves on its spoils, but they squandered their wealth through centuries of warfare. The Spanish navy declined and much of the North American continent was seized by the British and French.

Kelp and Clearances of the Scottish Highlands

European industry and trade brought a concentration of power into the hands of the bourgeois and merchant classes, who relied on the tenants of the feudal nobility for labor. Their increasing appetite for domestic raw materials, wool for spinning and weaving, soda ash for glassworks, bleach and soap, ended the traditional agricultural use of common lands. Peasants were forced to abandon their homes because of rents, enclosures, and clearances. The newly empowered lairds, who thought grazing sheep and burning kelp to extract iodine a greater purpose for God and King than the tenancy of farmers, forcibly drove three fifths of the peasant population from their ancestral homes. They were moved to crofts, rented plots of a few acres by the sea that provided only partial subsistence, eked out by working in the fisheries and quarries and by harvesting kelp. Exports of iodine-rich kelp ash grew several percent per year over the century 1702–1802. The growth of the soap, glass and bleaching industries during the early nineteenth century created a surge in demand for iodine, and while the Napoleonic wars embargoed cheaper sources carried in Spanish colonial ships, ash exports surged fivefold over the next seven years. The decisive battle of Trafalgar in 1805 and the end of Napoleon’s naval blockade meant that Chile could supply England’s sodium iodate needs ten times over. Prices then collapsed, further ruining the peasants.

The Edinburgh-based Government of Scotland had been trying, with mixed success, to gain control over the Highlands and Hebrides Islands from the “wild, wykked Hieland-men” ever since it ended the Norse influence over the area in 1266. Various methods were tried, from a period of attempted control by proxy via the Lords of the Isles, to James V’s direct action against clan chiefs and James VI’s commissioning of the Fife Adventurers in 1597 to take over the Isle of Lewis and use all necessary means to “root out the barbarous inhabitants”. The King’s perfidious 1692 Glencoe massacre of some thirty men, women and children of an allegedly dissenting clan was followed by the building of a network of military roads across the Highlands in the years after the 1715 Jacobite uprising; the pacification of the Highlands required massive bridges and roads to move the crown’s army. Along the old military road from Fort William to Glasgow, now known as the West Highland Way, lies a remote and empty landscape peppered with the ruins of former homesteads, their stone enclosures tumbled in heaps and lorded over by feral goats. The roads today carry tens of thousands of hikers annually, but only a few souls inhabit the villages year-round.


Abandoned drovers house along the West Highland Way

Emigration to the US and Canada

My maternal lineage is an English-Scots-Irish mix across two centuries of migration, beginning with the initial colonization of New England. My seventh great-grandfather, Stephen Gates of Norwich, and his family took a perilous two-month passage aboard the ship Diligent in 1638, joining an earlier group of Pilgrims in the Plymouth Colony on the land of the Patuxet tribe, who had been all but wiped out by settler-introduced disease.

Two hundred years, and five generations later, George Gates migrated from Dummerston, Vermont to the fertile land along the Rock River in Wisconsin, earlier ceded by a drunken Sauk chief in the name of the indigenous peoples. George brought his wife Sophia Perry from Potsdam, New York. Their daughter Mary Theodosia was born in 1851.

The Isle of Lewis and the Outer Hebrides were at that time inhabited by Clan McLeod, reduced to submission over the centuries by kings of Scotland and England. They were the ancestors of Kenneth McLeod, an early 1800s immigrant. Kenneth’s wife Annie McIntosh was another of those Scots from the Isle of Skye. Her family “crossed the stormy Atlantic” in 1798 to Canada during the early highland clearances. Annie could still recite bible verses in Gaelic a hundred years later. She outlived most of her family, dying at the age of 103 in Sanilac County, Michigan. Her son Finlay McLeod, after the death of his first wife, remarried to Sarah Ann McLain. Their daughter Ellen was my maternal grandmother.

After Oliver Cromwell’s 1649–1653 re-conquest of Ireland, the Irish population lost much of its land to English landlords. Three quarters of the arable land in Ireland was devoted to grains and livestock, which the English absentee landlords reserved for export. Increasingly the Irish subsisted on potatoes and cabbage or other greens they could scavenge, such as the nettles growing freely in the graveyard. Their crofts typically had only a few acres per family. Lucky were those who could raise chickens and a pig, or keep a cow to supplement their diet. Thus the Irish peasants were dependent on a single crop; when that crop failed, those who could emigrated and the rest starved. Although plenty of food was still exported from Ireland to England and her colonies, the potato blight of 1845–1851 reduced the peasant population to skeletons. Nearly a quarter of those who remained died of starvation and disease.

Those who survived the passage and landed on American shores were often destitute, surviving on the charity of the community. They were willing to do any work, however hard and dangerous. In these circumstances, the labor of building the canals, mines and railroads that carried coal from the Pennsylvania mountains absorbed the hapless immigrants. Their attire was stitched from rags and sacks to cover their nakedness, some having pawned the clothes off their backs to gain passage.


The young Andrew McGlone family living on the prairie in Marshalltown, Iowa ca. 1870. Frank McGlone is seated with his mother, Mary Theodosia.

Were the McGlone, McLeod and McIntosh families who left Scotland at the end of the 18th century for Canada just simple crofters who could not survive on meager portions of land? Were they “cleared” by soldiers and bailiffs from their ancestral homes as clans, driven to the sea and forced to burn kelp to survive? We don’t know, but likely they had enough kelp and iodine in their blood and had lost their taste for seafood and arsenic-laced soot.

The second phase of the Highland Clearances (ca. 1815–1850) involved the overcrowded crofting communities of the first phase, which had lost the means to support themselves through famine and/or the collapse of industries they had relied on (such as the kelp trade), as well as through continuing population growth. Landowners, obligated to pay Poor Law rates in proportion to the number of people on their land, deliberately pulled down dwelling-houses they did not personally need, some with their occupants still inside. The extensive demolition left ruins of villages that can still be seen today. “Assisted passages” were common, in which landowners paid the fares for their tenants to emigrate; tenants selected for this had, in practical terms, no choice. Kenneth McLeod came from Scotland during this second phase. The Scots Gaelic crofters who remained in the Highlands lived lives of abject poverty, and Scottish emigration continued for another century, bringing presidential mother Mary Anne MacLeod Trump, among many others. We can only surmise that Kenneth, like Mary, had been down on his luck.

For several generations the Scottish MacIntosh, McKenzie, McLain, McDermott and MacGregor bloodlines, as well as the English Stewart, Perry and Gates lines, mingled and migrated westward along the Erie canalway to the fertile and abundant farmlands of the Thumb of Michigan. While many of the “pioneers” were escaping oppression, the largely peaceable inhabitants of the Upper Mississippi River Valley, Illinois, Wisconsin, Iowa and Minnesota were pushed westward by military force. The famous warrior Black Hawk did not acknowledge the supposed purchase of the ancestral land of the Sauks, Meskwakis, and Kickapoos from non-tribal agents, and went to war in 1832. The Sauks were accused of siding with the British loyalists of Upper Canada, and were therefore dispossessed of their land by the U.S. Army, whose soldiers included the young Abraham Lincoln. As their ancestral lands were parceled out by the Federal Government to eager immigrants, the natives were pushed onto less fertile lands.

A later Celtic wave of immigration is associated with the potato famines and the mass evictions that ensued, driving the predominantly Catholic Irish to emigrate. Settlements in Northern Ireland with diversified subsistence agriculture had weathered the vicissitudes of climate and disease over the centuries, but the commercialization of land use caused their ultimate extinction. They were replaced by Scots loyal to the British Crown.

Those Ulster Protestants were possibly just seeking better opportunities and refuge from their impoverished homeland. The Scots Plantation in Northern Ireland, on land seized from Irish nobility, anglicized the Gaelic family name, ‘Mac Giolla Eoin’, or Son of the servant of St. John, to McGlone. The McGlones joined many other distressed people in Ireland marginalized by English rule. 

Barney McGlone, born in Ireland in 1808, and his wife Mary Ann, born in 1812, emigrated to Oswego, NY, a town on Lake Ontario connected to the Hudson River via portions of the Erie Canal. They eventually settled in Wisconsin, during two decades in which the population of the future state exploded 100-fold. The older of their two children, my great-aunt Nell, married a Mr. John Tuckwood in Janesville, Wisconsin. Her brother Andrew Jackson McGlone was born there in 1841.

Annie McIntosh, 1893, Sanilac, MI, grandmother of Ellen McLeod; Finlay McLeod and Sarah McLain, parents of Ellen McLeod.

The Railroad Boom and Western Expansion in North America

The Erie Canal, completed in 1825, had provided the route and economic basis for the first wave of expansion to the Ohio and Mississippi river lands, sending the McGlone name westward. Those lands had been repeatedly plowed during the ice ages by the kilometers-thick Laurentide ice sheet. The last of the glaciers receded only about 11,000 years ago, leaving behind fertile soil and the deep freshwater Great Lakes.

The flanged wheel, invented by William Jessop in 1789, made efficient land transportation possible on fixed, graded ways. The flanges allowed trains to negotiate bends in the track without horses to steer the leading carriage. Steam locomotives, first demonstrated by Richard Trevithick in 1804 and put into regular service on George Stephenson’s railways from 1825, replaced horse power with coal. Peter Cooper’s Tom Thumb pulled passengers on the Baltimore and Ohio in 1830 (after losing a race to a horse). Between 1850 and 1871 the United States government transferred approximately 131 million acres of public land to powerful, unregulated railroad monopolies.

In the 19th-century West, railroads were the only year-round reliable means of transportation for people or goods. Many of the Irish went to work on the railroad. As their gravestones attest, these pioneers often died young. The canals and great tunnels beneath the Appalachian Mountains were dug by Irishmen who could find no other means of living. In the case of the 1302-meter-long Crozet Tunnel dug through hard rock beneath the Blue Ridge in 1858, nearly a quarter of the workers employed died of disease, or rockfall following blasting, or runaway carts. The use of slaves was abandoned as too expensive, because their owners demanded payment for the lives of those killed by accident.

Andrew fared better than many first-generation Irish-Americans, since his family owned property in New York. He trained in Chicago to be a skilled locomotive engineer and joined the Scottish Rite of Freemasonry. His wife, Mary Theodosia Gates, was a great-great-great-great-granddaughter of Stephen Gates, a Puritan emigrant from Coney Weston in East Anglia. Ms. Gates was nearly the youngest of a large family of siblings and half-siblings. Perhaps she was eager to leave home, or the family had no room for another daughter. Standing at the throttle of a steam locomotive, Andrew was a handsome catch for his 17-year-old bride.


Andrew Jackson McGlone, engineer and freemason, tintype ca 1870.

The railroads brought destruction to the indigenous peoples of the upper midwest. For the next half century, their buffalo were slaughtered and the armies built up during the Civil War were dispatched by trains to finish the taking of land. The dispute over the lands taken by broken treaties continues to this day in the Black Hills of the Dakota territory, where protection of water from oil pipelines has mobilized thousands. But we must return to my family members, who were simply glad to raise families, free of rents to noblemen. Their life was hard, but the opportunities were great.

Thanks to the photochemistry of metal salts described by Ms. Fulhame, we have graphical records. Ferrotypes made by photographers working in booths or the open air at fairs and carnivals started to document life around the time of the Civil War, and studios sprang up in towns, where family portraits were taken. Those fragile “tintype” photographs are time capsules of my great-grandparents and great-great-grandparents, contemporaries of the western expansion.

Two children were born, Franklin and Mildred, in Janesville, Wisconsin where Andrew’s older sister was living. The couple settled in the burgeoning railroad headquarters of Marshalltown, Iowa. The town had been named only a decade earlier, following the “Indian resettlements” of the early 19th century. The endless expanses of Central Iowa, a land of rich glacial and alluvial deposits overlying the billion-year-old Midcontinental Rift, are forbiddingly quiet even today. Life here was lonely for the young wife, but she could take trains to visit relatives in Wisconsin, a respite from the desolate prairies.

Family records state that “Andrew was a railroad engineer on a small gauge train of the Burlington Line”, eventually the Chicago, Burlington and Quincy. The line was part of a planned Iowa Central Railway, running from the mines and fields of Minnesota to the Mississippi River and then to St. Louis, in hopes of drawing business away from the Chicago warehouses that dominated midwest shipping. A roadway for the tracks was prepared along the Iowa River with untold amounts of labor and borrowed money. After the financial crisis of 1873, further construction was discontinued, and the line was not completed until many years later.

The Iowa Central had acquired at least 36 locomotives by 1881, most of them built to the smaller standard gauge that replaced the wider gauges used in Ohio and the South. Shaped wrought-iron rails, introduced by the Mount Savage Iron Works in Maryland and the Montour Iron Works in Pennsylvania in 1845, were a tremendous improvement over the wooden and strap-iron tracks of the horse-drawn age, but they remained relatively brittle and needed frequent replacement as trains grew heavier, larger, and faster. Steel rails from Johnstown, PA were introduced in 1867, and their price dropped precipitously as open hearth steelmaking took hold in Pittsburgh and the Bessemer process at Baltimore’s Sparrows Point, but the Iowa rails were made of much weaker iron. Even by 1880, only 29 percent of the 115,000 miles of track in America were laid with steel rail.

The machine builders of the textile industry quickly adapted to producing the 2-6-0 Mogul and 4-6-0 Mason engines that could haul heavy freight. The Mason engines were of a simple coal-fired design, with four wheels on a bogie in front, six traction wheels, and the signature smokestack. The engine and tender weighed close to a hundred tons. Plentiful bog iron ore fed the charcoal smelters, and the forests were chopped down. The foundries of Taunton, Massachusetts poured millions of tons of iron, and tens of thousands of such engines were built in New England and Pennsylvania. Railroad mania took hold, with an accompanying disregard for safety. Promoters formed hundreds of companies to build tracks on free right-of-way land, opening vast territories to military conquest and commercial exploitation. By 1869 the Central Pacific and Union Pacific railroads, built by Chinese and Irish immigrants, met at Promontory Summit, Utah, where the last spike was driven in the transcontinental line connecting Council Bluffs, Iowa, on the Missouri River, to Sacramento, Oakland, and the San Francisco Bay. The commercial boom of coal-powered transportation had begun, with mileage doubling and tripling each decade until the 20th century. Commerce became global, as commodities of furs, sugar and tobacco gave way to tea and opium on clipper ships and then steamships.

The Capitalocene-Kleptocene-Necrocene-Pyrocene-Wasteocene

The railroad boom launched an even more disastrous trend in the atmosphere that was to give rise to a new geochronologic unit one hundred years later. The Anthropocene period has variously been dated to the ten to fifteen millennia of human activity driving Pleistocene game to extinction, the clearing of forests for the advent of agriculture, the use of steam power in the industrial revolution, or the rise of colonialism that stole the lands and lives of indigenous people across continents. Alternate names have been proposed: the Capitalocene, Wasteocene, or Necrocene, denoting the deadly power over nature exerted by the exponential growth of finance capital; the Kleptocene, denoting the wholesale appropriation of land, wealth, and human bodies that attended colonization; or the Pyrocene and Petrocene, for an era of unchecked petroleum extraction and wildfires. But physical atmospheric boundaries can be documented by the carbon dioxide (CO2) concentration frozen into the Law Dome, Antarctica ice cores. Modern inflections start in 1750 with the industrial revolution, followed by George Stephenson’s steam railroad in 1825. By 1850, anthracite coal in Pennsylvania was streaming down the inclines and onto the Lehigh River in canal barges to the Delaware River and thence to the cities. The D&L Canal and many others were soon replaced by more reliable and profitable railroads. By the 1890s seven railroad companies controlled nearly the entire anthracite region on which the population of the Eastern seaboard depended.


CO2 concentration in the southern hemisphere vs. age. Upward inflections occur at the start of the Industrial Revolution ca. 1750 and railroads ca. 1825.

By 1825, the CO2 concentration trapped in Antarctic ice had risen from its pre-industrial value of 275 ppm to 285 ppm, though no one was aware of it at the time.
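For readers who enjoy the technical details, the heat-trapping effect of such a rise can be estimated with the standard logarithmic approximation for CO2 radiative forcing (Myhre et al., 1998). The short sketch below is illustrative only; the 275 ppm baseline is the pre-industrial value quoted above.

```python
import math

def co2_forcing(c_ppm, c0_ppm=275.0):
    """Radiative forcing in W/m^2 relative to a baseline concentration,
    using the logarithmic approximation of Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# The ~285 ppm recorded in the ice by 1825 implies a forcing of
# roughly 0.19 W/m^2 -- tiny compared with today's, but real.
print(round(co2_forcing(285), 2))
```

By the same formula, the 290 ppm reached at Andrew’s death in 1875 corresponds to about 0.28 W/m², and today’s concentrations above 420 ppm to more than 2 W/m².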

The conquest of the Mayan and Aztec civilizations of the “New World”, from 1492 to the defeat of the Spanish Armada a century later, and the “Columbian Exchange”, the decimation of populations by smallpox, measles, influenza and cholera, also affected the atmosphere. Koch et al. (2019) proposed that a climate episode known as the Little Ice Age, with a few-percent dip in CO2 concentration around the year 1600, was caused by land-use changes following the Great Dying in the Americas. Genocidal armies and European diseases wiped out 50–60 million indigenous people in the Americas, ending cultivation of land that subsequently reverted to prairie and forest, fixing atmospheric carbon into the biosphere. Thus the end of the Neolithic age, the extinction of megafauna, the rise and fall of agricultural civilizations, and the European colonizations driven by commerce were all anthropogenic influences on the planet. In recent years, only the Second World War and the “oil crisis” of the early 1990s have noticeably retarded the upward climb in the accumulation of CO2 spawned by the industrial revolution and rail transportation.

The Torch Lake Mason engine in the Henry Ford Greenfield Village Museum in Dearborn, MI, similar to the one that Andrew drove.

The Death of Andrew Jackson McGlone

Locomotives rely on a leading set of wheels, or bogie, to maintain direction around curves and to keep on track at switching junctions. If the bogie derails, so does the train, with destruction to follow. In 1838 the computer pioneer Charles Babbage invented the cowcatcher to protect the bogie and persuade wandering ruminants to give way to a superior producer of greenhouse gases.

Andrew was killed in 1875 when his locomotive, the Amos Russell, derailed after hitting a herd of cows. Engine 22 of the Minneapolis & St. Louis Railway was a ten-wheel Mason locomotive acquired in 1873. It may have failed to deflect the animals, or the brittle iron tracks may have buckled and cracked under the force of the impact. Andrew was thrown from his open cab and was crushed by the engine as it tumbled down an embankment, a few miles south of Grinnell near Searsboro in Poweshiek County, Iowa.

An accident on a freight train would not have been newsworthy, so details were sketchy. A large number of passenger injuries might have been noted, but the sheer number of railroad worker injuries made Andrew’s death just a statistic. Multiple-fatality accidents were reported, as when the iron strips called “snakeheads” covering the wooden rails peeled off into passenger cars, impaling their occupants, or in 1883, when engine No. 32 of a similar design was operated at a gauge pressure of 172 pounds per square inch, 12 above its rating, and the boiler split and exploded, derailing its train of 44 cars, crippling the fireman and killing the engineer and brakeman. Railroad work was then among the most dangerous occupations in the United States. Working conditions were abominable and deadly: boiler explosions, collisions at crossings and with other trains, decouplings and runaways, as well as derailments. The annual fatality rate in the early 1890s was approximately 9 per 1,000 workers, higher than that of the most hazardous occupations today. As in the game of “Russian roulette”, a 20-year railroad career entailed about one chance in six of an untimely accidental death. Workers were required to sign contracts that indemnified the companies from liability. The precedent-setting ruling in Farwell v. Boston, 1842, deemed workers “fellow servants” who were solely responsible for their own safety, not the “masters” of the railroad. The firemen and brakemen soon formed fraternal benevolent associations to provide life insurance. Federal safety regulation by the Department of Transportation was to wait another hundred years. The highly profitable U.S. rail freight industry, which can carry many times the tonnage per gallon of fuel as highway trucks, today finds it more convenient to ignore those regulations, staffing longer trains with fewer workers and minimally compensating the victims when those overloaded trains buckle and derail.
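That one-in-six figure follows directly from the fatality rate just quoted. A minimal sketch, treating each of the 20 years as an independent trial (a rough model, of course, not an actuarial calculation):

```python
# Probability of at least one fatal accident over a 20-year career,
# assuming the ~9 deaths per 1,000 workers per year cited above and
# treating each year as an independent trial -- a rough model only.
annual_rate = 9 / 1000
years = 20
p_death = 1 - (1 - annual_rate) ** years
print(round(p_death, 3))  # about 0.165, close to the 1-in-6 odds of Russian roulette
```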

After its derailment, the Amos Russell was rebuilt at Burlington’s Marshalltown shops and placed back in service a decade later. A similar Mason locomotive, the Torch Lake, carries passengers at the Henry Ford Museum in Dearborn. This engine lacks a cowcatcher.

From a contemporary report by the Ancient Free and Accepted Masons entitled “The Death and Funeral of Mr. McGlone”:

“… We have it from reliable source, that he was driving engine No. 22, in the Iowa Central, bound north, and ran into a drove of cattle, which threw the engine off the track, turning it completely over, and crushing the frontal, or bone of the forehead, which caused instant death. Being an old resident of this city, his remains were brought here for interment, arriving yesterday afternoon, via Chicago. They were taken to the residence of his brother-in-law, Mr. John Tuckwood, in the 4th ward, where short services were held, after which they were borne to the cemetery. The remains were accompanied by William Barnes and Geo. S. Hickox, Marshall Lodge No. 108, Marshalltown, Iowa; John Davis and E. F. Pierce, of Division 146, of the Brotherhood of Engineers, Marshalltown, and Mr. B. L. Abbott, Chicago. …

“In behalf of Marshall Lodge No. 108, A. F. & A. M. of Marshalltown, Iowa, we the undersigned, beg to return our thanks to the officers and members of Western Star Lodge No. 14 A. F. & A. M., and Janesville lodge No. 55, for the promptness and brotherly interest manifested by them in paying the last tribute of respect to our deceased brother, A. J. McGlone.

“Mrs. McGlone was in this city visiting her brother-in-law, Mr. Tuckwood, when her husband met his untimely death. On Saturday last she received a letter from him stating that he would come to Janesville on the 11th of August to accompany her home—the very day his mortal remains reached the city.”

The Oak Hill Cemetery in Janesville where Andrew was buried occupies hundreds of acres. His grave sits on a hill overlooking the town, where Jack and Ellen Tuckwood were buried decades later. The office had no record of his grave, but the groundskeepers kindly led me to the Tuckwood plot, where the worn, barely recognizable letters of A. J. McGlone were visible on his overturned gravestone.

Andrew Jackson McGlone’s gravestone in Oak Hill Cemetery, Janesville, Wisconsin

Climate and Extreme Weather Events

The city of Marshalltown where the McGlones lived is today a small industrial junction along the Iowa River. I stopped there briefly on my way to Des Moines for the Register Annual Great Bicycle Ride Across Iowa (RAGBRAI 48). Long freight trains passed through several times an hour. Only a few historic buildings survived the destructive funnel clouds of tornadoes, notably the 7-story Tallcorn Hotel built in the Roaring Twenties for railroad bigwigs; nothing of my family’s history there remains. The energy of such tornadoes is comparable to that of atomic weapons, and they destroyed much of the town center in 1961 and again in 2018.

The town of Oakfield, where Andrew’s widow Theodosia lived with her family in Wisconsin, was hit by an F5 tornado in 1996. It leveled a third of the town and was followed by a smaller tornado 20 minutes later. Contemporary descriptions give a picture of the force of nature when she is angry: “The tornado was strong enough to level the Friday Canning Company, sweeping up millions of empty cans and leaving them scattered over a 50 miles (80 km) radius.” Homes were ripped from their foundations, bending the steel bars anchoring them to concrete. “Automobiles were carried 400 feet through the air and mangled beyond recognition.” The town’s canopy of oak trees was reduced to splinters. As in Marshalltown, the people of Oakfield, numbering barely 1,000, recovered with the grit born of 150 years of farming and industry.


Chase photograph of funnel cloud of Oakfield tornado. (Cailyn Lloyd photographer – Storm Talk CC BY-SA 4.0)

Returning from Iowa, I found oak trees once again lining the roads into Oakfield. Abandoned houses on the farms nearby contrasted with the hundreds of new wind turbines spinning in the wind that rushes across the prairie, as if taunting the ever-threatening twisters.

Tornadoes have become more frequent in Wisconsin since 1950 and nationally since 1979. While it is too early to draw a causal relationship with anthropogenic climate forcing, the unusually high amount of water vapor in the atmosphere caused by warmer temperatures is clearly a factor that intensifies storms. In 2024 an F2 tornado struck north of Janesville in February, after temperature records had been broken. Recent tornado statistics are skewed by increased detection in sparsely populated areas since Doppler radar networks were introduced, but many new areas are having to deal with them. The smallest state, Rhode Island, where cyclonic storms were rarely heard of until recently, was hit by Hurricane Gloria in 1985, and I witnessed a destructive tornado pass by my office in Providence on August 7, 1986. The tenfold-larger state of Maryland has had hundreds of injuries and nine fatalities in this century as a direct result of twisters.

Extreme weather events are no longer restricted to the midwestern “tornado alley”. The tropical cyclone heat potential of the oceans provides a clear-cut measurement of the amount of heating that the earth system has undergone in the last half century. Upper-ocean heat profiles are measured by instruments lowered from ships and, more recently, by autonomous profiling floats. Analyzing these measurements, scientists found that the oceans have absorbed “an increase of 240 zettajoules representing a volume mean warming of 0.09°C of the 0–2000 m layer of the World Ocean. If this heat were instantly transferred to the lower 10 km of the global atmosphere it would warm this atmospheric layer, on average, by approximately 36°C (65°F).” The scientists are quick to note that “earth’s climate system simply does not work like this”, at least not yet, but tropical storms already move more slowly and carry heavier rainfall, causing more flooding.
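The startling comparison in that quotation can be checked on the back of an envelope: divide the absorbed heat by the heat capacity of the air. The sketch below uses round textbook numbers (total atmospheric mass of about 5.15 × 10^18 kg and a specific heat of dry air of about 1005 J per kg per kelvin) rather than the authors’ exact layer mass, so it reproduces only the order of magnitude, a few tens of degrees:

```python
# Back-of-envelope check of the ocean-heat comparison quoted above.
# Round textbook values, not the authors' exact numbers:
Q = 240e21          # absorbed ocean heat, in joules (240 zettajoules)
m_atm = 5.15e18     # approximate mass of the entire atmosphere, kg
cp = 1005.0         # specific heat of dry air, J/(kg*K)

delta_T = Q / (m_atm * cp)  # temperature rise if all heat went to the air
print(round(delta_T))       # a few tens of degrees C, the order of the quoted 36 C
```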

When Andrew died in 1875 the CO2 concentration had risen to 290 ppm.

The Journals of Frank McGlone

Frank McGlone and his dog, 1875 (age 4), soon after his father’s death.

The untimely demise of a railroad worker brings our tale to Andrew’s son, my grandfather Frank. He grew up on the farms of the Wisconsin prairie, where thunderstorms scattered the hay, two-inch hailstones pounded the crops, and great snowstorms and winter temperatures of 18 below zero (−28 °C) were not uncommon.

Frank died six years before I was born, so all I know of his life comes from family photographs, the journals he wrote as a young man, and the reminiscences of his youngest child, Doris, recorded in 2003. Much of the following is described in her words.

Frank’s mother, Mary Theodosia (Dosh to everyone), a young woman of 24, was left with a meager widow’s pension, hardly sufficient to feed 6-year-old Frank and 2-year-old Mildred. The flat, rural area south of Fond du Lac and Lake Winnebago was the home of the Gates family, where Dosh lived with her sister in Oakfield, Wisconsin, along the Chicago and Northwestern railway. As a young widow, she found domestic employment nursing wounded Civil War veterans. Andrew’s sister Ellen and brother-in-law Jack Tuckwood looked after the two McGlone children in Janesville, a much larger community 90 miles to the south, where they could go to school with their first cousins, Frank, Cora, and Clara. 

Frank McGlone went to work at age 16, leaving Janesville to spend much of his time on the isolated farms near his mother. He had nearly completed high school. His diaries from 1886 to 1900 describe the life of a fatherless boy growing into manhood, doing farm chores all around Wisconsin. David, his oldest grandson, discovered the journals among his father’s possessions, “who as we well know, kept everything.” He handed over the twelve little volumes to his Aunt Doris, saying “These may be of some interest to you.” She painstakingly transcribed many pages, commenting where appropriate, and filling in some blanks.

There was always work for Frank in every season, and every able hand was needed. Farming – wheat, oats, corn, livestock – was the essential way of life. Unlike the eastern states, where convicts and indentured (or kidnapped) Europeans had made up the majority of agricultural labor, enslaved chattel labor had never worked the land of the Great Lakes. The Northwest Ordinance had declared the land north of the Ohio River to be free, but owners of slaves in the southern states, who had amassed huge fortunes in cotton, tobacco and sugar cane, destroying the land in the process, were eager to expand northward.

Wisconsin was a refuge for escaped slaves. Besides, with large-scale European immigration and availability of western farmland, maintaining an indentured labor force became unprofitable in the mid-19th century, at least until the near-enslavement of East Asian laborers to build the trans-continental railroad.

Expansion of the plantation system to the western territories met with popular resistance. Abolitionists founded the Republican Party in Ripon, WI in 1854 to champion the cause. After John Brown’s unsuccessful attempt to lead a slave insurrection in 1859, the election next year of Republican Abraham Lincoln led to nationwide polarization, and soon the secession of most of the slaveholding states. A brazen southern aristocracy raised the banner of insurrection. The Confederacy attacked the Union troops at Fort Sumter, SC shortly after Lincoln’s inauguration. The American Civil War began. Wisconsin sent 91,000 troops to the Union Army, boys and men toughened by the rigors of frontier life. Eight hundred of its sons died in a single 3-day battle at Gettysburg in order to protect the Union’s inland route to Pennsylvania, the port of Philadelphia, and the North. Those soldiers prevailed at an enormous cost in lives, and their loss increased the need for the labor of young men like Frank.

After his first year of work in 1886 on the Cooleys’ farm, Frank “bought a dandy suit of clothes for $15.” As the son of an engineer who died in a wreck, he was known up and down the line and “jumped” rides on the engines to visit his relatives and his younger sister Millie, who became a seamstress. She and his mother often patched the single suit that he owned for several years. Other than this suit, he seldom mentioned fashion, which likely yielded to the heat of the summers and the stunning cold of the northern winters. A new suit with kid gloves was bought in 1889. The earliest picture shows him in a high-collar shirt with rounded celluloid lapels under a woolen suit and a buttoned vest. A necktie, hat, and underwear completed his outfit.

Frank’s journals speak of the harsh monotony of farm life, broken up by religion and singing. Many weeknights in Michigan he attended prayer at the Young People’s Congregational Society of Christians. He studied mechanical drawing, and apprenticed as a machinist in the shops of DeGroat, Giddings and Lewis in Fond du Lac. Trained by his uncle Jack Tuckwood, he worked as a substitute in the switching yards, driving the smaller “special” engines, which could also taxi him to visit friends up the line.

Telephone switchboard exchanges first appeared in 1878. A phone conversation with a friend and the intermittent employment at the Giddings and Lewis foundry convinced Frank in 1890 that he should move to the bigger cities of Illinois. In 1891, by way of Milwaukee and Chicago, he landed a job at the W.F. Barnes company in Rockford, a maker of machine tools. Employment at the Barnes Works began just before Christmas, but settling in to a “church home” was not accomplished until late February. He attended a variety of churches, complimenting the sermons in each, but highly praised the First Congregational. This may have been influenced by the very welcome visit to his room from the minister on New Year’s Day. Still, the Presbyterians had his attendance the next Sunday.

This was an era in which the Christian Endeavor Societies (CE) were appearing in many Protestant churches with a variety of activities to appeal to young adults, many of whom had left their rural homes for jobs in the city. At 22 Frank was elected President of the local CE and opened his first prayer meeting in that capacity. That month he conducted hospital prayer missions and attended all the concerts of the Moody Bible Institute Quartet. In time Frank devoted weekly energy to the CE and was faithful to church services, prayer meetings, Bible classes, and pastoral lectures as well as the socials. By February, he was assigned to the Reception Committee to greet newcomers, and this was a role for which his years of lively participation in the Congregational Society of Christians at home had fitted him, as he organized the games, speeches, and refreshments. In his journeys to Chicago he “looked at Moody’s Bible Institute and attended a lecture.”

As Frank’s fortunes as a machinist improved, he upgraded his wardrobe. “Today I bought a new overcoat and cap.” He was generous to a fault, and felt a need “to send my checkered coat and a vest to Miss Footes for a missionary in Kentucky.” Weekly letters to Ma and Millie contained ten dollar postal orders at times. Every so often he gave “to a beggar 10 or 15 cents for a dinner or a bed.”

The severe weather of the winter of 1892 caused Rockford to lose its gas heat. Electric lights were so dimmed that shops, schools, and churches were closed. Among the disasters Frank describes were the horses falling into the rising river as the spring thaw began; the roundhouse burning, destroying an engine; the outbreak of typhoid fever. Frank and his co-workers had to be examined and vaccinated: “My lungs ache and my arm is swollen and sore.” However, there could always be winter fun, as on Sunday February 23rd when “Bobby Wills proposed after a church social that a dozen of us go in his two bobsleds hitched to his horse for a ride down hill. Everyone got wet and chilled but we sang and sang.”

The city of Rockford had a public library and there were also opportunities at the nearby College. “An excellent evening from ‘The Jinglers’ — a colored quartet.” Next week there came a Beloit College professor who lectured on his impressions of the Passion Play of Oberammergau in Germany with stereopticon slides. The Rockford Congregational Church held Literary Club meetings in which the topic was Thomas Edison’s inventions featuring a model of the phonograph. On another evening a missionary from China reported on his work with magic lantern slides. “We played host for all the concert attenders; the Mendelssohn program was so beautiful.” “My church has joined the Illinois Association of Congregational Churches. May go to the regional conference.” Next day: “Decided to stay and help with the statewide meetings to be held in our church for three days; much planning work.” But there was time for young men and young women to enjoy the summer outings. “We made the most of the moonlight nights in parks; also watched the Odd Fellows parade from a high tower downtown.”

Frank was chosen to be a delegate to the Christian Endeavor National Conference in New York City. He wrote his mother to ask if she had any objections. For a few days there was no response, so he went ahead. “Reserved a Y room and a berth on a Pullman. At planning meeting I persuaded the others that Niagara Falls Short Line is the train to take.” “The fund money is coming in. I included ice cream in the picnic day’s festivities and we made five dollars just there.” More socials were held to raise enough for the delegates. Then came a crushing blow. Frank’s employer, Mr. Barnes, would not let him have the three work days off. “Feel sorry and badly disappointed. Had a lovely letter from home, and everybody is happy to know I’ll be going and wish me well. It broke me up. And at the CE everyone is disappointed that I had to resign as a delegate.” Now he had to arrange that the room and berth reservations, which he had made at a good rate from someone he knew, could be transferred to his substitute. Six days later he bought a NY Mail and Express newspaper to read all about the conference, and then he waited for his friends to return with their report. “Now the Misses MacFarlands have returned from New York. We had a delightful evening rowing on the lake, and I heard all about their trip.”

While still at the Barnes Works, Frank looked for another job with disappointing results. Assignments at Barnes varied, and there were occasional difficulties which he took seriously enough for diary entries: “Fitting number 3 sleeves today but had bad luck; just got through spoiling one.”… “Glad to change to sensitive spindles today.”

Meanwhile, spare hours were put into running switching engines in the yards, and “that is a good place to see the loading of P. T. Barnum’s Circus. Saw the erection of the animal’s tent tonight.” Sister Millie came to Rockford for a “most enjoyable” visit. He drove her all over town, introduced her to friends and visited a seminary and the hospital. Then at night they attended an Amherst College Banjo and Glee Club Concert. While he went to work, Millie chose to visit on her own a kindergarten and an elementary school. (She was in teacher training and hunting for a job.) “Poor Millie,” he wrote a week later, “she has the mumps probably from the children she hugged at school.”

Frank had a cash surplus in April, useful for an expanding lifestyle: new shoes for work and dress; white flannel trousers for lawn tennis which he was learning; and a white tennis shirt. “And there is enough this month for five dollars to Millie for her music lessons.” Board and room continued to be $4.00 a week, and paychecks averaged nearly nine dollars a week.

From early July Frank’s mind was focused on acquiring a bicycle. Each entry cites a friend’s bike borrowed for “a fast run, you bet.” He shopped around and made comparisons. Then he took his first ride on a “pneumatic,” but the price was beyond him and he had to search for a good buy. As August ended he chose a Humber model for $75. In the same entry he complimented himself for doing a good job at work making perfect number 2 and 3 sleeves. The next three evenings were spent cleaning the bicycle. “I had great good luck getting my bike together; she runs fine, and I rode fast to Hiram’s where we masticated a couple of ripe melons.” Each successive visit to friends or attendance at the fair or inspection of factories was reported to the diary as “Good ride” with a list of times and a concluding “cleaned the wheels.”

Alas, the Works’ dynamo shut down too many days, and the hunt for a better job became more urgent. Frank was careful not to quit yet, and “I wheeled around doing my errands in great spirit until I broke my right pedal pin by turning a corner too short and lost 11 ball bearings.” He managed to repair the pedal pin at work over the next two days. “Looking ahead, I’ll continue Mechanical Drawing and pay the tuition for Geometry too.” He was just returning from a lecture at the YMCA’s Star Course, Old Ocean, Our Slave and Master, with magic lantern slides, when he got a message to see Mr. Weir in the morning. He had interviewed with the chief engineer of Ingersoll Milling Company once before. This time he could write “Struck a job this morning.” The next day at Ingersoll’s he was assigned to running the large planer, and “I did a good job but I am so tired tonight.” In the following weeks there were reports of “hard luck today; tool caught and broke and took a chunk out of a finished piece of work. Mr. Weir was very patient.” Again, “tool caught, tool fell, stubbed toe,” but these woes could be balanced against the 79 hours of work in a week that were now possible. Mr. Weir entertained the work force at his home for a pre-Christmas dinner.

As busy as he was with work and the long evenings of Mechanical Drawing for Perspectives and Geometry’s new discipline, he did have time to watch the political rallies — chiefly Republican in that part of the Midwest — “until election day when I voted for President for the first time. Voted Prohibition.” The temperance candidate John “Don Juan” Bidwell won 2.2% of the vote, while the Populist (People’s) Party won 8.5%. Democrat Grover Cleveland however won a plurality of the popular vote and defeated the incumbent, Benjamin Harrison.

On his 23rd birthday in 1892, his old friends Hiram and Dorothy made him dinner and a cake. A gift was a pen wiper. He returned that night to a “warm room; at last a stove has been put in my room and I have a cord of wood.” Work hours in November ranged from 10 to 13½ daily. $10 was paid monthly on his bike. “Tonight I could have prayer meeting in my room, but we were mightily disturbed by mice running around. They had to come in out of the cold. Sadly had to kill one.”

In the last weeks before Christmas Frank helped plan a big meeting when Billy Sunday, the popular baseball player who had been drawn to preaching by Dwight Moody, addressed the crowd in Rockford. He put on multiple performances per day attended by thousands, with the help of advance men like Frank, and accrued substantial wealth as a result, some of which was donated to the Chicago mission. The evangelical organizations such as the Bible Institute and Brotherhood had great influence in business, education and politics. They preached temperance and devotion to work. Their popularity only declined with the advent of radio and movies. Inspired by the sermons, Frank began a chapter of the evangelical Brotherhood of Andrew and Phillip with five of his friends, and was appointed to be the leader of their prayer meetings.

“Started early to reach home on Christmas Eve; after an absence of one year I will see Ma and old friends.” The list of expenditures for December details the gifts he took home: a book for Millie, carving set for Ma, handkerchiefs for Ada and Grandma, and a brush for Jessie. He attended church services with Ma and Millie in Episcopal, Baptist, and Methodist churches, thus seeing many old friends. Grandpa and Grandma Gates, at 92, were well enough to join all the relatives in “a very Merry day with Aunt Sophia [Mildred Sophia] and I enjoyed the cantata Christmas night.”

“Millie is 18 now, and we made all the visits in Fond du Lac together. I saw all the boys at the shop; then toured the creamery factory and the new yeast works. Saw new machinery everywhere. Back in Oakfield it is time for goodbyes.” “Reached Fond du Lac in time for a sleighing party all the way to Ketchim and the Fisher’s; had a jolly good time. Next morning Annie A. and I took a sleigh ride up around Breakneck. Very pleasant.” “Packed my game, cuff and collar box; and the razor bag that came with the little card ‘Fondest Christmas Greetings from Grace’.”

Frank had worked at the Burson Mill (maker and operator of automatic knitting machines) as well as Ingersoll, both enterprises begun in 1892. These industries were at the cutting edge of manufacturing, and their world-wide successors are in operation today. But the economy was weakening, work was short-lived, and shutdowns were all too common. 


The Gilded Age, Frank’s Ministry, and a Broken Engagement

The Rockford industries were built during Mark Twain and Charles Dudley Warner’s “Gilded Age.” This was an era whose prosperity was as thin as gold leaf. The riches of coal lay in the ground for the taking, but mining it and bringing it to market by boats and rails was labor- and capital-intensive. Great cartels were formed, backed by foreign and domestic bankers and politicians. The railroads could charge unreasonably high transportation prices because they owned most of the coal they transported; the price only shifted profits within their own conglomerate. The anthracite mines of Pennsylvania had to come under the control of the railroad cartels; those that did not knuckle under were starved out of the market.

For years after the 1869 transcontinental “golden spike” the railroads had expanded rapidly. At their annual peak in 1887, 12,984 miles of new lines were built on speculation that they would soon haul goods, much of it for export. This did not happen. London banking entered a crisis in 1890, caused by crop failures and excessive overseas speculation, that cascaded through world commodity markets for several years. The bankruptcy of the Reading Railroad and its chain of coal and rail monopolies on February 20, 1893 was prompted by what was euphemistically termed a “financial tangle” generated by the industrial barons J.P. Morgan and Archibald A. McLeod. The Philadelphia and Reading had been over-leveraged while expanding into New England, in competition with Morgan. The failure of the Reading triggered a severe nationwide banking crisis in 1893 that was not resolved for several years.

The Panic of 1893 threw millions out of work, including Frank. A banking crisis compounded the difficulties of heavily mortgaged farms, crop failures, and the mismanagement of a currency backed by precious metals. Businesses and manufacturers were unable to deliver cash to pay workers or buy materials. The panic brought an end to the boom in construction of railroads to nowhere. A quarter of the railroad companies went bankrupt, and unemployment spiked, especially in the Midwest. Across the nation, armies of unemployed workmen converged on Washington to demand action.

Frank’s job at Burson Mills ended in July 1894 as the shop closed. For a short time he was employed by an innovative Chicago bicycle manufacturer, Gormully and Jeffery. Frank’s hands turned the wheels of industrial progress for another few months, and then he went back to doing odd jobs on farms, living with his mother in Oakfield, Wisconsin.

The deepening recession closed most factories, and the railroad shops had no work for Frank, so he turned to the Christianity that had sustained him for a decade. In February 1895 he began training as a lay minister at what was to become the Moody Bible Institute in Chicago.

For his scholarship Frank spent much of his time scrubbing the basement hall floors and waiting on table, doing odd jobs and occasionally earning a little money working his machinist trade. His cash on hand was seldom more than a few dollars, but he gave money to beggars and paid his weekly room and board. He took correspondence courses in drafting from the Scranton School. His religious instructors told him to pay back the railroads from his savings, nearly a month’s wages, for all the free train rides he had taken. On April 4th, 1895 he wrote in his journal:

Went down to Chicago and Northwestern General Offices and saw President Hughitt and told him about riding over the road without paying for it and he said ‘Oh, dismiss that matter from your mind; we freely forgive you.’

He was “still burdened with it” and gave them $42.10 in payment of the rides. The next week he went to the Pass Department and paid $22.25 for having ridden on a free pass issued by a conductor. “It was a hard battle, but with God’s grace it was won.” 

The daily classes to prepare him to be a “Moody worker one-on-one” were called Personal Worker Training, and they met in a downtown Presbyterian Church. Additional Bible Study took place at the YMCA, and he attended a variety of Protestant churches and made brief notes on the sermons. The Institute maintained a Mission with meals at a busy downtown intersection, and it bore the un-Midwestern name of the Pacific (formerly Beer) Garden. It was here that trainees began their counseling outreach, seeing to the special needs of men there and conducting prayer meetings to “gather them all in.”

Moody’s Pacific Garden Mission in Chicago. From William T. Ellis LL.D. Billy Sunday: The man and his message.

The intensity of his encounters was expressed in his diary when he returned, sometimes by midnight. Both failures and successes in “saving souls” characterized his first months in the field. Some examples:

“Walked home with a man, Melville Mendell, who is deeply under conviction but is just not willing to ask Christ to come into his heart.”

“William Booth expects God to supply his needs when not yet a child of God — yet. He couldn’t give up his job singing in the chorus at McVickers Theater although he felt he ought to if he were to become a Christian. … Next day Booth resigned from the chorus, asked for forgiveness, and together we walked over to [the house of] Cousin Ada Worthing. He found a job that night as a pressman.”

“A con man hung around but wouldn’t give up his operations in several states yet.”

“An alcoholic man had lost his home, children, and wife; was truly broken up and readily accepted Christ.” “There was a good evening meeting and I sung one fellow down and another got mad and went out.”

“Helped a beggar to bed — a sailor who hadn’t been home for 11 years. He walked to the Mission with me but ducked out without an excuse.”

“William Booth has backslid and not come back to the P.G. as promised. Tonight I dealt 30 minutes with a young man without result as he wrestled with his backsliding.”

“Went down to the stockyards to look for a missing member of my flock — Charlie Apitz. He just disappeared. I saw how pigs and cattle are killed; it was very interesting and troubling.”

“A man came in for prayer, he said, but really just wanted a bed. He took the pledge but drink called him back. I try to understand. But even [cousin] Frank Tuckwood, who couldn’t be found last week, turned up in a saloon.”

“Tonight a man somewhat sobered up prayed and then asked for 10 cents to take a streetcar, and I gave it. Then I learned that he lives close by and I offered to walk with him. I asked for my ten cents back and he turned hostile and threatened to slug me. Had to leave him with just a prayer. We must get more jobs for our Pacific Garden men.”

“A man caught my hand as I left and cried out for the sins he had just committed and admitted that he was attracted to the Mission for the singing in preference to going home to wife and children.”

“Charlie Apitz eluded me for three weeks. But after some hospital visits Charlie has turned up again and is doing well. I had written his mother and told her not to worry — I felt he would return to Pacific Garden, but I told her where he could find housing. I found him a coat, vest, and pants.”

“Both Ma and Millie write me they are anxious about where I am working each night. But I am trusting in PHILIPPIANS Ch. 4 verse 19 ‘My God will gloriously supply all your needs with his wealth through your union with Christ Jesus.’ Well I have applied scripture readings to William Booth who has turned up again. Said he was afraid he would fail. Last night he entered saying ‘I’ve come back to the Lord.’”

While at the Institute, Frank became engaged to a co-worker named Belle, with whom he corresponded daily for a couple of years. She was based in Michigan, near Grand Rapids, and traveled as part of her mission work. Their ministries clearly took precedence over courtship.

Meeting back at the Institute, “We went our separate ways for a day”— Belle to a guest house by street car and Frank to a familiar room. He found it “good to be back—quite a few glad faces along with the new.” Some classmates had returned from foreign missions in Africa and Central America. Belle attended lectures and singing classes, and then it was time for her to return to Holland (Michigan) by steamer. He saw her off on the 24th and began a letter to her on the 26th.

Frank resumed his familiar life at the Institute with lectures and housekeeping chores. He became more reflective in some entries: “I thought I had made some progress in handling my anger with the help of prayer. But I am so vexed with a new requirement that I take a lunch tray daily to the rooms of Mr. and Mrs. Newell, the Senior Administrators. This is an unwarranted duty in addition to my work as a waiter at that time of day.” They did give him an extra 50¢ weekly. Another assignment, which he tried to resist but eventually had to perform, was going out on the streets to sell the Colp Publisher books of Moody tracts and gospel hymns. He found it difficult to be a salesman; after many unsuccessful tries he expressed real anger, certain that this was not the way to use his experience in leading people to Christian faith. “It was only through prayer that I could resolve this conflict and turn the anger into a positive determination to be out in the field.” He listed for August a succession of small jobs of cleaning and substituting in the dining room for small additions to his income. His regular routine from early morning until lecture time totaled 30 hours of work a week, and he found himself so tired that he sometimes leaped up from a nap, late for a meeting. Almost every evening was committed to working at Pacific Garden. He made contributions toward relief of the 1896–1897 famine in India, where five million died, and to other overseas missions.

In late 1897 Frank broke off the engagement. The journal for the year 1896, when he was courting Belle, disappeared. He burned a trunk full of her letters by the riverside.

Had a long talk with Belle this morning and confessed that I do not care so much for her as formerly. Had prayer and felt better, Belle and I drove to Saugatuck to visit her brother’s school. I knew it was the Lord’s will that we break our engagement.

After Frank’s training, he was sent on a mission to rural areas of the Michigan Thumb, where there were few churches, staying in the homes of his parishioners. He ministered to the McLeods, whose youngest daughter Sarah was in need of “saving.” He sang hymns with their older daughter who played a small organ. He made it his business to give her sheet music. His notations changed from “Miss McLeod” in March, to walking “Ellen” home in May. Was it long separations of missionary life that ended the previous engagement, or his attraction to the talented lady who would become his wife for 41 years until his death? Perhaps Ellen appealed more to Frank’s practical side than the missionary-minded Belle. Ellen had apprenticed to the millinery trade in Detroit, making “fashionable hats with very broad brims, feathers, piled decorations, and velvet ribbons running through.”

The money Frank earned preaching was not nearly enough to support a family. Resenting his position while at the Institute as a glorified butler to the Moody administrators, he returned to his mechanical trade in the cities along the Michigan shores.

Frank married Ellen McLeod at the Elk Presbyterian Church in Peck, Michigan. Ellen’s mother and then father had died when she was in her teens, leaving her with seven older half-siblings and three younger ones to care for, so her older siblings were the ones to give her away. A wedding announcement gives the couple’s home as Tawas City, on Saginaw Bay.

Ellen and Frank McGlone soon after their wedding December 19, 1900.

Ellen’s older half-brothers and sisters were scattered on rural Michigan farms near the towns of Amadore, Saginaw, Melvin, Caro, Vassar, Marlette, Brown City, Yale, Peck, and Port Huron. These were places where Frank traveled and preached, while sending gifts and keeping in touch with “Ma” and Millie back in Oakfield, Wisconsin.

Ellen’s grandmother Annie McIntosh, wife of Kenneth McLeod (d. 1853), had emigrated at the age of five to the Canadian wilderness. Born October 18, 1793, Annie died March 8, 1897 in her 104th year. Her family had settled in rural Glengarry, Ontario, but at age 20 she met the Michigan farmer Kenneth. They had eleven children before he was killed by lightning. Their son Findlay had several children before his first wife died. He remarried to Sarah Ann McLain, mother of “the second family” of McLeods, including Ellen.

Many years later, Frank’s daughter Doris described Ellen’s “Northern Michigan branch” as farmers with no running water but who were at the same time “free, earthy, and wonderful, feeding the chickens and drinking milk fresh from the cow.” Aunt Flora, “kind of smelly, as if clothes had just come out of mothballs—or should have been in mothballs”—represented this group. “She had many children, some of them just barely scratching out a living. We went to the house of one in which there were the eight grandchildren, who tried to make a meal for us. And we were bringing in as much food as we could tastefully do and not look as if we were dispensing welfare, so, another time they could visit us, with their heads held high, bringing the eight children. The best thing we had to offer in our house for entertainment was the flush toilet. It never had a rest. Not for usual purposes, but just because they thought it was so marvelous.”

Frank’s daily journals as a Bible Institute minister were replaced by terse records of hours and wages. He went to work in the Tawas, Michigan and Rockford, Illinois railroad machine shops, wherever work could be found. Wages were skimpy, $0.34 an hour, sometimes a penny more. Ellen gave birth to a son, Elno, then tuberculosis struck. In those days the only therapy available was a dry climate and mountain air, so they moved to Colorado, where Frank could work 50–60 hours a week in the machine shops, sometimes 10 or 11 hours a day. He worked for the Colorado and Southern, Grand Trunk Western, and the Denver and Rio Grande Railway. His sister Millie and brother-in-law lived nearby, helping the young mother, but the second child, a baby boy, died from tuberculosis. As his daughter Doris recalls, “For a long time, there were three enlarged photographs in the oval gold rim frame on the dining room wall and it puzzled me very, very much that I didn’t know one of the faces and I questioned it because the boys being at that point one and two are in long dress, white dresses; the baby’s in a sort of christening gown. Finally, it was explained to me that there had been little Carl.”

Heartbroken, Ellen insisted they return to Michigan, despite her ongoing struggle with the TB that had taken Carl’s young life. Five months of illness and unemployment also beset Frank while in Colorado in 1907. Ellen’s brother-in-law Perseus “Perd” MacGregor, involved with the Oakland Motor Car Company in the Detroit suburb of Pontiac, Michigan, helped Frank find a job in the area. The McGlone family moved to Flint, Michigan. The 1908 October–December journal entries show Frank’s hours of work for a small carriage-maker-turned-automobile company formed by a Scotsman, David Dunbar Buick. They were steady earnings, ten hours a day, six days a week at $0.30 an hour (with no overtime bonus).

Perd MacGregor rose to a high position in the Pontiac company. He and his wife Sarah, Ellen’s youngest sister, sat in the company pew in the church on Sunday. Doris recalls, “At the end of the church service, we’d all meet in the lobby, have a little chat and I would be standing with my tongue hanging out, hoping that Uncle Perd would say to my mother, Ellen, let’s take a little spin around the block before we go home. Ahhh! That meant a ride in the country, the window down, the radio playing and I loved that!”

General Motors and Flint, Michigan, the Vehicle City

David Buick introduced the modern overhead valve gasoline engine that allowed higher compression and more power than the slide valve or L-head designs inherited from the days of steam. He sold 750 of his stylish blue vehicles with a powerful, 21-horsepower motor and ivory-colored, wooden spoked wheels, but he could not meet expenses and was bought out by 1906, losing money on his further ventures. His name brand continued without him, and his genius was only recognized years after his death in 1929.

The Buick production line moved to Flint in 1908, where the company and Pontiac Motors were taken over by William Crapo Durant’s Flint Carriage Company. Mr. Durant soon absorbed two dozen other automotive enterprises. Thus was born General Motors (GM), for most of the next century the largest motor vehicle manufacturer in the world. From 1908, Frank was to work at Buick for twenty-four and a half years.

From the hours and dollar amounts in Frank’s journals, GM’s wages were less than the $60 or $70 a month that he was earning in Colorado, but the hours were steady, and a few years later Frank purchased a house at 912 Young St. in the Central Park neighborhood. Additional revenue came in 1911 from raising chickens and a rooster on wheat and cracked corn, after investing in glazing for a henhouse. In 1912 he had a cow whose milk was sold, 8 quarts a day at 7 cents a quart, with expenses for bran, corn meal, and cottonseed meal. He paid a dollar in July for the cow’s damage to the Higbee family’s garden, but sales of milk brought in several dollars a month. The GM factory town expanded and so did the family. When Mildred was born, he built an additional room on the second floor of the house for the boys’ bedroom. Years later, to everyone’s surprise, Doris arrived. Frank, now 50, built a bay window so that Ellen could sew, raise the baby, and make her fantastic hats.

Frank’s diary 1907, hours and wages from Denver & Rio Grande; 1908 Buick Motor Co.

The start of the 20th century marked the beginning of the addiction to the concentrated fuels that powered modern forms of transport, especially the automobile. Coal and steam could not compete with the instant response of gasoline and internal combustion. Flint, sixty miles northwest of Detroit, soon became the automobile capital of the world, with General Motors at one time the world’s largest corporation. GM’s extravagant profits funded executive mansions and corporate jets, but the riches did not filter down to the factories in Flint. Faced with the superior quality and fuel economy of imports, its market share declined. GM chairman Roger Smith claimed that he would out-compete Japanese imports if “technological change suddenly made it possible to make gasoline out of seawater such that the price of gasoline would drop a nickel.” And then factory after factory closed down.

Hitchhiking through Detroit in the late 1960s, I caught a ride from an autoworker in his old car. He had been laid off for two years but recently rehired. “The factories are building tanks for the Vietnam war,” he said with disgust.

Flint/Grand Blanc is now a monument to the decay of the domestic automobile industry. Flint’s one-time, six-figure manufacturing employment dropped to 55,000 in 1990 and is now 13,200 jobs. The city is known for the poisoning of its water supply by politicians, who in 2014 switched the drinking supply to inadequately treated Flint River water in order to save money. Within a month there were widespread complaints from residents, and GM discontinued using the now-corrosive water supply. “The Flint water crisis now enters year eight. … Meanwhile, the population of Flint has fallen to fewer than 95,000 residents, the lowest in more than a century. Not all of the city’s lead service lines have been replaced, and residents’ deteriorating home plumbing, also damaged from the toxic Flint River water, have not been addressed, nor the widespread contamination left by leaded fuel, batteries, and coal burning. Residents still complain of rashes, hair loss and other ailments from the city’s drinking water.” While at least a dozen people died from acute ailments, the Michigan Supreme Court ruled in 2022 that a judge had no authority to issue indictments in the Flint water scandal, a decision that wiped out charges against former Gov. Snyder, his health director and seven other people. All charges lodged against any of those responsible for the ongoing water crisis were dropped.

Those who believe that the system of corporate power will eventually right itself should ponder the wreckage left behind by General Motors in Flint. The tangled prosecution of those who poisoned the water cost the state $60 million, and the criminals simply ran out the clock.

Frank’s Children: Planes, Paints, and Automobiles

Frank rose in pay as a time-and-motion piece rate supervisor at Buick. An expert machinist, he was suited for this job, which involves measuring the time it takes to perform each repetitive motion required to produce a given part. Any unnecessary movements are eliminated, and thereby production quotas are set, as opposed to assembly line work where the speed of the line dictates the pace of the work. This system nominally provides incentives to work harder than an average skilled laborer and turn out more pieces per shift, at the cost of eliminating any time for muscles to rest. As the quality of the materials and the condition of the tools vary, the rates might be raised or lowered accordingly, but the object of the piece rate system is to lower the cost of production by speeding up the pace of labor. As such, it was subject to abuse by supervisors through sloppy record keeping or rates set so that workers could never earn higher wages. The arrangement was also known as the “sweating” system; not until 1938 did the Fair Labor Standards Act establish a minimum wage and compensation for overtime. How Frank felt about this work is not recorded in his journals, but his involvement in the temperance movement and the Moody Bible Institute had surely given him a reputation for industriousness and high personal standards, as when, fifteen years earlier, he repaid the Chicago and Northwestern Railroad for rides that he had taken for free on their engines.

During the years Frank worked for GM, the corporation had dozens of brand names: it absorbed Buick, Oldsmobile, Cadillac, Oakland, Pontiac, and Chevrolet, and diversified into electrical products (Delco/AC, Frigidaire), heavy trucks (GMC), and briefly tractors. His wages were used prudently. He set aside contributions to a company pension fund for his retirement. These were frugal times, when the amenities were fifty-cent blocks of ice for an icebox, a flush toilet, a hazardous wringer washer, and a grape arbor in the back garden. As did so many others, he purchased stock in GM. The lessons of the periodic commercial crises of the railroad boom were forgotten in the giddy atmosphere of the roaring ’20s.

Frank’s son Ronald made paints at DuPont Chemical, holding several patents. He marketed the shiny Duco lacquer coats that made those four-wheeled stinkpots more attractive than Henry Ford’s black Model T’s. Ron’s face was badly burned at work. His daughter Mary Jean contracted pneumonia at the age of seven, a few months before the patenting of life-saving sulfa drugs. They borrowed money to pay for oxygen tanks, but nevertheless she died. Ron himself succumbed to cancer in 1974.

Frank’s youngest son Milton also went into the auto industry, but he suffered from tuberculosis (as had his mother a few years earlier) and spent a year in a sanitarium. He could no longer do the work at Chevrolet Parts and Service, so he and his wife opened a Dairy Isle franchise restaurant across the street. They worked long hours and made the best hamburgers in Flint, but they could not sustain it. He sold that business and had a miserable few years working in a public welfare office, eventually earning a business degree. He joined a banking trust division, but died in 1972 from pancreatic cancer.

Frank’s older daughter Mildred had many aptitudes and did well in school, but she felt marginalized in the family because of her “girlish” ways. She envied her cousin Bernice’s rich wardrobe, even though her mother tried to create almost equally stylish dresses. Bernice went to a finishing school paid for by Perd MacGregor, while Mildred’s life at home with her puritanically strict parents was so stressful that she lived for a time with her brother Ron and his wife. She became pregnant while in college and married a med student, who was also a fairly successful poker player. The doctor moved in social circles and expected as much from her. His practice began in the Deep South, where she was to be merely a gracious hostess and well-dressed wife and mother. But the doctor’s abusive, social-climbing mother never found her “good enough” for her son. Her self-esteem was crushed.

Frank’s oldest boy Elno Duncan paid his way through Alma College working summers as a Fuller Brush salesman. He began a teaching career at Flint Central, but gave up that job in 1927 for the excitement of a commercial startup in Chicago. With W. B. Stout and Henry Ford’s all-metal Trimotor airplanes, he marketed flights to Cleveland and Detroit. As the company merged into the United Aircraft and Transport Corporation, he honed his skills with a camera and distinguished himself by making scale models to promote the future DC-3 and Boeing 247 airplanes that would make a profitable business of coast-to-coast travel. Elno rose to become a Vice-President for advertising at United. His ad copy, titled The Romance of the Mainliner, ran in the glossy magazines. His passion for photography allowed him to travel the continent and eventually the Pacific Ocean. His love of cinema led him to move to California, opening a commercial studio partnership, “Cate and McGlone,” in Hollywood for travel films. He contracted with American Airlines, Pan Am and United, making films promoting tourism to the exotic paradise of Hawaii. In later years United flew Frank and Ellen out to see Los Angeles, San Francisco, Portland and Seattle on the company ticket.

Stout Air created daily scheduled passenger routes. Commercial aviation had carried passengers for several years as an adjunct to the profitable US Mail contracts, which shortened delivery in the mountainous West from days to hours, albeit with considerable loss of planes in dense fog and wind. While leather-capped pilots would often parachute to safety before their planes crashed, passengers were seated on mailbags, if there was room, and provisioned with parachutes (but not instructed in their use). Multi-passenger, inter-city flights became more common in the 1920s. Henry Ford built an airstrip and terminal building in Dearborn, Michigan, with booking and bus service at a downtown hotel, marking the dawn of fashionable air travel. Soon, Amelia Earhart was marketing “active wear” flying fashions. Henry did not allow Ford Field to be operated on the Sabbath, so all Stout Air Services flights were advertised as “Daily Except Sundays.” But among the greatest innovations of Stout Air were the “Uniformed Airline Pilots” with their peaked caps and embroidered wing insignia. Denoting authority and professionalism, these replaced the aviator’s “bomber jacket” daredevil look.

Elno McGlone with Stout’s scheduled passenger airplane, 1927.

By 1960, as Elno had predicted, air travel had exploded in the US. Passenger jets could fly cross-country in the stratosphere, above the turbulent clouds. Air passenger miles doubled every decade for more than half a century, adding to atmospheric pollution and pausing only for the global pandemic of 2020, while railroad travel declined and rights-of-way were abandoned.

Air travel, as fuel-intensive as it was, never came close to the ten-times-greater energy consumption of cars and trucks, due to the prolific post-war production and marketing of automobiles and the National System of Interstate and Defense Highways, the “Interstates,” authorized in 1956. Those vehicles and roads required concrete, asphalt and steel in enormous quantity, each contributing massive additional amounts of carbon dioxide to the atmosphere.

Elno’s youngest sister Doris, and son David McGlone, with Boeing 247 model, 1933.

With increasing ownership of automobiles, public transit on the once ubiquitous electric trolleys stagnated. Transit lines suffered a precipitous decline after they were taken over by General Motors and related investors. The National City Lines corporation (NCL) purchased the streetcar system in Baltimore in 1948, and converted the system to run on GM’s buses. Ridership then plummeted by double digits in each of the following three years. An anti-trust suit convicted GM of conspiring to monopolize sales to local transit companies controlled by NCL. Upheld on appeal in 1951, the verdicts fined GM $5,000 and its treasurer $1. The electric trolleys that Frank rode to the factory became a thing of the past.

Goldman Sachs, GM, and the Great Depression

The monetization of the auto industry in the 1920s led to frequent bankruptcies and ended many of the brands whose names are now forgotten in automotive history. Widespread speculation, insider trading on the stock markets, and outright frauds perpetrated through stock issues and reorganizations, secured by equally fraudulent paper, could not last forever, in spite of the pervasive euphoria induced by ever-rising paper values. Many who invested their life savings in the stock market lost everything. In 1931, GM was the world’s largest automaker. By 1933 sales of automobiles had dropped by 75% and GM had laid off most of its workers. Yet by selling cheaper models, and with those layoffs and its overseas subsidiaries, GM made a profit in every year of the Great Depression.

The explosive growth of fuel-burning capitalist production, powered by James Watt’s steam engine, gave birth to the science of political economy. The Scottish Enlightenment produced Adam Smith’s Wealth of Nations, followed by French political economists such as Jean-Baptiste Say, culminating in Karl Marx’s Capital. After these definitive works, the profession of economics was restricted to the axioms of capitalism: the individual is the basic economic unit, everybody pursues their own self-interest, and the marketplace is the ultimate judge of societal good. As well as denying the evolution of social consciousness, altruism, and ultimately, love, these axioms simply reduce economists to being touts for the stock market. As the Harvard economist J. K. Galbraith described it, his profession had abandoned science to become proselytizers for a system in which “competition was an effective substitute for honesty.”

For example, in December 1928 the wizards of Goldman Sachs issued stocks in their eponymous Trading Company, in which they immediately purchased a controlling interest for a few million dollars. It appeared to be a financial miracle after they pumped up the stock by these purchases, doubling its value in three months. The Trading Company sold a quantity of shares to the famous founder of GM, William Durant, who then resold his shares to the public. Goldman Sachs launched two major stock offerings in the next few months, the Shenandoah and Blue Ridge Corporations, together worth nearly a billion dollars in early 1929. The enthusiastic directors came from every major institution, including the noted corporate lawyer and future Secretary of State John Foster Dulles, for whom Washington’s international airport is named. Within a few months these corporations had acquired several major banking and investment firms, and in a few days in August had launched seven more international investment trusts, with world-renowned directors and gilt-edged securities sold to the public. The market reached its peak on September 3rd, 1929. The crash on October 29 exposed those paper thefts, and myriad other frauds. It took four years for the Trading Company to unwind its stocks, which by 1933 had declined to a mere 1% of their marketed value. Fast forward to the housing mortgage crisis of 2008, after which the wizards of Goldman Sachs, having obtained $10 billion in bailout funds, were penalized for ‘serious misconduct’ that inflicted harm on investors and the American public at large, and precipitated foreclosures on homeowners.

Frank was laid off from Buick just short of the age of 65, and was denied a pension, although he had contributed to its fund from his wages for years. The new GM manager was an outsider appointed by Alfred P. Sloan, Jr., under whom GM’s highly profitable German Opel division had, in the 1930s, made possible Hitler’s rebuilding of the army Panzer divisions and the subsequent Blitzkrieg invasions. The manager instituted job cuts and forfeitures that none of the old guard, who knew the workers well, would have had the stomach to carry out. Doris said, “Any of the presidents of any of the branches of General Motors at that time might not have been able to carry [layoffs] out to the degree that the company wanted because they just couldn’t do it to the men with whom they had grown up in the auto industry. There were feelings there. … A man was brought in from outside. He didn’t even look like the other people, [laughs] we thought, and so most of the talk was of his ruthlessness.”

Frank made the rounds of many auto companies after the layoff, hoping for a “Chevy job.” Every man of age 55 or older had been cut. A family friend in Frank’s section was so depressed that he committed suicide after months of unemployment. Frank worked odd jobs, painting and the like. There were offers of help from his sons, but money was very tight, especially with the constant illnesses and hospitalizations of Elno’s son David.

All the McGlone family savings were lost in the Michigan bank crisis of February, 1933, eventually recovering only 20 cents on the dollar. Frank held on to his GM stock, hoping it would rise to $100 before he had to sell (it sank). He remortgaged the house on Young St. and rented out all but two rooms, grateful that they did not have to give up the spacious home in which he and Ellen had raised five children. As his savings dwindled, he surrendered life insurance policies for cash. Through his church connections he was offered a position as a janitor at Flint Central while Doris was a student there. This was a bitter demotion for a piece-rate manager and skilled machinist at the largest automobile manufacturer in the world. Frank declared, “You will have to find a new sobriquet for me instead of Lazy Fellow. I have joined the Exalted Order of Janitors at Central High, age no bar.” His wages were 45 cents an hour.

Frank struggled to pay the bills on his meager salary, taking on extra duties when he could, but cancer struck him and he became too sick to work. He went into a hospital, where his family took shifts feeding and caring for him. Doris and Ellen lived in one room and shared a kitchen with another family, after renting out the rest of the house. Ellen’s bronchitis worsened after years of recovery from tuberculosis.

There were hospital bills. The grown children chipped in as much as they could. “This is a time, you know, no Medicare, no Social Security,” Doris related many years later. “There is nothing to support families with these expenses.” Even after renting out all the rooms in the house they could not afford to pay the mortgage, and they had to sell the house in Flint.

Frank McGlone died on August 4th, 1941 after a long struggle with cancer. Although he had attended meetings in Colorado, Frank was not a union man, and never was a word spoken against GM in his household. Frank and Ellen had lived a frugal life, never beyond their means. The children did not get new sleds under the Christmas tree. Oil was spread over their unpaved side street in summer to settle the dust, but the bills were always paid on time. The children all went to college. Frank never owned a car.

In 1941 the CO2 concentration had risen to a plateau of 310 ppm. The roughly 10% increase from the pre-industrial era, after more than a century of power generation by coal, was slowed somewhat by the worldwide depression and the wartime curtailments of industry.
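The “roughly 10%” figure can be checked with a line of arithmetic — a minimal sketch, assuming the commonly cited pre-industrial baseline of about 280 ppm (a value not stated in the text):

```python
# Back-of-the-envelope check of the CO2 rise cited above.
# Assumption: pre-industrial baseline of ~280 ppm (standard reference value);
# the 310 ppm figure for 1941 comes from the text.
pre_industrial_ppm = 280.0
ppm_1941 = 310.0

increase_pct = (ppm_1941 - pre_industrial_ppm) / pre_industrial_ppm * 100
print(f"Increase since pre-industrial times: {increase_pct:.1f}%")
```

The result, about 10.7%, is consistent with the “roughly 10%” stated in the text.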

Michigan Snows

On a snowy afternoon at her home in Maryland, Frank’s daughter, Doris Elaine McGlone, now in her eighties, reminisced about her childhood in the company town of General Motors. Michigan had enacted free public schooling in 1867, and the original 1875-era high school was soon outgrown. The four-story Flint Central building, with its two swimming pools, gymnasium, and dramatic tower, was built in 1922 on Crapo St., named for the GM founder’s grandfather. Schooling in Flint was “the epitome of education with a reputation for excellence throughout the state, and Flint Central was the forerunner.” A middle school and a junior college were added next door.

The hillside next to Flint Central was often used for sledding in the winter after school let out. Her fondest memories were of the days when the snow had been flattened by toboggans into an icy glaze, and pieces of heavy cardboard carried those without sleds down “break-up hill.”


Flint Central High School (1922), after it closed in 2009.

Frank had a conversation with Doris when he took the janitor job at the high school. He told her she need not be seen with him if she would be embarrassed. Doris was outraged that he would even consider such a thing. She loved her father dearly, strict as he was. His job also had benefits — the keys to the library were hers to use, and she had the run of the school after hours, as an editor of the school newspaper and then the Yearbook.

Doris had witnessed the December 30, 1936 Flint auto sit-down strike as a sixteen-year-old in high school. The workers remaining at GM, after the previous years’ layoffs, had rebelled against repeated speed-ups, wage cuts and management indignities. Spontaneous job actions had been conducted for months, all over the industry, but had not won decisive victories, leaving room for GM to maneuver and stall negotiations. Their patience exhausted, the autoworkers seized the key Fisher Body and Chevrolet production plants and held them for 44 days. When the bosses got wind of the union’s strike plans, GM tried to lock the workers out and take production elsewhere, but the workers refused to leave the factories, shutting down the essential production of stamped metal parts. They fought the scabs, strikebreakers brought in from out of town. They beat back the police and neutralized the largely sympathetic National Guard. During the strike, families threw meals through the open windows of the shop and women bravely surrounded the phalanxes of police.

Doris’s brothers were at the battle. Milton was trapped inside the plant in the clerk’s office, and Ronnie worried because his DuPont paint employment was now in jeopardy. Witnessing the strike was formative to her later rebelliousness, but at the time young Doris was most upset that the National Guard soldiers took over the high school grounds, its skating rink, and snow sledding hill.

Graduating in 1937 at the height of the Depression, Doris went to Flint Junior College (now Mott College), living at home and mending books in the library for 15 cents an hour, raised to 30 cents by the National Youth Administration (a part of Roosevelt’s New Deal). The city was ready to close the College for lack of funds, but after their library work was done, Doris and many others went door to door soliciting funds from the alumni to keep it open.


Vandalized music room at Flint Central High, 2023.

The last class graduated from Flint Central High in 2009, closed at the height of another auto industry crisis. It is now a derelict building, roof leaking and walls overgrown with vines, no money being provided by the Board of Education for its upkeep. Nor is the estimated $3 million available to demolish it, so it sits deteriorating and vandalized, the back entrance wide open and unguarded. The floors that Frank had cleaned and waxed are now buckled beyond repair. The music room’s grand piano is trashed. Graffiti covers everything. The cabinets full of beakers and flasks in the chemistry classroom that Doris so loved are smashed on the floor. This once-magnificent building is a glimpse today into the future of the automotive and petrochemical industry — the abandoned factories and refineries leaving behind a costly cleanup.

Doris won an alumni scholarship to transfer to the University of Michigan in Ann Arbor, where she resided in the prestigious Martha Cook dormitory. She was one of the “back mezzanine gang,” living above the kitchen and waiting on tables to pay her room and board. The kindly dietician made sure her staff got orange juice in the morning, with seconds, and dessert at night, with seconds. Doris came to love grapefruit and avocados, luxuries the McGlones never had. She put on fifteen pounds and her tomboy body started to show curves through her rather shapeless dresses.

In the kitchens, the Martha Cookies were radicalized by talk of unions on the radio, and they felt that they should get equal pay with the other university cafeteria workers: forty cents an hour, not thirty-five. On a faculty dinner night, they paraded around the tables with the dessert trays, but left them on the serving table. The next morning their strike was beaten down by threats of losing their scholarships, and they had to apologize. But from that time on, Doris and her sidekick Jean Maxstead would burst into the Marseillaise at the drop of a hat.

Doris graduated Phi Beta Kappa from the University of Michigan in 1941. Like Jean, she had aspired to go on to graduate school, and was awarded a graduate fellowship at UM, but her brothers advised her to forgo further education to get a better-paying job as a teacher. She had earned a teaching certificate for high school social studies, and she so impressed the superintendent of Dearborn’s Henry Ford High School with her teaching knowledge that she was offered a job on the spot—at a better salary ($1300 for the school year) than Flint’s.


Doris McGlone and Jean Maxstead at graduation, 1941, Kodachrome slide.

That summer, Doris worked the four-to-midnight shift at GM’s Albert Champion (AC) Spark Plug factory, at a rate of 65 cents an hour. She turned all her money over to her brothers to cope with the hospital bills for her father. This was fine; all that she needed for herself were “stamps and envelopes to write to [her fiancé] in Buffalo. You didn’t phone in those days.” There was no money for a private duty nurse for her father, so the family took shifts caring for him. Doris came in at lunch time.

Flint filmmaker Michael Moore’s father, a veteran of bloody landing battles in the Pacific, took a job on the assembly line at AC after the war and held it for the next 35 years. Like many other assembly line workers, he started a family on 65 cents an hour pay. After GM’s drastic layoffs decades later, Michael Moore exposed General Motors’ long trail of destruction in Flint in the award-winning 1989 movie Roger & Me.

When Frank McGlone died at the end of that summer, Doris had a few hours off from the factory for the funeral service. Frank’s 90-year-old mother, Mary Theodosia Gates McGlone, attended his funeral. Shortly after, “Dosh” was struck by a turning car in downtown Flint while stepping off a curb. Since the family had no money for a hospital, she recovered from a broken hip in the “county home” poorhouse. She lived independently for another six years, long enough to meet her youngest great-grandchild Christopher, to hear of the atomic bomb, and see the modern age.


Mary Theodosia “Dosh” Gates McGlone, ca 1946.

Part IV Pacifism and Pearl Harbor


The Neumanns of Buffalo, NY

My father, William L. Neumann Junior, grew up in Buffalo, New York. His father, “Senior,” was a German Catholic, engaged in the plumbing trade in Buffalo, the home of the American Standard radiator and porcelain fixture brands. He married a devoutly Catholic lady and built a well-framed house, shaded by a stately row of Lombardy poplars. Concord grapes covered an arbor next to the garden. The household was proper and pious. Fish was served at dinner every Friday. A gingerbread clock on the fireplace mantle chimed the hours. The wallpaper inside the house had precisely matched seams. It was said that when Senior had berated a tradesman for a mismatched pattern inside a closet, the fellow stormed off the job, leaving his tools behind, and never returned.

Bill’s older sister Beatrice married a Cleveland, Ohio man and gave birth to a son, but she died shortly after. Bill’s nephew Clifford was raised as a stepchild in a well-to-do family. Bill’s older brother, with a deep booming voice, played ice hockey like every young boy in Buffalo, and joined the family business. He and his wife never conceived children and instead adopted a boy and a girl (it was said that a hockey injury had the same effect as a vasectomy).


William L. Neumann Sr., daughter Beatrice (1901–1925), and wife Elizabeth Boller.

The garage next to the house had shelves full of pipe fittings. There were buckets of lead for soldering. The plumbing parts were degreased in carbon tetrachloride (now known to be a greenhouse and ozone-depleting vapor and a potent liver toxin). The business prospered, with a grease pit and a big red fuel pump behind the garage for its trucks.

Left: Elizabeth C. Boller, ca. 1890. Right: Elizabeth Boller at her wedding to William Neumann, Sr., 1900.

Knowledge of my paternal lineage commences with a brother and sister of the Brandenburg Neumanns who emigrated just prior to the Franco-Prussian War of 1870–71. Frederick Neumann took his family’s building trade to Buffalo, NY, where William Neumann Sr. was born. My handsome, cigar-smoking grandfather had gradually lost his German identity and spoke the language only rarely. William Jr.’s mother Elizabeth, née Boller, was from Alsace-Lorraine, territory ceded by France after the Franco-Prussian War. Only a trace of these emigrations is preserved in the family correspondence.

Gregory F. Neumann, Frederick H. Neumann, Bill (Jr.), William Louis Neumann Sr., in Williamsville, NY, 1916.

Bill, the young William Jr., was a free-thinker and ladies’ man. In an all-too-common occurrence of the era, he had been struck by an automobile at the age of 15 and was thought to be on death’s door with a severe concussion. A priest had administered the sacrament of extreme unction. Bill thought that through this holy blessing, the Lord had absolved him of sins he had committed or would ever commit in this life. After recovering from a morbid diagnosis, and with a rather liberal interpretation of scripture, he was no longer bound by a strict Catholic moral code and acted accordingly. His youth was spent sneaking into dance halls when he couldn’t afford the admission. When he could afford tickets and go on dates, he kept meticulous balance sheets of expenses and romantic rewards.

With a reputation for being the “black sheep” in the family, Bill abandoned the plumbing trade for the seductive allure of the intellectual world. It was the height of the Depression and there wasn’t much work anyway. He paid his way through State Teachers College at Buffalo (now Buffalo State University) as an assistant grading papers. In the summers he earned money as the head lifeguard at the city pool and fed himself on free ice cream at the pool’s refreshment stand.

At the college, Bill was strongly influenced by Arnold R. Verduin, a socialist, pacifist scholar. Dr. Verduin had specialized in Spanish constitutional history, and under his guidance, Bill was admitted to the doctoral program in history at the University of Michigan. He pursued an interest in Latin American studies, and minored in the new field of sociology. It was there, at that fateful lecture by Mlle Curie, that he met the young lady from Flint. He was smitten by the woman who was his intellectual equal despite being nearly five years younger, who liked to dance, and was pretty besides.

The couple were influenced by the “revisionist historians” who questioned the “war guilt” placed solely on Germany for the origins of World War One, a deadly contest over territory and spheres of influence with the British and French colonialists. The post-war declarations of the Versailles Treaty demanded reparations to be paid by the German people, who were at that point dying of starvation by the tens of thousands and revolting against the Kaiser. The young historians sought to interpret all policy decisions in light of the horrors of the war and of what they held to be President Wilson’s mistaken involvement. After WW I, many students aligned themselves with the British Peace Pledge Union, even after the fall of France in 1940. The Oxford Pledge originated at the Oxford Union, Oxford University, England. The signer vowed to refuse to participate in war, and the pledge became a feature of many pacifist and anti-war rallies at British and American universities in the early 1930s.

A Gallup Poll in the US as late as February 1941 found that 85% of those interviewed were opposed to involvement in the escalating European and Sino-Japanese wars. The popularity of historical revisionism and Roosevelt’s supposed non-interventionism led more than a few intellectuals into the same camps as the America First isolationists, some of whose members were anti-Semitic Nazi apologists. The revisionist historians were scholarly types, not today’s blatant Alex Jones-style deniers, but their sophistry fed the more rabid, anti-Semitic and pro-Nazi press. The young pacifist couple’s moral compass was certainly stronger than that of their friend Harry Elmer Barnes, who was accused of being a self-serving Holocaust denier after his theories were challenged in academia.

Students of the day knew the horrendous consequences of World War I and the punitive, crippling peace terms established by the post-war Versailles Treaty. There was a generalized abhorrence of war and its wastefulness. But Barnes had shocked Doris with suggestions that the British Foreign Office had been planning to invade the Low Countries of Europe to challenge Germany for domination of the continent.

Bill and Doris both came from strict, church-going families, but there was a crisis of faith after the war. When asked how she came to reject her religious upbringing, Doris explained, “Ah, the scientists are to blame!” And the most radical ideas of the day were indeed springing from science, particularly the transmutation of atomic elements and the release of enormous amounts of energy.

A greater influence and clearer voice was the socialist Norman Thomas, a pacifist clergyman who was no friend of any bellicose nation. Having lost the support of the Presbyterian church in New York for his stance against entering WW I, he became a leader of the Socialist Party and maintained his anti-war stance even in the face of the rise of fascism and oppression he witnessed when visiting Europe in the 1930s. Socialism was trending in the universities, and students were overwhelmingly opposed to going to war in Europe, but the pacifist conviction of my parents was deeper than most, rooted somewhat in their religious upbringing, which they never completely renounced, and informed by their historical training.

Bill’s pacifist principles placed him at odds with his family and the local draft board. Unlike the situation in 1917, the compulsory draft instituted in 1940 and renewed in 1941 allowed for placement of Conscientious Objectors (COs) in the unpaid Civilian Public Service (CPS) camps. The historic Peace Churches, Brethren, Mennonite, and Quaker, agreed to run the CPS camps for the Selective Service under the newly instituted draft, to avoid the brutality and imprisonment that pacifists suffered during WW I. Their uneasy relationship was the best that could be brokered over the opposition of Roosevelt and most of Congress. The aim of the camps was to provide “work of national importance” for those whose conscience did not permit killing or serving in the military. Prior to the outbreak of war with Japan, service was assumed to be “for a year and a day.” But after the fall of France and the British army’s narrow escape from capture at Dunkirk, after Pearl Harbor, and as the horrors of the Nazi regime became known, it was clear that the CPS camps would last for the duration of the war, until the last GI was discharged.

By early autumn Bill had been drafted. After requests for occupational deferment were denied, many appeals, and a personal hearing, the Williamsville, NY draft board approved Bill’s CO application, based on their mistaken perception that Bill was still a Catholic, or at any rate that his objections were religious. He was placed in a very rural camp in Royalston, Massachusetts, where cutting logs for the winter and digging water holes were deemed work of national importance. In the eyes of Selective Service, the best part of this was that these men would be in the woods, unable to spread their doctrines and contaminate the public.

Soon Bill was on his way to camp, sporting a checked red woodsman’s cap and flaming plaid woolen shirts, soon buttoned to the neck with collars turned up as temperatures dropped below zero. Some internees intentionally wounded themselves with their axes to escape working in the woods in the bitterest cold of the New England winter. Although physically fit, Bill managed to get assigned to the kitchen rather than the woods. Weekend furloughs allowed him to visit Boston and the local colleges where pacifist ideas were popular. His camp hosted Norman Thomas for a few days.

Doris and Bill had planned to marry after she graduated from U. Michigan, in hopes that the draft board would grant a deferment to married men. Neither his lectureship at the University nor a fellowship awarded to pursue Latin American studies in Chile was sufficient for exemption. Until this time only one liberal member of his “old guard” doctoral committee was aware of his pacifist stand. That fall, the couple set the date for December 26th, a time when he had a brief furlough and she was on break from her teaching job in Dearborn. On Sunday Dec. 7, Doris was writing invitations for the wedding when the news of the Japanese attack on Pearl Harbor broke.

While the United States was supposedly neutral, Roosevelt had been quietly supplying weapons and vessels to support Britain, hoping to bring the country into World War II. Congress voted the next day to declare war on Japan; Germany’s declaration of war on the United States followed within days, drawing America into the European war as well. Montana Representative Jeannette Pickering Rankin, a suffragist, lifelong pacifist, and the first woman elected to a United States federal office, cast the sole opposing vote. It was immediately clear that the war would not end soon, and that Bill’s service would not be over in “a year and a day.”

A wave of anti-Japanese “yellow peril” hysteria ensued. Racist stereotypes overcame radio journalists and popular writers against their better nature. Over 100,000 loyal Americans, two-thirds of them US citizens, with “12.5% or more” Japanese ancestry were “evacuated” from their farms and factories, interned in race tracks and fairgrounds, and then moved to desolate “Relocation Centers” in the wilderness from Arizona to Arkansas, with never a shred of evidence for the claim that they were a security risk. Many did not speak Japanese. Their farms were taken over for at most pennies on the dollar. The prevailing jingoism allowed the Vegetable Growers Association to proclaim, “We’re charged with wanting to get rid of these people for selfish reasons. We might as well be honest. We do; and we don’t want them back when the war ends.” The Nisei land had been expropriated, and it was almost half a century before an act of restitution was signed into law.

COs assigned to teach in the Japanese internment camps witnessed the machine guns and barbed wire surrounding the children at play. While there were no gas chambers, the physical conditions and principle involved were not that different from the Nazi concentration camps or for that matter the slave labor camps of the antebellum south, euphemistically called plantations. The “relocations” took place in parallel with the Nazi regime’s policy of disenfranchisement of anyone with a Jewish grandparent. The Nazi policies echoed a ruling eighty-five years earlier by Chief Justice Roger Taney, persisting throughout the Jim Crow era, that those descended from African slaves were, by the Constitution, “a subordinate and inferior class of beings” and “had no rights or privileges but such as those who held the power and the Government might choose to grant them.”

After Pearl Harbor, Bill’s strait-laced Catholic parents felt even more keenly the stigma of their son being a pacifist, and much worse, dating a Protestant. Bill’s popularity in his home neighborhood in Williamsville, NY, where “Junie” was chief lifeguard at a huge pool in the Italian part of the city, sank to the bottom.

On the wedding invitation list was a Filipina doctoral student. A judgmental aunt said, “Not only is she marrying a CO, but she has a Jap at the wedding” — Estefania Aldaba-Lim, who became the first female Philippines cabinet minister in 1971, and was awarded medals for her work with UNESCO and UNICEF.

The wedding was also fraught with difficulties. Elno’s camera flashbulbs didn’t work. The choirmaster with whom Doris had spent delightful years in high school refused to provide music on account of her marrying a CO (his wife, the organist, filled in). Bill’s mother was too ill with cancer to travel to Michigan; the radium treatments she was receiving were harsh and were not providing relief from the pain. As Bill described the situation in a letter to Doris:

In regard to marriage … O.K. as soon as possible if not sooner, but I would like to see how my family situation turns out. While you know my stand, mother being the unquestioning Catholic that she is, would not consider me married unless she thought that I had been married by a priest. And I want her to think I’ve been married by a priest even though I wouldn’t under any circumstances want to go thru with that hypocrisy.

Now the question is, if we do get married in Flint on my furlough, how to prevent her from knowing of its real nature and yet let her know that I’m being married. If we have any sort of a group of friends there at all, I should invite my brother and Fern, and that would create complications since they would have to agree to enter in the deceit. And if I did tell them I was getting married and never invited them, they would feel somewhat hurt. Rather a dilemma … Only possibility I can see at the moment is to keep it quiet for the present …

Post-wedding photo on arrival in Buffalo, December 1941.

The parents were not told of the wedding in advance, but were expecting the couple the next day as they traveled back to the camp in Royalston. Alas, a friend of Bill’s had sent a card to his Buffalo home addressed to “Mr. and Mrs. William Neumann” mentioning the wedding, which his parents had opened, believing that it was meant for them. There was a very tearful conference of the wives in the kitchen, as Doris explained that she had very much wanted to invite them, but could not because of Bill’s concerns.

After a short honeymoon at Lake Placid, Doris was dropped off in Albany on New Year's Day to take the train. With Michigan students drinking and carousing all night, and after several transfers, she reached Dearborn early in the morning just as school was starting. "When I arrived at school, in the wee, small hours, I don't know if I changed clothes or whatever, but in the first hour of class, I put my head down on the desk and just rested. And the kids were told [chuckles] that I had just come from a wedding; now I was Mrs. Neumann. And the ninth-grade homeroom was very nice and just tiptoed around till later in the day. Because I had not slept."


Families During World War II


Doris had a brief visit in Dearborn from Bill on furlough in February, 1942, and soon a child was on the way. After the school year ended she found a place to live in Athol, Massachusetts near the camp, which she could visit on weekends. She earned her board and keep by caring for children. The COs were given neither a family allowance nor the pay accorded to soldiers. Even the German prisoners of war from the European theater received wages for their work in the US camps. COs who were detailed to work on farms brought in considerable cash earnings, of which they never saw a dollar.

Later in the war when Doris and Bill wrote to the White House of their financial hardships raising a sick child, Eleanor Roosevelt personally replied to Mr. and Mrs. Neumann:

I am truly sorry for the hardships many of you and your families endure, but war brings hardships and heartache to everyone.

Not all of the CPSers were doctors, teachers, artists, skilled tradesmen or church-affiliated farmers. A few were simply draft dodgers. The pacifist, Quakerly directors of the camp were sorely lacking in the administrative skills needed to deal with these callous lads who shirked their duties. As a result, anarchy prevailed in Bill's kitchen, and staying in Royalston became untenable. Late that summer Bill arranged a transfer to a camp in Glendora, near Los Angeles, where they could settle with a southern California community of CO families. While there he performed research in forestry and bravely fought brush fires side by side with juvenile detention prisoners. But when General Hershey, director of the Selective Service, inspected the camp, Bill's name came up as having been cited for sleeping through a roll call at the end of a week-long fire. Made an example for the General, he was exiled to a dam construction camp 18 miles west of La Pine, Oregon, where he ingratiated himself with the community as a camp cook and barber. The Wickiup camp was designated to house the worst of the COs. After letters on his behalf, the Selective Service relented and he was transferred to the less punitive Camp Coleville in Northern California, where he could at least reach his family in Glendora occasionally and do some research in the Reno, Nevada public library.

As the war drew to a close, Bill volunteered for transfer to a newly opened medical unit in Philadelphia, to be inoculated with hepatitis by way of milkshakes containing raw feces from the sick. Swallowing live fecal matter was supposed to replicate sanitary conditions in Italy, where members of the 5th Army Corps and 15th Air Wing were bogged down with illness. The hepatitis virus had not yet been identified and could not be cultured; hence the experiment. He and his friend Benjamin would rush to a bar afterward to drown the taste in beer. They did not contract the disease, so the experiment was a failure in their cases, and this novel preventive treatment was never published in medical journals. Bill was nonetheless able to do doctoral research while in Philadelphia, and finally boarded a long bus ride back to Glendora for a furlough shortly after the German surrender in May.

The separations were difficult — in one letter Bill writes,


I’ve often wondered what we’re going to do about our letters, honey, since obviously they are a bit too intimate even for friendly eyes—but so nice to read when the other side of the bed is very, very empty. Maybe some nice evening, chez nous, fortified by a big jug of wine, we could go through them together and snip out the passages meant just for us—and leave the rest for Chris and his [future] brothers and sisters to read later on.

Doris Neumann and son Christopher, Glendora, CA (1944).

Doris raised her first son by herself in wartime, under the stigma of being a CO’s wife. She supported herself by teaching, hitchhiking daily to a school in Pomona after losing a teaching job in Glendora because the local “Teddy’s Rough Riders Post” of the American Legion objected to Bill’s status. A saner voice from this era is contained in a letter from one of its members, a Mr. Herbert Ganahl, who wrote to the Superintendent of Schools:


I have been active in Legion affairs for a number of years; I am not at all in sympathy with the point of view of conscientious objectors, but I do wish to protest against this prejudice toward Mrs. Neumann, who as a free citizen is entitled to her minority point of view. The right to be one’s self and to think for one’s self is one of the things we are fighting this war for.

I know without saying that you believe in freedom of thought, speech, and conscience, and I think such persecutions against minority viewpoints should be ignored. In doing so you will have the support of every right-minded citizen of the community, as well as a great group within the Legion who are continually attempting to express the real American point of view.


Doris and her young child survived in spite of his need for a special diet, his constant allergies, and the strain of being a wartime pariah in the community. But when her oldest brother Elno moved to Los Angeles for the sake of his wife's health, things were almost normal. She managed to find childcare and hitchhiked 10 miles to her job in Pomona, but the loneliness of years of separation was trying, even with occasional nighttime bike rides to the San Dimas mountain camp. As she put it facetiously,

In the future, darling, please arrange your working hours so as to leave you free to get the evening meal; I’ll work hard and bring home lots of bacon too if dinner is ready when I arrive, complete with kisses for cocktail. All the rest I’ll do … just that I ask, and no more.


Doris’s older sister left her gilded-cage life in Louisiana to join her in southern California, the sister and four young children sharing a small apartment. They took over and painted an old chicken coop to live in before a rental apartment turned up. Doris and Mildred contracted brucellosis from the unpasteurized milk of the town. Mildred returned to Louisiana and struggled with depression. She changed her name to Stephanie (she always hated the name Mildred). Stephanie eventually moved back to California with her children and resumed her career as an artist, but she died suddenly, just as Doris was planning a visit. All her art was lost.


Neumanns in the Postwar Period 1945 –

World War Two ended soon after the explosion of atomic bombs on August 6 and 9 at Hiroshima and Nagasaki, Japan. The chain reactions that gave rise to these bombs were the culmination of research following Marie Curie's discoveries, including the work of many other women, particularly Lise Meitner, who correctly identified and modeled the neutron-induced fission that gives rise to chain reactions. Marie's student Ștefania Mărăcineanu and daughter Irène Joliot-Curie identified artificial radioactivity. Many, like Marie, died of radiation-induced cancers.

Unlike the rapid assimilation of formerly-Nazi German scientists, the end of the war did not end the racial backlash against the Japanese-Americans. As relations with Japan improved, President Truman claimed that the use of atomic weapons saved further loss of Japanese lives. In fact, the intentional bombing of civilian populations in Japan had gone on for months, most particularly the incendiary attack on Tokyo that killed over 100,000, and bombings continued after Hiroshima while negotiations for surrender were underway. Truman’s decision to demonstrate the atomic weapon in this manner was not a military necessity, according to his generals in both wartime theaters. It was enabled by the underlying racism towards Japanese people. The Asian-hate phenomenon persists, as evidenced by the domestic auto industry’s chauvinist response to overseas competition, or the insults thrown in 1973 at Hawaiian Senator Daniel Inouye by a lawyer for the Watergate defendants. “Danny” was one of a group of Bill’s students at George Washington University Law School who later framed the new Hawaiian state’s constitution. Hawaiians overwhelmingly elected him to Congress in 1959. He then served almost fifty years in the Senate. Danny’s GWU gang roasted a pig in an imu pit in our backyard in 1949, likely the first such luau feast in Washington, D.C.

The surrender of the Japanese Emperor Hirohito in August, 1945 did not end conscription or free the conscientious objectors from the camps; the Truman administration prioritized the demobilization of soldiers. Dozens of letters were written in search of jobs when Bill was finally discharged from the CPS in December. By then, most academic openings were already filled by GIs.

A countrywide search for employment again split the family between cities. Bill found work at the end of February, 1946 in the exploding post-war Washington D.C. metropolis, where there was an utter lack of housing. For a month he couch-surfed with friends while Doris stayed with relatives in Michigan and Buffalo. Grabbing the morning paper’s classifieds, Bill would seek out apartments, only to find a long line of applicants had beaten him to the door. Eventually a besieged landlord sensed his pacific nature and picked him out of a boisterous crowd. He rented a small flat above a butcher shop in time for Doris to give birth to a second child. Baby Gregory was weighed weekly on the same scale as the meat. By July they abandoned that hot and smelly flat to share a suburban house with another young couple.

As one of the first scholars to dissect the diplomatic events leading up to the war with Japan in a pamphlet “Genesis of Pearl Harbor”, Bill was called out as a part of a supposed Japanese fifth column. Historian William L. Shirer and Edward R. Murrow of the Columbia Broadcasting System Sunday News Roundup declared that “Genesis … though written and published by American citizens, sounds as though it were written by the clever little men in Tokyo.”

It is unlikely that the CBS journalists had taken the trouble to read the Genesis work, based as it was almost entirely on diplomatic documents from the State Department of the United States. In perpetuation of the “surprise-attack” mantra of Roosevelt, they neglected to note that Ambassador Joseph Grew had warned in January 1941 of an attack on Pearl Harbor in reprisal for US trade sanctions. Although the Japanese codes had been broken and diplomacy had reached an impasse over US demands for hegemony over Indochina, such warnings were not taken seriously enough to alert the Pacific Fleet, or as some have suggested, were purposely withheld.

Many liberal thinkers, artists, writers, even the atomic scientist Oppenheimer, later had to face "red" accusations and devastating character assassinations during Senator Joseph McCarthy's inquisition era. The House Un-American Activities Committee was unaware that its former co-chair Samuel Dickstein was being paid $1,250 a month by the Soviet Union (USSR) as a spy. These same congressmen had produced the "Yellow Report" argument for the internment of Japanese-Americans as potential spies.


Expertise in foreign affairs was sorely lacking in the US, to say the least, following the pre-war years of supposed isolationism. In this situation Bill was able to use his knowledge of Far East affairs to advise the US Senate and Naval War College, and he kept clear of the McCarthyist purges in academia. His doctoral committee eventually relented and granted him his Ph.D. degree. Bill traveled and did research in pre-Castro Cuba and Mexico. He kept a flintlock souvenir from the Spanish-American War of 1898 in his basement.


Pacifism and US isolationism became odd political bedfellows. Bill was hired by, and eventually directed, the "Foundation for Foreign Affairs." This think-tank was funded by the textile magnate William Regnery, a founder of the America First isolationist organization. Although it was staffed by many bright, progressive intellectuals, Mr. Regnery's son Henry eventually turned the Foundation into an ultra-right-wing outfit, and the revisionist historian who paid Bill for research became a Nazi apologist.

In 1950 the Foundation published Bill’s book, Making the Peace, 1941–1945. Following the surrender of Hitler’s Wehrmacht and Tojo’s Imperial Japanese Army, the Allies had divided up the occupied territories of Europe and the Middle East. “Peace” involved a massive occupation of the former combatants’ countries, which continues 80 years later with over 170,000 US troops stationed overseas. Indochina and the Korean peninsula remained contested spheres of influence, while many former colonies of European powers overthrew their shackles.

As the Henry Regnery Company that funded the Foundation became radically conservative, editing its journal was increasingly contentious. Bill took a year’s leave to teach in Hawaii, returning to Washington where new housing was becoming available in the suburbs. He found temporary teaching jobs in Virginia, in Georgetown, College Park, Baltimore, and Towson, Maryland. Commuting by train and hitching rides became tiresome and he purchased a 4-door, maroon, 1949 Plymouth sedan, which allowed the family to vacation on the Outer Banks of North Carolina, renting shacks in the dunes of Kitty Hawk. Bill started a job at Goucher College in late 1952, and the family moved to Baltimore County two years later. The family outgrew the sedan and bought a Plymouth station wagon, the better to haul nursery trees and mulch to landscape the bulldozed clay suburban yard. Bill’s books about Far Eastern diplomatic relations before and after Pearl Harbor gained him tenure and a sabbatical year in Europe.

Prof. William L. Neumann, Public Relations Office, Goucher College, ca. 1960.

Bill (lacking McGlone genes) disliked air travel, so the family traveled on trains and ocean liners, where reading books and newspapers was the thing. Responding to an ad in a left-anarchist magazine, the family sponsored Angéle, a young Spanish Civil War refugee in Carcassonne, France. The refugees formed a commune, pooled their resources, and built houses outside the cramped, tubercular walled city. We visited in the early sixties. With no knowledge of French but empowered with a franc and a few centimes, I was sent to the store to purchase the evening’s liter of the delicious local Languedocian red wine. Typically the regional grapes were harvested by students, whose compensation package included a liter a day. The children each had a glass of “pipi des anges”; the six-year-old’s was half-diluted with water. Many years later, I returned as a scientist whose colorful maps of Mars were featured in a magazine on the news stands. My knowledge of the southern dialect had faded. The only words I recalled were fromage and cassoulet, of which there was plenty.

The sabbaticals built connections with British and German roots. Bill purchased in succession a red-and-black VW Microbus, an orange Buick Opel, and later a hellblau Mercedes-Benz picked up at the factory in Stuttgart, much to the consternation of our Jewish friends, some of whose families had died in the Holocaust.

Bill did not talk much about his own family’s history, whose records were largely destroyed during the two world wars. A letter from a distant relative revealed that the ancestral migration had a lot to do with the Franco-Prussian War of 1870. He was more concerned with maneuvering his way out of the Foundation and writing about the chaotic situation in Europe, where displaced persons confronted economies destroyed by war. The 1945–1950 period could also serve as an example of the difficulties that will arise in the course of climate-driven political collapse. Bill, confined to CPS camps for nearly five years, and Doris, struggling to support the family as a teacher, were far from the most desperate couple on the planet, nor among the millions who did not survive the war. They were young, he hadn’t contracted hepatitis, and she was never arrested by the supposed protectors of democracy at the American Legion.


Another Wartime Family, in Germany

As Bill’s book Making the Peace came out, the Everyman Library monthly book club sent a remarkable work, Thy People My People, by a Canadian-born professor living in Germany. Elizabeth Sims had married Albert Hömberg, a German professor of medieval civilization, in 1938. They had met while she was studying abroad, and were settling in Toronto. Albert and his well-to-do family were staunch anti-fascists. His older brother, a member of the Reichstag, had been sent to a concentration camp after Hitler assumed power. After Hitler’s invasion of Poland in 1939, Britain and therefore its colonies were at war. Albert and Betty were thus considered enemy aliens, and under the threat of internment, they were deported to Germany. In Münster, the bespectacled geographer Albert was soon drafted into a Luftwaffe mapping unit, where he dutifully kept maps of bombs dropped and planes lost. During the war he was able to visit Betty on furloughs from his station in France. She gave birth to two boys and a girl, but Betty was mostly alone in a war in which her own and her husband’s people were fighting on opposite sides. By 1942, the German army was being decimated by the Soviets at Stalingrad, but there was no escape from the military other than to be shot for desertion. It was difficult to live as “the English woman” among the mothers of the village of Roxel, whose husbands and sons were dying at the front, but there were human bonds that led to survival in the face of air raids and meager rations. The women secretly distributed the anti-Nazi sermons of the Bishop of Münster, Clemens August Graf von Galen.

There are few if any other English-language accounts of life on the home front in Germany under Nazi rule, particularly from the perspective of an "enemy" whose husband was eventually captured and spent fifteen months in prison camps. More remarkable yet, she hid their uncensored correspondence from discovery by the SS beneath the doghouse, and later found a publisher. Her harrowing tales of living under British bombers on starvation rations, as a German soldier's wife, are still timely for their depiction of the turmoil within Europe during the war and in the post-war Allied occupation.

Despite General Eisenhower’s German heritage, the Allied occupiers knew little of the language or culture and cared less for their POWs. While Elizabeth worked long hours as translator for the Military Government, Albert remained in captivity, unable to let his family know he was alive, collapsing and nearly dying from months of hunger and privation in prison camps. His multilingual skills as a go-between with the French and then American guards won him a few extra rations. The Hömbergs were the lucky ones. Hordes of displaced persons across Europe terrorized the homefront as they begged or robbed to survive. Many of the prisoners of war did not.

The plight of those millions of refugees foreshadows the climate refugee migrations to come.

“Making the Peace” in a world dominated by the same imperialist powers that had broken it was the joint concern of the Neumanns and the Hömbergs. Their visions of a united and more peaceful world were similar. After their books’ publication, they corresponded, connecting in Münster in 1954 and afterwards. They shared their concerns regarding the fragile peace agreements and the rise of neo-colonialism. Wars to control sources of raw materials and markets continued, but who could foresee the looming environmental catastrophe that would result from the post-war expansion?

Albert died in 1963, and William Neumann died suddenly in 1971, just as his compilation of non-interventionist writings about WW II was going to press. His last article concerned Roosevelt's feckless irresponsibility and the agreements made at Yalta and Potsdam that were supposed to bring about peace but left a world in turmoil. The widows renewed the family bond, and sixty-five years later Albert Hömberg's diaries and letters were published, along with a new German translation of Elizabeth's book.

Hömberg family, 1960. Phillip, Beata, Betty, Albert, Peter.


Part V  The Author as Protagonist


When my father died, I had finished college and was working in Canada. I never closed the loop as the wayward son coming to know his father, but we had shared our love across the generations through music. A stereo we assembled out of vacuum-tube amplifiers played the old and new folk/jazz records on the vinyl 78 and 33 rpm formats. Newspapers littered the foyer and clippings covered the desk in the study. The print-based media, I told him, were giving way to electronic communication that would save his beloved trees. Some fifty years later this has almost become true, albeit to the detriment of responsible journalism. But there are so many other conversations I would have wished to have had.


The Atomic Age

The 1945 atmospheric test at the Trinity site in New Mexico inaugurated the atomic age, under military secrecy. Shock waves recorded in Tucson, AZ allow a modern reconstruction of the exact time. The bomb vaporized its test stand, melted the desert sand, and dumped many kilograms of un-fissioned plutonium and uranium into the air. Photographic plates were fogged in Rochester, NY, and NM residents downwind suffered the immediate medical effects of fallout. It was not until forty years later that a peer-reviewed study of the radioactive plume was published, and the long-term effects of exposure are still being assessed, made uncertain by the great passage of time.

President Truman dropped the next two bombs on Japanese cities. He became cockier in his relations with the USSR after the Potsdam Conference, as he realized the immense strategic power he held over his temporary ally. And Truman had said of the Russian threat, "If it explodes, as I think it will, I'll certainly have a hammer on those boys." He believed that after incinerating the inhabitants of two cities, no nation could challenge his military dominance. It was soon discovered that this was not true.

The Soviet scientists and leaders knew much more about the US program than the President. Their first atmospheric bomb test in 1949 shattered the US monopoly on atomic weaponry. Within three years the "Super" (thermonuclear) bombs had also been tested, and the principles of chain reactions inspired by H. G. Wells and patented by Leo Szilard could be understood by non-scientists. Simply speaking, when a neutron is captured by a nucleus of U-235 or Pu-239, the nucleus fissions into lighter, highly radioactive isotopes, releasing energy and emitting two or three more neutrons, which can drive a rapidly multiplying chain reaction. Controlling such reactions is another matter entirely.
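The multiplication at the heart of a chain reaction can be put in numbers with a toy model. This is only an illustrative sketch, not reactor physics: the multiplication factor k and the generation count used below are assumed round values, not figures from the text.

```python
# Toy model of a fission chain reaction: in each generation, every neutron
# produces on average k new fission neutrons (k is the multiplication factor).
# In a supercritical bomb core a generation lasts on the order of 10 ns,
# so ~80 generations pass in roughly a microsecond.

def neutron_population(k, generations, n0=1.0):
    """Neutron count after the given number of fission generations."""
    return n0 * k ** generations

print(f"k=2.0 (supercritical): {neutron_population(2.0, 80):.2e}")  # explosive growth
print(f"k=1.0 (critical):      {neutron_population(1.0, 80):.2e}")  # steady, as in a reactor
print(f"k=0.9 (subcritical):   {neutron_population(0.9, 80):.2e}")  # chain dies out
```

The three cases show why everything hinges on holding k at exactly one: slightly above, the population grows geometrically; slightly below, it vanishes.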

The bomb "club" continued to grow as the UK, France, and China joined, followed by three more countries, conducting a total of 2,056 tests as of 2023, half by the US and a third by the USSR. The tests left a global imprint on the atmosphere in the form of carbon-14, which peaked in 1964–65 following the 1963 Partial Test Ban Treaty and has declined since. The dilution of radiocarbon (14C) by fossil fuel extraction since 1850 is a global indicator of human activity. The dramatic increase of 14C in the 1950s from atmospheric testing is a "golden spike" of the Anthropocene event.

The US nuclear submarine Nautilus was launched in 1954. By the early 1970s the missiles, bombers and warheads of each superpower numbered in the thousands, more than sufficient to extinguish nearly all of civilization. Nuclear warfare was sold to Congress by General Motors CEO Charles Erwin Wilson (Eisenhower’s Defense Secretary) as giving a “bigger bang for the buck.”

The horrors of nuclear war were papered over by promises of peaceful uses of this new source of energy. Nine years after the first controlled, self-sustaining nuclear reaction occurred in Chicago under wartime secrecy, heat from EBR-I (Experimental Breeder Reactor-I), a fast-neutron reactor at the National Reactor Testing Station in Idaho, powered a small electric generator, illuminating four light bulbs. In 1954, three years before the Sputnik era, a Soviet nuclear reactor was connected to the power grid. The cost of electricity generated by nuclear power was touted by Lewis Strauss, head of the Atomic Energy Commission (AEC) as soon becoming “too cheap to meter.”

Argonne National Labs Experimental Breeder Reactor-I, first electric power-producing nuclear reactor, Idaho, December 20, 1951.

A nuclear reactor is a very slow version of an atomic bomb. Chain reactions are maintained by prompt neutrons generated by splitting atoms, and by delayed neutron emissions from the lighter fission-product atoms. Control rods absorb some of those neutrons to keep the reactor core from blowing up. The fuel core must be kept in a delicate balance with the growing concentration of fission end-products that also absorb neutrons. Loss of control can occur almost instantly through design and human error, as happened at Chernobyl in 1986; through technical failures, as at Three Mile Island in 1979; or more slowly from inadequate safeguards against natural disasters, as at Fukushima in 2011. Even an overfilled flask of reprocessed fuel can explode in an instant, sometimes fatally, as happened in 1964 in Wood River Junction, RI, and in 1972 at Nuclear Lake, NY. Small reactors are needed for chemical analyses, but power-reactor disasters have left large areas contaminated for centuries.
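The delayed neutrons deserve a number, since they are the only reason mechanical control rods can keep up with a chain reaction at all. A minimal sketch, using textbook-typical values for U-235 (my assumptions, not figures from the text):

```python
import math

# Prompt neutrons appear ~1e-4 s after a fission in a thermal reactor;
# delayed neutrons from fission products take ~13 s on average, and make
# up a fraction beta ~ 0.0065 of the total for U-235.
PROMPT_LIFETIME_S = 1e-4
DELAYED_FRACTION = 0.0065
DELAYED_LIFETIME_S = 13.0

def mean_generation_time():
    """Effective generation time, averaging prompt and delayed neutrons."""
    return (1 - DELAYED_FRACTION) * PROMPT_LIFETIME_S + DELAYED_FRACTION * DELAYED_LIFETIME_S

def doubling_time_s(k_excess):
    """Approximate power-doubling time for a small excess multiplication k - 1."""
    return mean_generation_time() / k_excess * math.log(2)

# With prompt neutrons alone the power would double in a fraction of a
# millisecond; delayed neutrons stretch that to roughly a minute, slow
# enough for control rods to respond.
print(f"effective generation time: {mean_generation_time():.3f} s")
print(f"doubling time at k = 1.001: {doubling_time_s(0.001):.0f} s")
```

The tiny delayed fraction dominates the average because its lifetime is five orders of magnitude longer, which is what makes a reactor controllable and a bomb not.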

Accidents and fatalities have occurred at innumerable mining, refining, and processing facilities; the film "Silkwood" (1983) dramatized one such case. Countless other "incidents" have occurred in military weapons facilities, the extent and severity of which continue to be uncovered. The indirect cost of permanently storing the millions of tons of waste produced by reactors is beyond analysis.

Nuclear Lake, NY along the Appalachian Trail. An explosion in 1972 injured a technician and scattered plutonium dust, contaminating the 1100 acre facility.

The initial atomic weapons required a massive wartime effort to separate weapons-grade uranium from its non-fissionable component. The effort was first implemented in the mountains of Tennessee, where hydropower dams provided the electricity needed to run electromagnetic separators and diffusion pumps, and relatively unsophisticated and unsuspecting labor was available. The Smoky Mountain foothills region was also an idyllic place for children in summer camps to escape the city heat, raise corn, chickens and hogs, ride horses, pick wild berries, swim in clean rivers and experience natural wonders such as Sliding Rock and the Pisgah National Forest. Weekly camping trips could reach these destinations via buses, and a highlight of our summer was a visit to Oak Ridge National Laboratory.

In 1958, the Director of Oak Ridge, Alvin Weinberg, a veteran of the Manhattan Project, led us campers (his son included) to the visitor center. Kids could place a silver dime in a box to be irradiated. Neutron activation creates the radioactive isotopes 110Ag and 108Ag, with half-lives of 24.6 seconds and 2.42 minutes, respectively. The dime was waved in front of a Geiger counter, which buzzed for a few minutes. The isotopes decay to stable cadmium atoms by emitting electrons that are easily blocked, as well as a few more penetrating gamma rays. Afterward, the dime went back into its owner's pocket! Coins in circulation no longer contain silver, radiation awareness is greater, and that exhibit is no longer in the Children's Museum.
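Using the half-lives quoted above, one can sketch how quickly the dime fell silent; the time points chosen are illustrative.

```python
# Fraction of activity remaining for the neutron-activated silver isotopes,
# using the half-lives given in the text: Ag-110 at 24.6 s, Ag-108 at 2.42 min.
HALF_LIFE_S = {"Ag-110": 24.6, "Ag-108": 2.42 * 60}

def fraction_remaining(isotope, t_seconds):
    """Exponential radioactive decay: N/N0 = 2^(-t / half-life)."""
    return 2.0 ** (-t_seconds / HALF_LIFE_S[isotope])

for t in (30, 60, 300, 600):
    print(f"t = {t:3d} s   Ag-110: {fraction_remaining('Ag-110', t):.2e}"
          f"   Ag-108: {fraction_remaining('Ag-108', t):.2e}")
# After ten minutes the short-lived Ag-110 is essentially gone and only a
# few percent of the Ag-108 remains, so the Geiger counter's buzz fades fast.
```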

In 1963, at the height of the national nuclear exuberance, the local Baltimore Gas and Electric company sent me as a high school student to the National Youth Conference on the Atom in Chicago. The conference promoted “the promise of the peaceful atom,” by which it meant the ever-expanding growth of revenue for “investor-owned utilities.” We visited the Argonne National Laboratory and the Dresden Generating Station, with its impressive control room.

Dresden 1 Nuclear plant control room, Illinois, 1963. The author is on the far right.

The Dresden 1 reactor operated for 18 years, and was succeeded by two more units, whose fueling is now subsidized by the state of Illinois in the name of "climate action." At the conference, the utilities projected a growing demand for electric power, which they argued could only be met at scale by nuclear reactors, using an "almost infinite" fuel supply extracted from natural uranium and thorium via a breeding cycle. The envisioned growth rate of power demand was 6% per year, a number picked out of the most optimistic regions of thin air. The heat energy required to generate that amount of power is roughly three times as much. Using a slide rule and logarithms, I calculated that by the year 2100, enough electricity would have been generated in the US alone to melt the entire Greenland ice sheet. That amount of water would raise sea level globally by 7.4 m (24 feet). Of course, power generation itself is unlikely to reach that level, but the waste heat already matters: during heat waves in France, when river temperatures are unusually high and electricity demand peaks, the excess heating of rivers by nuclear plants prompts shutdowns.

Dr. Weinberg had proposed breeder fuel cycles using high-temperature molten salts or liquid sodium alloys as coolants in low-pressure containment vessels, even demonstrating a reactor that could power airplanes. Such breeder reactors were said to achieve nearly double the thermal efficiency, with greater safety and less nuclear waste. However, the materials and processes needed to make such high temperatures and outputs practical and less polluting are still not available. Industry by this time had invested heavily in the light-water-moderated, pressurized reactors used in submarines, where abundant cooling is available. Congress was thus convinced that industry's choice was the only path forward. The growth of demand for electricity had slowed, and the US molten-salt research was abandoned in 1976, much to Alvin's dismay. He had warned management about the dangerous design of the commercial reactors. Congressman Chet Holifield said, "Alvin, if you are concerned about the safety of reactors, then I think it may be time for you to leave nuclear energy." Weinberg was fired from Oak Ridge soon thereafter. Six years later, the Three Mile Island power plant meltdown occurred.

The Great Acceleration

Bombs and nuclear reactors have not yet abruptly shortened my life span by the rapid conversion of mass into energy, but another troublesome explosion occurred in the postwar era: the growth of the automotive industry. My father's car carried him to part-time jobs in Charlottesville, VA and Baltimore, MD. The family could drive to the Outer Banks in half a day. Each tank of gas and quart of oil was recorded in the Ward's Motor Record Book. President Eisenhower's Interstate Highway System launched highway travel measured in trillions of passenger miles. Gasoline production skyrocketed to meet demand, supplied first from remote areas of the US and later by relatively cheap oil from Asia, Africa, Latin America and the Middle East.

The price of this convenience was death. After a century of colonization and forced displacement from their lands, leaving indigenous peoples in Oklahoma only some mineral rights, oil was discovered on Osage tribal land. The royalties were so lucrative that a generation of tribal members was bilked of headright monies by congressionally appointed legal guardians, and terrorized by a spate of murders that may have numbered in the hundreds. 

Only a few were paying attention to greenhouse effects, but there were other deadly consequences of cheap gasoline. Oil refineries polluted the air with cancer-causing fumes even when they did not catch on fire. The US motor vehicle annual fatalities also exploded, peaking at 56,278 in 1972. For years the leading cause of death among those aged 15 to 24, this toll slowly yielded to safety engineering, notably the implementation of seat belts and passive restraints, but has taken another upturn of late.

The International Geosphere-Biosphere Programme (now Future Earth) published a dashboard of 24 indicators of human enterprise and its impacts on the Earth system. The synchronous acceleration of socio-economic and Earth system trends from the 1950s to the present day – over a single human lifetime with little sign of abatement – is known as the Great Acceleration. Economic activity of production, trade, and consumption grew rapidly as the world recovered from war. The post-war expansion of energy extraction and consumption popularized four-letter words like “smog,” and left indelible asphalt marks on the planet.

As did the Gilded Age, the Great Acceleration stretched railroad safety to its limit. The wreck in East Palestine, Ohio in 2023 showed the disastrous effects of exceeding those limits. A wheel bearing failure combined with an overloaded train derailed 38 cars, spewing toxins, sickening many people and forcing a massive evacuation. The safety issues with Norfolk Southern Railway that caused the derailment are typical of the industry as a whole. The train was 3 km long, with only two qualified humans on board to manage its three locomotives and 149 railcars, 20 of which carried hazardous vinyl chloride and butyl acrylate. Detectors had indicated overheating wheel bearings for an hour before the wreck, as the undercarriage showered glowing sparks. The crew was not warned in time to stop the train, because the temperature hadn’t crossed thresholds set by the railroad’s infamous “Precision Scheduled Railroading” plan. Several months earlier, another Ohio train derailed four miles after a mechanic had determined that an engine’s wheel was overheating and should be removed from the train, only to be overruled by a supervisor.

Toxic chemicals still roll through East Palestine. Automatic brakes had been mandated for hazardous cargo, but a cost-benefit analysis led the Trump administration to rescind the rule. Condolences are cheaper than safe operation.

The globalization of trade has increased the hazards of ocean-going bulk carriers. Tankers and container ships are tremendously efficient, but operating them remains among the most hazardous of human activities. On average, 1,566 containers are lost at sea each year. The ship that killed six workers and destroyed the Francis Scott Key Bridge in Baltimore in 2024 had a series of electrical failures just prior to sailing, but the Coast Guard was told it was only routine maintenance. Electrical failures recurred within half an hour of being released into the channel by tugboats, causing complete loss of power and steering. Within minutes the Key Bridge was destroyed. From 1960 to 2015, there were 35 major bridge collapses worldwide due to ship or barge contact, with a total of 342 people killed, according to the World Association for Waterborne Transport Infrastructure (PIANC). Hundreds of deaths occur during maritime operations every year; the true number of fatalities, including those lost at sea, is likely far higher than any of the reported statistics.

Middle-East oil put a curse on the Palestinian people. Under pressure from the European refugee crisis and with the financial might of the US controlling the UN, their land was partitioned in the year I was born. This did not happen peacefully. The majority of the Palestinian Arab inhabitants were expelled from their homes, and more than 5 million, including descendants, are registered as refugees. Skilled Palestinian labor operates the oil fields of the Arab states, where their stateless existence continues.

It was not pity for those fleeing fascist antisemitism, but greed for the riches of the Levant that motivated the colonization of the Jerusalem lands. The US turned away the MS St. Louis refugee boat at the start of WW II, sending its passengers back to Europe, where many died in concentration camps. Of the millions fleeing the Holocaust, only 982 Jewish refugees without visas were taken in by the US during the war. The Secretary of State who blocked the issuing of visas to Jews was later awarded the Nobel Peace Prize for creation of the United Nations, which soon carved up Palestine.

Oil has continued to curse Jews and Arabs alike. The political turmoil attendant on pumping and shipping cheap oil from the Middle East fueled the 1956 Suez and Sinai wars. As a nine-year-old I read in the daily papers that British and French warplanes were bombing Egypt’s air fields and cities to assert control over the Suez Canal, killing many civilians, while Israel massacred the Palestinians of Gaza. The curse on the desert-dwelling people continues to this day as oil-empowered clerics and dictators foment terror on innocent people. The Israeli extremist government continues a modern-day genocide with US-made armaments and bombs, while holding its people to a militarized, apartheid existence.

In the 1960s we demonstrated against the nuclear brinkmanship engendered by the Cuban missile crisis, against the deployment of nuclear weapons and submarines overseas, and against the preparation of chemical, biological, and radiological weapons at Fort Detrick, MD. The CND’s peace symbol adorned our jackets. We banned pesticides, but did not see the carbon poison.

One of my high school classmates wrote in my yearbook, “Don’t go communist, ducky!” Of course I took that as a challenge, since I had been exposed to many points of view by my historian parents. They had subscriptions to Soviet Life and Peking Review. But I only had an inkling of the scale of human activity and its social implications. Gasoline was cheap, four bucks would fill a tank. Road trips were the thing. I learned to drive on a Buick Opel. My Pontiac station wagon went to San Francisco on weekends from college in Portland, Oregon, and then across country, where it died upon return. A Ford bread truck took my belongings back East, through Chicago, its bearings grinding all the way to Nova Scotia, where it also died.

I had a fascination with those internal combustion engines: a two-stroke lawnmower, a classmate’s Lambretta motor scooter, eventually the oil-spitting, twin-cylinder British machines with unreliable electrics, and then a Honda motorcycle that was ill-suited to the icy, rutted streets of the New England and Canadian winters but would emerge from burial in snow storms and start without hesitation at the push of a button. Carburetion and corrosion eventually led to the motorcycle’s demise, but only after several chilly trips along the East Coast and an epic ride to Chicago and back. Flat tires, busted chains and rider fatigue meant frequent stops in rural towns. The realities of internal combustion and small motorcycle gas tanks came to a head during the oil crisis of 1973. While politicians blamed the Organization of the Petroleum Exporting Countries, or OPEC, there was no real shortage of oil in the US, since the bulk of the imports were from Canada. But prices jumped, gas lines grew long, and stations closed at night. I drained enough gas from the pump hoses to get to the next open station.

Corporate Average Fuel Economy standards eventually led to a slight decline in per-capita gasoline consumption in industrialized countries like the US and Germany, and the supposed oil crisis ended. Gasoline is now cheaper than milk, and monster vehicles dominate the highways.

Asia and Africa, newly released from traditional colonialism, became huge markets as well as endless suppliers of Earth resources. Hydraulic fracturing of previously unproductive shale beds eventually extracted more oil within the US than from the Middle East, and the price of gasoline is now so heavily subsidized that monster trucks have become the family mall cruiser. Far from being held hostage to OPEC, US oil is exported. But the biggest export was the addiction to concentrated energy in the form of oil. As a car dealer lamented in 1969, after lethal smog events and the Santa Barbara oil spill, “The car is now looked upon like some kind of dangerous drug.”


The Cold War


P. M. S. Blackett, the British physicist and Nobel prize winner, writing in 1948, claimed that the nuclear bombing of Japan “was not so much the last act of the Second World War as it was the first major operation of the Cold War against the Soviet Union.” “This view … has two phases: one, that the use of the bomb was intended to end the Pacific War before the Russians moved into China and, by a declaration of war, acquired a claim to sharing the occupation of Japan. The second phase of the argument is that the bomb was also intended as a demonstration of American power that would make the Russians more amenable to accepting American policies in Eastern Europe and the Balkans.”

Following the 1945 Yalta and Potsdam conferences, the United States, the United Kingdom, and the Soviet Union launched a postwar reorganization of Germany and Europe that soon hardened into systems of military alliances and contentions over spheres of influence. Nuclear weapons were to preserve these superpowers’ domination of the rest of the world. Throughout the Cold War, as George Orwell foresaw in October 1945, the United States and the Soviet Union would combine to produce enough atomic, chemical and biological weapons to kill everyone on Earth. Roosevelt had approved preparation of such weapons as early as 1942. Many decades later the US closed down its chemical weapons program, after an accidental release killed six thousand Utah sheep. The Cold War arms buildups and contention for dominance went along with worldwide economic expansion.

The Cold War terrorized schoolchildren. We were made to huddle under desks in classroom drills, to somehow protect us from the blast and fireball effects of a nuclear attack. Home fallout shelters were promoted at state fairs. Buildings were labeled with black and yellow “Fallout Shelter” markings. In 1962, contracts were negotiated for the production of 400,000 aluminum outdoor signs and one million steel indoor signs. The Office of Civil Defense proposed using the radiation warning symbol (yellow background with a magenta circle in the center of three magenta blades) to mark fallout shelters. The color scheme was rejected because a fallout shelter was supposed to represent safety, whereas the radiation warning symbol represents a hazard, so they changed the foreground to black. The color change was unconvincing.


The national fallout shelter sign released by the Defense Department on December 1, 1961, with arrows to indicate the direction to the shelter.

U.S. involvement in Middle Eastern and Asian politics escalated. It supported the corrupt Kuomintang-led government in China as it retreated to Taiwan Island. It rejected popular votes and manipulated the United Nations into participating in the bloody 1950–53 Korean War. The “advisors” first sent to aid French colonialism in Southeast Asia eventually grew, by the 1960s, into a huge military occupying force supporting a corrupt, anti-popular regime. The atrocities committed during this undeclared war culminated in the shameful Christmas bombing of Hanoi in 1972.

I had been naively interested in pursuing science as a pure career, so I was flummoxed in 1964 when the college interviewer asked what role I would play with regard to world affairs. But as the Vietnam war escalated, scientists increasingly began to reject the ties of research to the war effort. A “Research Stoppage” protest against war research in March, 1969 launched a number of loosely organized, chiefly academic groups like “Science for the People” that mirrored uprisings by aerospace and technical workers in Italy and France, particularly the May, 1968 events. The military conscription in the US, instituted during the Korean War, was revived for the Indochina war and spawned numerous protests against the draft.

The enormous cost and dangers of the atomic arms race created growing domestic alarm. To counter this, President Eisenhower launched the “Atoms for Peace” program in 1953 with great fanfare about peaceful uses of atomic power. Popular magazines touted gadgets that would “Find a fortune in uranium.” In 1959 Eisenhower’s administration launched the NS Savannah, a merchant ship powered by a nuclear reactor. It carried passengers for three years, but it cost too much to run and was never profitable. The Savannah is today docked in the Baltimore harbor. Its reactor was removed and stored at the Clive facility in Utah. Decontamination of its propulsion room is expected to be completed by 2030.

The US propaganda campaign “Operation Candor” sought to normalize the detonation of nuclear bombs on American soil, as the head of the AEC Lewis Strauss put it, “thereby create a climate of world opinion that is more favorable to weapons development and tests.” The Soviet counterpart was called “Nuclear Explosions for the National Economy,” promising to “blow up mountains, change the course of rivers, irrigate deserts, and chart new paths of life in regions untrodden by human foot.” These so-called “Plowshares” programs conducted 35 US and 156 Soviet tests, ostensibly to dig canals and mobilize trapped oil and gas deposits. These programs left the atmosphere radioactive and were not peaceful by any stretch of the imagination. Their legacy was the continuation of the arms race and proliferation of enriched uranium material abroad.

Disarmament advocates argued that the atmospheric effects of thousands of warheads and bombs in a large nuclear exchange could kill most living things on Earth through mass starvation, in addition to the prompt effects on targeted populations. Warnings of a nuclear winter followed by years without a summer were published in 1983, at the height of the Cold War. Although motivated by the effects of Martian dust storms and the historical Laki (1783), Tambora (1815), and Krakatoa (1883) volcanic eruptions, the study by Turco et al. was widely criticized. The climatic and ozone-layer impacts of large quantities of firestorm aerosols released into the stratosphere continue to be debated, especially after the 1991 Pinatubo eruption. Decades of study of nuclear conflagrations have led to the realization that clouds, whether natural or human-induced, are critical to understanding Earth’s energy balance.

The mentality of deterrence by “mutually assured destruction” pretended that we could somehow avoid the horrors experienced in Hiroshima and Nagasaki. Civil Defense propaganda scenarios were based on a regional war, whose effects were said to last two weeks. Meanwhile, atmospheric testing of bombs a thousand times more powerful than those used against Japan continued until November 1962. Tests near Bikini Atoll exposed many fishermen to fallout; the crew of the Lucky Dragon was severely burned and some died. Soon public testing in the South Pacific was abandoned. Hundreds of explosions were then conducted in the Southwestern US, closer to the Los Alamos laboratory. Civilians were used as guinea pigs, more realistic subjects than mice, in the belief that this knowledge would somehow win the Cold War. In Nevada, the only concern expressed was that the population not be alarmed by the blasts. Dosimeters recorded many roentgens of fallout radiation in neighboring states from a single test, which fogged photographic film in places as far away as Troy, New York. A few horses and sheep died from the immediate effects; people in the immediate areas were told to stay indoors for a few hours and given free car washes. The AEC denied any responsibility for the deaths of animals out in the open. A generation of citizens of the Southwestern US suffered burns and cancers as a result of the 928 nuclear devices exploded there between 1951 and 1992. The Radiation Exposure Compensation Act of 1990 eventually approved claims of over 40,000 “downwinders” in limited regions, none in New Mexico, paying $50,000 to those who developed certain cancers in specified counties. Many more claims were denied or never filed due to death; New Mexico victims are still seeking compensation.

Many of the areas used for above-ground explosions are still too contaminated to inhabit.

The Cold War ended, but its enduring legacy was the skimpy two-piece women’s swimsuit named for the Bikini Atoll, where the first seven post-war tests were conducted. Introduced four days after the blast and soon worn by glamorous celebrities, the suit was described by Vogue magazine as the “atomic bomb of fashion.”

The Space Age

The United States emerged from two world wars virtually unscathed and in possession of enormous wealth. The Cold War had moderated sufficiently that a group of scientists could propose an International Geophysical Year (IGY) to conduct a systematic study of the Earth and its planetary environment. As a part of this non-military program, Eisenhower promised to put a satellite in Earth orbit, which inevitably would overfly Soviet airspace. Thus it was a shock when the USSR launched the Sputnik-1 satellite in 1957, marking the commencement of the Space Age. Sputnik was the culmination of the research conducted by the Russian scientist Konstantin Tsiolkovsky, the American Robert H. Goddard, and the Nazi V-2 missile scientists who were extracted from Germany at the end of the war. Sputnik-2 soon followed, carrying an unfortunate husky mutt, “Laika.” Meanwhile the US was planning to launch satellites using the V-2 rockets captured along with German scientists and engineers. After spectacular failures and rivalries between the Navy’s Vanguard project and the Army’s Redstone missiles, the US succeeded in orbiting Explorer-1. This bootleg project carried particle detectors that discovered (and were overwhelmed by) the high-altitude Van Allen radiation belts. These satellites were stunning achievements of the 1957–58 IGY. As a cover for their plans to conduct spying from space, both Soviet-aligned and Western governments supported the IGY, renewing the scientific exchange of data that had been broken by the Cold War.

For nerdy grade school students, physics was now exciting. Pocket protectors were fashionable. Our high school started teaching Russian in 1956. The networks carried such innovative programs as Continental Classroom, where early-risers could hear lectures by Nobel-prize-winning scientists and see experiments performed in real time that would explain gyroscopes or demonstrate the attraction of gravity between spheres suspended from glass fibers. Rocketry was many a kid’s dream, the logical extension of 4th of July fireworks. The renewed importance of science education prompted amateurs to sharpen their astronomical observations.

The next summer I saw the aurora borealis over the mid-Atlantic sky, and the CO2-encrusted, white polar caps of Mars through a telescope. I did not imagine that I would measure their thickness with a laser altimeter forty years later.

On my way to college in 1964 at a concert in the Hollywood Bowl, I was stunned by the sight of a bright satellite flying overhead. Likely it was the Echo balloon passive communication satellite, a precursor to Telstar, used to reflect powerful transmitter beams over the curvature of the Earth. The space age had become a tangible reality. Soon, cross-continental and cross-ocean television broadcasts were routine.

The excitement of the space age soon wore off. Kennedy’s commitment to a space program that would land men on the Moon developed side by side with the escalation of the most brutal war since the partition of Korea, the Southeast Asian war in Vietnam, Cambodia, and Laos. The installation by the US military of a puppet regime was countered by Ho Chi Minh’s nationalist upsurge, now known as the Vietnamese People’s War of Liberation. When Apollo astronaut Neil Armstrong stepped onto the Moon on July 20, 1969, I could only think of how many military and civilian lives had been taken by these wars with no end in sight. On that day I had bicycled from Chicago to Madison, Wisconsin to visit friends at The Progressive, arriving in darkness guided by the waxing Moon. We watched the landing on TV. Over the next several years, the Vietnamese harbors were mined, bridges and railroads destroyed, crops and forests defoliated, and millions of tons of bombs dropped, far exceeding the destructive force of the original atomic bombs. As the Apollo 17 mission ended, a ruthless Christmas bombing campaign commenced against the city of Hanoi. In this context it was difficult to rejoice in the great scientific successes of lunar exploration after winning the Cold War competition with the Soviets. Federal budgets were overwhelmed by the cost of the war. The remainder of the Apollo program flights were cancelled, the Saturn V rockets with them, but I still wear my Apollo 40th Anniversary T-shirt.

On a clear night after sunset, I can now see many of the tens of thousands of operational satellites passing overhead, some as bright as airplanes, but distinguished by their majestic paths and their unblinking reflection of sunlight high above the Earth. The 1958 Vanguard-1 booster is still visible. Lately, trains of evenly-spaced dots of light cross the sky, satellites dispensed like Pez candies from the rockets of SpaceX. More will come from their competitors, all to bring video streaming and online shopping for the latest fashions to your armchair.

The most profound climatological investigation of the space age, however, was being conducted by a young scientist on a beach in California.


The Science of Global Heating

Oceanographer Roger Revelle warned Congress in 1957 that “The Earth itself is a space ship” and that global heating by greenhouse gases would endanger it with rising seas and desertification. His warning was hardly noticed. With nuclear physicist Hans Suess, he showed that substantial fossil fuel carbon was not absorbed by the ocean and remained in the atmosphere, exploiting the fact that carbon-14 is absent from the ancient coal and oil deposits being burned. Revelle recruited Caltech postdoc Charles David Keeling, who at the time was measuring the atmospheric content of CO2 while camped out in California’s Big Sur (to avoid contamination by local sources). Keeling had been sponsored in 1954 by an oil industry group to monitor pollution, a problem whose effects on climate concerned them even then. With IGY funding, he established an observatory on Mauna Loa in Hawaii in the well-mixed troposphere 3 km above sea level. On this remote mountain the impact of seasonal growth cycles on global CO2 content could be measured on a daily basis: concentrations drop from May to September and rise the rest of the year.

The buildup of greenhouse gases provides an unambiguous planetary yardstick. Keeling’s record of atmospheric CO2 at the top of Mauna Loa in Hawaii started in 1958, and after the IGY he eked out funding from many sources. His son Ralph currently directs this worldwide program. The Scripps Institution of Oceanography scientists now use those measurements and ice core gases to confidently extrapolate CO2 concentration and oxygen-isotope-derived temperature data back into prehistory. Temperature in Greenland doesn’t always correspond with global measurements, but it is representative of the range of variation that marks the Current Era. The record now shows that the post-war temperature spike, made famous by the Mann, Bradley and Hughes (1998) ‘hockey stick’ graph, is directly related to the human use of fossil fuels with its concomitant increase in CO2 concentration.
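The two components Keeling separated, a steady long-term rise and a seasonal breathing cycle, can be sketched with a toy model in a few lines of code. The coefficients below are illustrative values fitted by eye to the milestone concentrations quoted in this chapter, not the actual Scripps record:

```python
import math

def co2_trend(year):
    """Toy long-term CO2 trend in ppm: about 315 ppm in 1958, accelerating.
    Coefficients are illustrative, not fitted to the Scripps data."""
    t = year - 1958
    return 315.0 + 0.85 * t + 0.011 * t * t

def co2_toy(year_fraction):
    """Trend plus a ~3 ppm seasonal cycle: CO2 falls from May to September
    as Northern Hemisphere plants grow, then rises the rest of the year."""
    seasonal = 3.0 * math.cos(2 * math.pi * ((year_fraction % 1.0) - 0.37))
    return co2_trend(year_fraction) + seasonal

for year in (1958, 1969, 1976, 1982):
    print(year, round(co2_trend(year)), "ppm")
```

The quadratic term is what gives the curve its upward bend: the rise itself accelerates as fossil fuel consumption grows.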

The first time in human existence that atmospheric CO2 exceeded 300 ppm was about the time the Titanic sank in the North Atlantic Ocean, in 1912.

As tragic as the loss of life from sinking steamships was, thousands of miners died each year in the US alone digging coal prior to World War II. A polished black stone memorial to the lost lives of Welshmen, Poles and Czechs was erected in 2006 in Vintondale, PA. Here, a century later, there are efforts to clean up the toxic orange, acidic syrup of mine tailings that flowed into the clear, clean water of Blacklick Creek. There are no memorials to the millions who died and continue to die from the toxic fumes of burning coal.

At Mauna Loa in 1958 the seasonally adjusted CO2 concentration was 315 ppm.

In 1959, the atomic physicist Edward Teller informed the American Petroleum Institute chair Robert Dunlop that the exponentially rising use of conventional fuels would lead to sea level rise. 

In 1969, the CO2 concentration at Mauna Loa Observatory was 325 ppm.

In 1971 the Nixon Administration Office of Science and Technology issued an initiative entitled “Determine the Climate Change Caused by Man and Nature.” It was never implemented. The OST administrator advised 40 years later that “There’s no compelling scientific argument for drastic action to ‘decarbonize’ the world’s economy.”

The nuclear scientist Alvin Weinberg, before joining the Manhattan Project, was modeling the infrared absorption spectrum of molecules such as CO2. He later chaired a prestigious Study Group on the Global Effects of Carbon Dioxide. While heading the Institute for Energy Analysis, Weinberg wrote in 1974 that a 5% per year global increase of energy production, if unchecked, would lead to melting of polar ice caps and disturbance of the atmosphere in the next century. In seeking funding for advanced reactor research, he warned Congress in 1976 that “…atmospheric concentration of 375-390 ppm may well be a threshold range at which climate change from CO2 effects will be separable from natural climate fluctuations … The consequences of an increase of this magnitude in atmospheric CO2 make it prudent to proceed cautiously in the large-scale use of fossil fuels.”
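Weinberg’s warning rested on simple compound-growth arithmetic: a fixed percentage increase doubles on a predictable schedule. A minimal sketch, using only the 5% per year rate quoted above:

```python
import math

# At growth rate r per year, the doubling time is ln(2) / ln(1 + r).
growth_rate = 0.05  # Weinberg's assumed 5% per year rise in energy production
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"Doubling time: {doubling_time:.1f} years")  # about 14 years

# Unchecked over a century, production multiplies more than a hundredfold.
century_factor = (1 + growth_rate) ** 100
print(f"Factor after 100 years: {century_factor:.0f}")
```

Seven doublings in a century, roughly a 130-fold increase, is why he placed the disturbance “in the next century” rather than comfortably far away.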

In 1976, the CO2 concentration at Mauna Loa was 332 ppm.

My Nuclear Career

In 1978, with Industrial Engineering training, I took a job with a design and manufacturing company that specialized in seismically-engineered power piping support systems, or “hangers.” The meter-diameter steel pipes carrying superheated steam under high pressure must not bind, buckle, or break under loads or during earthquakes. In the office, finite-element analysis and computer-aided design were emerging along with newer computers, while the steel fabrication was done across the street in the former Rhode Island ITT Grinnell Foundry. The head of quality assurance was the owner’s son-in-law. The Hanford, Washington and Seabrook, New Hampshire nuclear plants were large clients, as were six nuclear facilities in South Korea.

Nuclear engineering turned out to be a very brief career. The overly optimistic projected power needs of the Pacific Northwest did not materialize as grandly as they were pitched by the Washington Public Power Supply System (WPPSS), fondly pronounced “whoops.” The farmers, ranchers and small investors were unaware of the shaky financial underpinnings of the nuclear power industry, having grown accustomed to the relatively cheap and abundant hydropower of the Columbia River. The construction schedule slipped and projected costs grew far beyond the original budget. Meanwhile, the regional power needs declined due to a recession and to wisely enacted conservation measures – the “Kill-a-Watt” program. There was a record-setting municipal bond default of $2.25 billion. By this time the major investors had sold their bonds to the “widows and orphans.” Retirement funds had to absorb the losses, receiving 10 to 40 cents on the dollar.

The WPPSS Plant 2 in Hanford was eventually completed in 1984 but the partially built Plants 1, 3, 4, and 5, with their massive concrete containment structures, were mothballed and eventually demolished. It became clear in 1982 that the spring-loaded pipe hangers and supporting girders to safely carry the superheated, pressurized steam would no longer be needed for the bankrupt reactor project.

Two Seabrook, NH reactors were planned, but the first unit didn’t begin full operation until 1990, 14 years after the construction permit was granted. In 1984 the owners canceled construction of the second reactor unit at 25% completion, after spending $800 million. The difficulties led to the bankruptcy of Seabrook’s utility owner. The pipe support company business shrank and layoffs ensued. The fabrication blueprints, carefully engineered to maintain safe operation under all conditions and stored on computer punch cards, represented untold hours of labor. The cards were pulled out of their drawers en masse and emptied into two dumpsters, to be recycled as paper.

Geothermal and Renewable Energy 

When the pipe support engineering job ended in 1982, I joined a university geothermal exploration team. The Earth’s crust has abundant internal heat sources, largely derived from radioactive decay of primordial elements, but using them to generate electricity is capital-intensive and is difficult to implement globally. A large-scale geothermal power facility generally operates at relatively low temperature, but with infrastructure planning it can also provide district heating, extremely important in colder regions. The higher temperatures of volcanic magma bodies hold promise for even more electrical output. The boreholes at power plants can emit a modest amount of greenhouse gases, and sometimes environmentally hazardous waste fluids that must be pumped back into the ground. But in Iceland, CO2 is injected into basaltic rocks, and waste fluids are fed into thermal spas, where their unique minerals are said to have beneficial cosmetic and curative effects.

In 1982, the CO2 concentration at Mauna Loa Observatory reached 344 ppm. Europe’s alpine glaciers are shrinking, and the glaciers that provide the water power that makes up more than 70% of Iceland’s electricity production are threatened by climate change.

Renewable sources of electricity (wind, solar, geothermal, and small-scale hydropower) do not generate excess heat that would affect climate, but more importantly, their ongoing use does not pollute the biosphere and produce a long-lasting greenhouse blanket. While their economic advantages are strenuously debated, they are much faster to install and cheaper to operate than fossil or nuclear power plants. Their operation consumes no fuel or water. Although cheaper than any other source, solar and wind output is variable. They require land and supporting infrastructure. Their costs depend on the economic theory assumed and are higher if the baseload consumption of data centers and parasitic financial transactions is prioritized over the needs of consumers. Such data centers are already leading to drastic rate increases. A regional electric grid must adapt demand to supply with batteries, pumped-storage hydroelectricity, or other backup systems. Re-conductoring with higher strength cables and grid enhancements can improve regional transmission capacity without expanding land usage. Currently available technology is rapidly maturing to make environmentally sound power generation and delivery possible, but in the face of large amounts of profit, the political will to change the energy industry is severely lacking.

Department of Energy funding for the basic science of geothermal energy soon ran out. Through good fortune I was able to pursue a graduate degree in geophysics. The end of the Cold War saw a renewed interest in planetary science, using technology such as lasers developed to probe missiles in the blackness of space, but also capable of mapping the terrain of the Moon, asteroids, and the planet Mars. The topography of once-liquid channels on that dry, frozen planet was revealed in precise detail by the laser altimeter on board Mars Global Surveyor. That iconic MOLA dataset is now encouraging a younger generation in planetary exploration.

Astronaut Buzz Aldrin at Purdue University’s Space Day promoting Destination Mars, with children exploring a color-shaded topographic map created by the Mars Orbiter Laser Altimeter. My worst mistake as a Mars scientist was to shade the topography of the northern hemisphere where Buzz is standing in deep blue hues, reinforcing earlier beliefs in the existence of a Mars ocean. There certainly isn’t one now.

Climate Science in the Space Age

As the Viking 1 and 2 (1975) spacecraft flew to the cold and dusty surface of Mars, and the Pioneers (1978) to the hot, acid-laced clouds of Venus, Carl Sagan recognized that the surface temperatures of these planets, radically different from ours, were due mainly to their atmospheric content. Venus may once have been Earth-like in climate, but is now dry, with a thick CO2 atmosphere that maintains its hothouse status. Mars, with its very thin CO2 atmosphere, is much colder than Earth but once had widespread fluvial activity: river channels, erosion, subglacial lakes, perhaps even a transient ocean. A vexing problem in planetary science arises from the fact that the early Sun was only 75–80% as bright as today: how, then, did liquid water exist at the surface of Earth, let alone Mars? Models assuming denser, greenhouse-gas-enriched atmospheres still struggle to achieve temperatures above freezing without invoking special circumstances.
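The scale of the faint-young-Sun problem can be illustrated with a back-of-envelope radiative balance (a sketch only; the 255 K present-day effective temperature and the equilibrium assumption are standard textbook values, not figures from this memoir):

```python
# Radiative balance: absorbed solar flux = emitted thermal flux,
# so effective temperature scales as the fourth root of solar luminosity.
T_eff_today = 255.0  # Earth's present effective (no-greenhouse) temperature, K

for fraction in (0.75, 0.80):  # early-Sun brightness range quoted above
    T_early = T_eff_today * fraction ** 0.25
    print(f"{fraction:.0%} luminosity -> {T_early:.0f} K")
```

Even today’s effective temperature is below freezing; at 75% of the present luminosity it drops by another ~18 K, so a substantially stronger greenhouse, or some other mechanism, is needed to keep the early oceans liquid.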

Radiative balance is complicated. It does not help that some naive but very wealthy people deny that it matters, or babble about colonizing other planets.

There is no lack of liquid surface water currently on Earth, despite long periods in the past when the surface was largely frozen. The lesson from those other planets is that a 50% increase in Earth’s greenhouse gases over pre-industrial levels will eventually melt the remaining ice sheets, leading to meters of sea level rise. Sagan’s testimony to Congress in 1985 stressed the urgency of amity between nations in addressing the effect of fossil fuel burning on climate, that “we are all in this greenhouse together.”

Architect Bucky Fuller’s geodesic domes, and his Expo ’67 Biosphere 1 concept of a Spaceship Earth, inspired the space-mission-like Biosphere 2 project in the late 1980s. A large, sealed, glassed-in dome was built in the Arizona desert to simulate the self-contained ecosystems on which humans depend. The project was started by the beat-generation scientist/explorer/engineer/writer/activist John P. Allen and funded in part by oil billionaire Ed Bass. The experiment almost ended in disaster after two years as oxygen levels declined and the eight Biospherian volunteers struggled to breathe, until air was let in. A second mission was cut short by financial chaos and the lack of understanding of why the experiment was failing. Bankers turned the complex over to a contractor, Space Biosphere Ventures. After considerable acrimony and lawsuits that ended the receivership, a science team from Columbia University’s Lamont-Doherty Earth Observatory took over management, led by oceanographer Wally Broecker. After kicking all the humans out, the team found that the composted organic matter put into the soil in order to grow sufficient food for the Biospherians in a limited space was the source of the problem. Where did the oxygen go? Soil microbes respiring the compost consumed oxygen and released CO2, while the relatively new masonry in the structure absorbed much of that CO2, depleting the atmosphere of both the carbon needed by the growing plants and the oxygen that photosynthesis should have returned.

Wally Broecker conducted research at sea and was very familiar with the self-contained nature of ships. For example, marine vessels use waste heat from their engines while underway to distill seawater into fresh water for drinking and bathing. While on station, the main engines are off and the scarce water stored in tanks is rationed, whence the term “navy showers.” Noting in 1975 the temporary effects of aerosols that ships belch into the atmosphere, he wrote, “By analogy with similar events in the past, the natural climatic cooling which, since 1940, has more than compensated for the carbon dioxide effect, will soon bottom out. Once this happens, the exponential rise in the atmospheric carbon dioxide content will tend to become a significant factor and by early in the next century will have driven the mean planetary temperature beyond limits experienced during the last 1000 years.” As Broecker later put it, “The climate system is an angry beast and we are poking it with sticks.”

The next year saw record high temperatures. Though Broecker later disavowed the simplistic “global warming” phrase that his own paper’s title had helped popularize, he and his colleagues used the repurposed Biosphere facility for innovative climate research. For example, they raised the temperature and partial pressure of CO2 in a self-contained seacoast environment in order to mimic the tropical conditions expected in the late 21st and 22nd centuries. Coral growth suffered from acidification, a severe consequence of fossil fuel combustion. In the enriched atmosphere, according to a tour guide, morning glory vines took over everything and had to be removed.

Biosphere 2, now a museum and research facility of the University of Arizona, underscored the complexity of ocean-earth-atmosphere interactions. It led to far better understanding of this interplay than at the dawn of the Anthropocene. It is also remembered for the flashy red outfits worn by the members of a hippie theater troupe who became the Biospherians.

In 1986 the CO2 concentration at Mauna Loa Observatory was 348 ppm. Although the roughly 25% increase from the pre-industrial level of about 280 ppm may not seem large relative to other changes in the atmosphere, the curve was not leveling off.

While from 1988 to 2015 world population grew ~45% (and 34% in the US), the contribution of fossil fuels to global heating doubled: 833 Gt CO2 was emitted in just the 28 years after 1988, compared with 820 Gt CO2 in the 237 years between the birth of the Industrial Revolution and 1988. A 2012 survey of the 6263-m-high (20,548’) Mt. Chimborazo in Ecuador found that vegetation zones in the Andes were moving upslope by 3 meters per year. On this volcano, Alexander von Humboldt’s epic 1802 expedition had meticulously described the elevations of grasslands and glaciers; these have moved upward by ~600 m since the pre-industrial era. Humboldt’s eponymous glacier, the last in Venezuela, has now disappeared. The montane animals, now living 1 km below the peak, have nowhere to go but up.
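The Chimborazo figures are mutually consistent, as a quick check shows (a sketch using only the numbers quoted in this chapter):

```python
# Upslope shift of Andean vegetation zones since Humboldt's 1802 survey.
total_shift_m = 600        # meters of upslope movement since 1802
span_years = 2012 - 1802   # interval between the two surveys

rate = total_shift_m / span_years
print(round(rate, 1))  # ~2.9 m/yr, matching the ~3 m/yr found in 2012
```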

James E. Hansen, chief climate scientist for NASA and an outspoken writer and activist, testified before Congress in 1986 about the threat of global heating and sea level rise. At the time, more attention was being paid to the environmental issues of acid rain, oil spills, and the ozone hole forming over Antarctica. Hansen, who had established the dominant role of aerosols and greenhouse gases and the relatively minor importance of natural variability in climate forcing, startled the world when he testified again in the summer of 1988, when lethal heat waves, storms, droughts, and wildfires were on the front pages and rising global temperature became a household issue. He also described the loss of habitats, lamenting the decline of the once-prolific monarch butterflies in his backyard.

In 1988, the CO2 concentration at Mauna Loa Observatory reached 350 ppm. The oceans were beginning to resemble Joseph Priestley’s seltzer water.

Mean seawater pH; lower values indicate increasing acidity. The rate of decline is ~0.002 per year. (Replace with vector file seawater-ph.svg; details at https://hahana.soest.hawaii.edu/hot/trends/trends.html.)

I enrolled in a marine geophysics doctoral program, surveying the South Atlantic in 1990 on a UCSD Scripps research vessel. Profiles of ocean temperature were taken as a routine daily activity to calibrate the multi-beam sonar mapper but also to understand slow changes in ocean circulation and climate.

From working in geophysical exploration to becoming a planetary scientist took another ten years. I became involved in the Mars Exploration Program and NASA Goddard’s laser altimeter group, where we created topographic maps of Mars more precise than any available for Earth. Eventually the changing thickness of Earth’s ice and its mass were measured by satellites with comparable accuracy.

In 2012, the CO2 concentration had reached 395 ppm.

The oceans have been absorbing most of the human-induced heating. The average temperature of the upper 2 km has warmed by as much as one degree. This leads to a corresponding rise in the surface tropical cyclone potential, the driver of severe Atlantic hurricanes. As seawater warms, it expands, adding to sea level rise. The salinity contrast between the heated tropical oceans and the fresh polar melt water increases, with dynamical and biological consequences. The excess heat has nowhere to go but down.

Much of the anthropogenic carbon dioxide is absorbed by the oceans. The huge inertia of carbon sources and sinks in the deep oceans is harder to reverse as CO2 continues to rise exponentially. An immediate, dire consequence is the acidification of the oceans; a doubling of CO2 (in the absence of buffering) implies a pH drop of about 0.3, toward greater acidity. Acidification affects the ability of marine creatures to cycle nutrients, produce shells, and absorb more CO2. The coral reefs die and the fish have nowhere to go.
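The 0.3 figure follows directly from the logarithmic definition of pH. A minimal sketch (assuming, as stated, no buffering, so that hydrogen-ion activity doubles along with CO2):

```python
import math

# pH = -log10([H+]); if [H+] doubles, pH falls by log10(2).
delta_pH = math.log10(2)
print(round(delta_pH, 2))  # 0.3: the pH drop for a doubling of CO2
```

At the measured surface-ocean decline of ~0.002 pH units per year, that 0.3 drop would accumulate in about 150 years.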

Additional feedback loops are becoming evident. The release of methane frozen into permafrost and seafloor deposits (clathrates) as the air or water warms amplifies the greenhouse effect, although the time scales are debated. The melting of the Arctic is ending the summer pack ice where bears and indigenous people hunt seals. The polar bears have nowhere to go.

It was once thought possible to achieve an optimal climate, if fossil fuel burning were rapidly phased out. A considered estimate of what was required to return Earth’s climate to a steady state was to hold CO2 at 350 ppm. That was the level in 1988. Ten years later the Kyoto Protocol to limit carbon emissions was signed by numerous nations, and the level was 365 ppm. The largest emitters failed to ratify its limited goals, claiming it would expose them unfairly to competition from developing nations!

Climate resolutions give an illusory impression that progress is being made, while the hard facts say otherwise. At the time of President George W. Bush’s 2003 invasion of Iraq, CO2 was 375 ppm; by 2007, when Vice President Gore was accepting the Nobel Peace Prize with the UN’s IPCC, it was 385 ppm. The Protocol was amended in 2012, by which time CO2 had reached 395 ppm. When the US announced its withdrawal from the Paris Agreement in 2017, the concentration reached 406 ppm. At the protocol’s amended target date of 2020, it was 415 ppm, and five years later it reached 430 ppm. The rate of increase is now more than 3 ppm every year, which bakes in ever-greater amounts of ocean acidification, greenhouse heating, and sea level rise. CO2 accumulation is on track to double the notionally optimal concentration by 2100. That level will take climate back millions of years to the Miocene (as shown graphically on p. 172), when hothouse conditions prevailed from pole to pole. Humans will have no place to cool off.
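The concentrations quoted in this chapter outline the trend. A rough projection (a sketch only; the constant-rate extrapolation is illustrative arithmetic, not a climate model):

```python
# Mauna Loa CO2 concentrations (ppm) as quoted in this chapter.
record = {1988: 350, 2003: 375, 2007: 385, 2012: 395,
          2017: 406, 2020: 415, 2025: 430}

# Average rate over the most recent interval, 2020-2025:
rate = (record[2025] - record[2020]) / (2025 - 2020)
print(rate)  # 3.0 ppm/yr, consistent with "more than 3 ppm every year"

# Naive constant-rate projection to 2100:
print(record[2025] + rate * (2100 - 2025))  # 655.0 ppm
```

A constant 3 ppm/yr reaches only ~655 ppm by 2100; doubling the 350 ppm target by then assumes the acceleration evident in the record continues.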

Part VI Politics and The Great Sickness

The automobiles, airlines, and the financial oligarchies that stem from them are poisoning the air we breathe and cooking the planet. Children are frightened. Young adults, often saddled with debt and unable to afford housing, are afraid to start families and barely manage to survive financially.

As World War II and the Cold War morphed into an Orwellian perpetual war, the American and allied soldiers came home from Vietnam, the Soviet soldiers from their predatory war in Afghanistan. Tuberculosis and pneumonia were nearly vanquished; polio and measles were on the run. I was spared most of these diseases, but an even greater sickness had struck the country — the McCarthyist Red Scare, loyalty oaths, spy trials, mutually assured destruction. The Cuban missile crisis, Herman Kahn’s Thinking About the Unthinkable, and various movies normalized weapons of mass destruction. The childhood games of “Cowboys and Indians” with cap-pistols and bows-and-arrows, spurred by settler-colonial “Lone Ranger” western movies, seemed peaceful and relaxing by comparison.

The planet reeled from the expansion of the military assaults, not the least of which were the more than seven megatons of high explosives dropped on the towns and villages of Indochina, followed by defoliants with their “agent orange” dioxins. Not until the Gulf War of 1991 were so many soldiers poisoned by their own government. A third of the chiefly American, Saudi, and British soldiers deployed reported multiple long-lasting symptoms and pathologies not related to combat itself, arguably caused by neurotoxic weapons, pesticides, and widely dispersed chemical pollution. For example, 700 tons of “depleted uranium” (DU) munitions were fired in the Persian Gulf, widely dispersing breathable particulates with about the same activity in becquerels as the amount of plutonium in an atomic bomb. DU is just as toxic when dispersed, and its use continues to this day in the Middle East. Decades of studies have attempted to isolate or deflect blame for the Gulf War Syndrome, but the afflicted veterans have little doubt that their safety was disregarded, as it was and continues to be from atomic-era fallout.

The oil wells of Kuwait were set ablaze during the Gulf War. Emissions from uncontrolled wellheads contributed to the rise in greenhouse gases that are, in the short term, more potent than CO2. Methane has more than 80 times the warming power of carbon dioxide over the first 20 years after it reaches the atmosphere, and roughly 25% of today’s global warming is driven by methane. In 1980 the methane concentration at the Mauna Loa Observatory was 1600 parts per billion (ppb), more than double its pre-industrial value of 690. By the time of the Persian Gulf War (1990–91), it had reached 1750. By 2006, biogenic sources, such as microbial methane produced by the warming of wetlands, had become a new source of climate feedback. In 2023, the concentration reached 1932 ppb.
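The methane numbers can be normalized against the pre-industrial baseline (a sketch using only the concentrations quoted in this chapter):

```python
# Atmospheric methane (ppb) relative to the 690 ppb pre-industrial value.
pre_industrial = 690
readings = {1980: 1600, 1991: 1750, 2023: 1932}

for year, ppb in sorted(readings.items()):
    print(year, round(ppb / pre_industrial, 2))
# 1980: 2.32x ("more than double"); 1991: 2.54x; 2023: 2.8x
```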

The ideologues of the Cold War and the Vietnam War turned first to helping the tobacco companies sell their addictive wares, and then to helping the oil, coal, and gas industries spread doubt about the causes of the changes that were already becoming apparent to anyone who observed the trees and the seasons. At the height of the anti-war movement in the spring of 1970, an advertising copywriter hatched the mild-mannered “Earth Day” response to the growing unrest. Within a year of the campus shootings that killed six students and wounded dozens in Ohio and Mississippi, the advertising agency of General Motors and Standard Oil coined the jingle “I’d Like to Buy the World a Coke,” sung by beatific young people on a hillside. Madison Avenue “communications strategies” helped the pharmaceutical industry promote painkilling drugs as addictive as petroleum. Fossil fuel companies tasked the agencies with minimizing perceptions of their ongoing damage to the environment (“not currently a problem”), and numerous others hired them to promote pesticides, junk foods, and expensive drugs for treating the obesity that stems from unhealthy diets.

As the economic and human consequences of perpetual war became increasingly severe, those social ills and physical pain were treated as “personality disorders.” Madison Avenue gaslighted the public with what has been described as “cosmetic pharmacology,” e.g., “Listening to Prozac.” While medical treatments for organic disorders have certainly advanced, direct-to-consumer advertising of antidepressant and painkilling drugs soon became as deadly as the automobile.

Deaths from illicit and prescription opioids reached 106,000 annually in the US by 2021, a rate exceeded only by deaths from the pandemic coronavirus. Many of the same propagandists who had helped the tobacco and coal industries deflected the cause of the virus epidemic towards foreign powers, after the White House officially denied the problem and then led the world in spreading the disease. The resulting pandemic and disinformation campaign killed a million people in the US alone. A social murder of the underclass and essential workers ensued.

Mass extinctions happen one ecosystem at a time, one species at a time, one habitat at a time, one individual at a time. Desperation and nihilism arose from the moral and cultural decay attendant on the superpower status of a wealthy nation. My family has not been spared.

The sickness struck my cousin, who turned to alcohol and cough syrup before his teens to ease the pain of rejection by adoptive parents. He transitioned to heroin in a few years, and became even more addicted to the methadone treatments prescribed by the state. After my father’s funeral, we put the “Cuz” on a bus from Buffalo across the Peace Bridge to Fort Erie in Canada. Few questions were asked, New Yorkers crossed the bridge daily. From there by bus to Halifax, Nova Scotia where we weaned him painfully at a local drug crisis center. From his 120 milligrams “maintenance” to the final daily dose of 5, the detoxification went relatively smoothly over several weeks. But when that last tiny amount was withdrawn, the sickness returned in full force. He became belligerent and relapsed several times into heroin, not hard to find in a seaport town. He visited my mother on occasion, stole the family car, was arrested, and spent many years in institutions, after abandoning a mate and a child. In the end his liver succumbed to the years of alcohol dependence.

It touched my nephew, who was able to avoid the worst consequences of his youthful irresponsibility by joining the military, from which he withdrew at the height of the interventions in the Middle East. Ending a troubled relationship with an adopted family, he re-enlisted, but could not stomach G. W. Bush’s 2003 Iraq invasion, and re-entered civilian life.

My brother-in-law was the firstborn child, grandson of an illiterate immigrant tailor. His father had navigated a strongly Catholic educational system to become a wealthy law firm partner and marry a lovely down-east Yankee lady. But neither parent could countenance the boy’s learning differences in a highly conformist world, which quickly drove him to escape into drugs and alcohol in suburban New Jersey. Many years of alcoholism left him with a failing liver, and although a transplant operation was considered, he could not meet the time limits on sobriety. As infectious hepatitis and jaundice closed in, he dosed himself with over-the-counter medicine that aggravated his ulcerated body. He left no children and ended the male line of a made-up immigrant family name.

Schoolmates succumbed to the ravages of AIDS, others to leukemia or mental illnesses. My classmate’s daughter ended her life at the age of 17, as did another classmate’s stepdaughter.

My niece succumbed to a fatal psychosis resulting from the profitably-prescribed amphetamine treatments for newly-framed “attention disorders.”

It struck one of my children, who acquired an addiction to drugs, tobacco, and alcohol.

There was the mysterious, tragic death of my younger son.

The Cultural War and Individualism

The National Association of Manufacturers (NAM) developed a strategy to support its anti-union, anti-labor, and anti-taxation agenda. Calling it “integration propaganda” — public service announcements, news feeds, talk shows and films — its National Industrial Information Council flooded the newspapers and airwaves with pro-business ideology. The NAM’s Industry on Parade, televised from 1950 to 1960, celebrated the manufacture of plastics and denounced worldwide communism. The campaign was not above using cartoon characters, teenage heroes, and movie shorts to overcome the negative public opinion towards wage cutting and factory exploitation engendered by the Great Depression.

The philosophies promoted by the pseudo-economists known as the “Chicago School,” by politicians (Libertarians, Cold War Reaganites), and by the advertising industry were part and parcel of a new-style cultural war. In the beginning it was simplistic propaganda extolling man’s unbridled assault on nature. DuPont’s “Better living through chemistry” campaign of 1935 aimed to show that virtually any human problem could be solved with science. Mass consumption and planned obsolescence went hand in hand with General Electric’s “Progress is our most important product” marketing campaigns, which included purchasing RCA and its television networks. While decrying government antitrust regulation, these industries blamed the destruction of the environment that accompanied the wanton waste of resources on individuals, to be mitigated by well-meaning but ineffectual recycling campaigns and the adoption of “green” technology.


From Hippiedom to Socialism

I grew up with relative privilege and was fairly naive with respect to the working-class movement. Class consciousness arises from material conditions, but evolves in complex ways as classes struggle. The common causes that affect the broad masses of people, apart from the super-rich, are the economic struggle, the struggle for peace and equity in all walks of life, and increasingly, for environmental security. Though I seldom experienced hardship and wasn’t a “red diaper baby,” I derived a social consciousness from my parents, who worked their way through college and up the academic ladder on a shoestring. My mother said that, as an unborn child, we picketed the National Theater in January 1947, at a time when such venues in Washington and Baltimore were closed to blacks. (I can’t claim to have been aware of that.) The music of the era in the vinyl record collection stemmed from industrial union organizing and the civil rights movement. As I grew up, I read books about the holocaust, works by anarchists, science fiction, and I admired Dadaist art. I let my hair grow long.

Several college summers were spent employed in the Oregon logging industry. It is one of the most hazardous occupations in the country, ranking with coal mining, steel, construction and air transport. I showed up to the hiring office with a hickory-stripe shirt, suspenders, and a new pair of the spiked “cork” boots of the woodsman; they X-rayed my bones, and finding nothing broken, sent me out on the choker lines.

There were close calls. A haulback line struck my elbow as it dropped from the sky. The hook tender, a working foreman standing three feet away, said it had just brushed the flies off me. That old Okie told me to stand a little closer to him. The one-inch steel rope could have just as easily sliced me in half. Accidents could and did happen anytime, as the powerful yarders ripped logs down the slope, occasionally popping stumps and saplings into the air. The Weyerhaeuser management’s main preoccupation was to avoid “lost time” accidents, so when my buddy’s hard hat got dented by a flying tailblock stump, they put him in the base camp for a couple days, hosing down trucks while his concussion subsided.


In Oregon, the Department of Agriculture’s Forest Service had leased huge tracts of public land to commercial logging with virtually no opposition. As I jumped from log to log in the bottom lands, where old steam-powered donkeys rusted amongst the 12-foot-diameter Douglas fir stumps, it was evident how much of the old-growth forests had been destroyed in the last century. With the Bureau of Land Management, I worked on a crew to contain the 1966 Oxbow Ridge fire, one of the wildfires that burn hundreds of thousands of acres in the Oregon coastal ranges. This fire was sparked by human activity, namely log road construction. The work was hard but the chuckwagon was plentiful. We watched in awe and ran to the trucks as the winds picked up embers from the smoldering fire, igniting the treetops. When the fire crowned, it spread at the speed of the wind until it reached previously logged areas. The nascent American environmental movement, derogatorily known as “tree-huggers” after the 363 Bishnoi martyrs who sacrificed their lives to save their trees, was having its effects: the Forest Service, managing the public lands in the Cascades, also employed us as tree planters, at minimum wage. Our firefighting hoedad tools were put to use recovering lands that had been burned or clear-cut previously. The steep volcanic hills will take another century to become mature green forests.

A High Cascades forest replanted 40 years ago, after the Mt. St. Helens eruption. Toutle, Washington, 2023.


Active Resistance to Racism, War, and Environmental Destruction

“Let it be,” “Turn on, tune in, drop out,” were foisted on the baby boom generation as an alternative to active rebellion against a highly militarized society. After all, who doesn’t like rock and roll music? Soporific and hallucinogenic drugs promised peace and instant enlightenment, to a generation that was spoon-fed Cold War pablum in sterile communities. Easy listening like “Hey Mr. Tambourine Man” seduced the generation to “… forget about today until tomorrow.” Indeed, some students dropped out of college, abandoning the knowledge accumulated over millennia of recorded history as “irrelevant” and pretended to be liberated. Others realized that the history they were being taught was utterly biased, a history written by, for, and about victorious white men, and they plunged into the resistance struggle.

My senior year in high school was interrupted by the assassination of President Kennedy. As a senior in college, celebrating my first vote as an adult and my first legal drink, I watched the news of Martin Luther King Jr.’s assassination playing on the television screen over the bar at Lutz’s Tavern in Portland. Two months later, Robert Kennedy, who had opposed US involvement in Vietnam, was killed. At the height of the Vietnam war of liberation, in which over half a million US troops were committed and millions of soldiers and civilians died, the 1968 Democratic Party Convention suppressed any peace plank in its platform. The convention disregarded the plurality of supporters of the mildly anti-war candidate Eugene McCarthy, and allowed the Chicago police to attack thousands of demonstrators with clubs and tear gas all over the city. The pro-war candidate, Vice President Humphrey, lost decisively to Richard Nixon. Small wonder that many youth abandoned the mainstream political process.

The materialistic, media-promoted culture of the individual was aimed at erasing any vestiges of class consciousness amongst workers, students, and disadvantaged youth. It took a heavy toll on social activism, not to mention the largely working-class and minority soldiers who were sacrificed in foreign wars. It tried to co-opt the movement for women’s liberation and equality. It tried to create a generation that was easily swayed by corporate media, deprived of history so that life was ultimately meaningless. The youth were taught that the world was doomed, that we had to place our hopes in colonizing Mars. But the cultural onslaught of the 70s and 80s “me” generations was not triumphant. Our children (and a few grey-hairs) continue to protest. As the anti-war, anti-racist, anti-discrimination popular sentiment grew, socially minded people sought alternatives to business-as-usual liberal politics.

The unpopular war in Vietnam, involving half a million US soldiers conscripted in the mid-1960s by an infamous lottery system, sparked massive protests nationwide. The spontaneous resistance against the draft in the US gradually rejected the pacification of youth with drugs and ‘alternative’ culture. The FTA underground organization in the Army sang “Hey, hey, LBJ, how many kids did you kill today?” in their platoon drills. The more radical elements of Students for a Democratic Society (SDS) struggled to find solid ground, following the mass demonstrations and urban uprisings in the US of 1963–1968, the campus shootings of 1970, and worldwide events such as the Paris Uprising, the Cultural Revolution in China, the Prague Spring in Eastern Europe, and the 1965–1971 assassinations of leaders of the Black liberation struggle in the US. Protests against the suppression of black people and the anti-war movement were met with police attacks and prosecutions that later revealed the widespread use of undercover provocateurs by the FBI. In the course of the COINTELPRO campaign, some who were nowhere near the alleged events were famously charged with conspiracy to destroy government buildings, when the instigators were government agents. In the face of this repression, some “New Left” leaders became anarchists, bombers, cult followers, or armchair revolutionaries. Many radicals simply merged into the left-liberal mainstream politics that was the window dressing for the ongoing imperialist wars.

The struggles in the US against racial injustice and poverty were inspired by the courageous, nonviolent civil rights movement, but, faced with severe state repression, took up more militant methods of protest. Activists who avoided the questionable politics of the SDS and its diversions were aware that their strength lay not solely in the youth and student movement but necessarily in the historical working-class struggle against monopoly capital. They sought answers from the century of socialist theory and practice founded by Marx and Engels, and from the experience of the worldwide revolutions that followed. They rejected the communist holdovers from the 1930s, who were thoroughly compromised by the abandonment of revolutionary work stemming from the influence of Moscow. Partly as a reaction to Soviet influence and the attacks of Stalin’s successor Nikita Khrushchev, Mao Tsetung launched a “Great Proletarian Cultural Revolution” in China to weed out “capitalist roaders” from the ruling party. This development invigorated the movements in the US with its militant red banners denouncing imperialism and supporting the Afro-American people’s struggle. The popularization of “Quotations of Chairman Mao,” however, failed to overcome the growth of capitalism in its various socialist guises. China soon rehabilitated its capitalist roaders under the banner of “modernization,” and now pursues a hegemonic path similar to that of the other superpowers.

The rise of socialist and communist ideology in the US had taken many forms after the Haymarket massacre in 1886, which followed the May 1 strike for an 8-hour working day. Anarchism and syndicalism, typified by the Wobblies (nickname for the IWW, the Industrial Workers of the World), held sway, but the international Marxist tradition surged with the Russian Revolution of 1917. The communists upheld the leading role of the working class in the fight for socialism. They rejected both anarchism and beliefs in a benevolent, enlightened capitalism. They built strength in the factories and gained numbers as the Great Depression forced millions of unemployed onto relief. The Communist International (Comintern), formed in 1919 with representatives of 34 parties, grew to 65 members in the period between the two world wars. The Communist Party of the Soviet Union (Bolshevik), or CPSU, had enormous influence in the US, especially during the Great Depression. A progressive political agenda towards socialism so engaged the country that President Roosevelt claimed his New Deal concessions were the only way to “rescue capitalism from the grave.” Ultimately, it took a global war to save the ailing capitalist economy, and it left Fort Knox with enough gold to dominate the post-war years.

This memoir cannot begin to dissect the mistakes and ultimate failure of the Soviet state and the international communist movement that it led. As the first socialist party to seize state power, the Bolsheviks were faced with an economy in ruins after the World War and a civil war. They were opposed by external foes and beset with internal class strife, the legacy of a century of agrarian struggle under tsarism. Briefly, the Soviet state failed to cement the alliance of the workers and farmers embodied in the hammer and sickle. It succumbed to bourgeois methods of economic planning by bureaucratic managers. After the untimely death of V. I. Lenin, these apparatchiks created a new state capitalist oligarchy in the name of socialism.

The CPSU had played a stultifying role in the international communist movement for some time. The Comintern’s disastrous 7th World Congress in 1935, after Hitler had taken power in Germany, led to the capitulation of the workers’ movement in the name of a united front against fascism. The brutal purges the CPSU carried out under Stalin’s leadership were followed by the complete dissolution of the Comintern in 1943. In the years that followed, Eastern Europe was consolidated into what was effectively Russia’s empire. After Khrushchev’s rise to power and the suppression of the Hungarian uprising in 1956, and later of the Prague Spring in Czechoslovakia in 1968, the worldwide communist movement fell into disarray. Khrushchev’s state-run capitalism was passed off as Marxist-Leninist, but in reality the former Comintern leadership had adopted a new brand of imperialism, backed by the ideological trend that is henceforth described as revisionism.

The failure of attempts to transform society, as with the Paris Commune of 1871, does not, however, invalidate the concept, nor does it remove the necessity.

Soon after college I crossed paths with a group of students and faculty who had begun to organize in the universities but had roots in the worldwide anti-revisionist communist movement. Spurred by a growing denunciation of Soviet-style revisionism, a young group of workers, students and veterans, the Cleveland Draft Resistance Union, began to study Marx and Lenin in the late 1960s. With labor organizers, civil rights activists, and the help of veterans returning from the Vietnam War, they picketed Army draft centers in their communities.

The bourgeoisie tried to turn a section of the US workers into a fighting force for fascism. In New York, not long after a large anti-war mobilization, they launched a provocation against the peace movement. Calling a pro-war rally, the corrupt building trades union hacks forced workers to attend by threatening to dock their day’s pay. Out of the thousands present, they mobilized a few hundred construction company foremen and union members to attack some long-haired anti-war protesters. Soon after this event a similar rally was organized in Ohio. This was the launching of the so-called “hard-hats,” a movement aimed at showing that workers supported reactionary war and fascism.

Waving red banners, the communists waded into the pro-war demonstrations in New York City and in Ohio in 1970, putting an end to the supposedly worker-supported “hard-hat movement,” a front organized by reactionary businessmen and corrupt building trades union hacks dressed up as ordinary workers.

Seeking a new understanding of the path toward socialism in the industrialized world, these militants examined the history of the Communist Party in the USA and its liquidation; they formed alliances with youth and student groups in other countries. They attempted to unite the anti-imperialist forces in the US and were rebuffed several times, chiefly by new-leftists who deemed the working classes incapable of revolutionary ideology, substituting Castro, Guevara, Mao and various cultural nationalists as the true leaders. They committed themselves to the creation of a new and revolutionary Marxist-Leninist communist party, a proletarian revolution, and the establishment of the dictatorship of the proletariat over the bourgeois classes so as to build socialism and ultimately eliminate class society. But they recognized that no new social system can be pictured in detail prior to its existence, and no large human endeavor is ever carried out exactly according to its original blueprint. So they engaged in the deepest possible study of the history of the international workers’ movement and the experience of the socialist revolutions and liberation struggles worldwide.

A new-left historian pontificates about attempts to create such a party: “Second was the Central Organization of US Marxist-Leninists (COUSML), which had been formed in 1973 mainly by the [1969] Cleveland-based Ameri­can Communist Workers Movement. In January 1980 this group, too, held a found­ing congress and declared itself to be the Marxist-Leninist Party. The MLP thus became the sixth anti-revisionist group to declare that it had founded the vanguard of the US working class—but with just 100 members it was the smallest vanguard yet.”

As with slivers and swimsuits, whether something is notable or not is unrelated to its size. The MLP was indeed a small group of self-professed nobodies. The clandestine nature of a revolutionary party precludes enumeration of its membership, but it was active in a dozen East Coast, West Coast, and Midwest cities. What this historian ignores is that the MLP came out of the heat of struggle, not out of the left wing of well-endowed, establishment parties. Nor was it one of the groups spawned by the Soviet-era revisionists that, under the guise of opposing US imperialism, support Russian involvement in the Middle East and its annexations of territory in the former Soviet republics.

The MLP defended the rights of immigrants against deportation for the ‘crime’ of seeking work. It fought the efforts of the Carter and Reagan administrations to divide workers by place of birth and scapegoat the poorest of them. As the rights of women to safe, legal contraception and abortion came under increasing attack, the MLP embraced militant confrontation, not legalism, and battled the anti-abortion Operation Rescue thugs at clinics. While OR claimed to be non-violent, their actions coincided with over 100 clinic bombings and the murders of four doctors. Internationally, the MLP supported revolutionary fraternal organizations. While embracing the socialist triumphs of tiny Albania, it did not hesitate to criticize that country’s mistaken positions. It opened public polemics against revisionist trends in the world communist movement, a struggle that continues to this day.

The MLP did not seek to gain influence through the ballot, nor did it recruit voters for the Democratic Party. It campaigned against the capitalist two-party, no-choice system and the suppression of democracy by those parties. It campaigned against the racist movement whipped up against the court-ordered desegregation of schools in cities like Louisville, KY and Boston, MA, where school improvements and open enrollment had long been denied. As a concession to the growing Civil Rights movement, school integration was clumsily implemented by busing children long distances into other school districts. Meanwhile, Richard Nixon and the segregationist governor George Wallace campaigned nationally against busing.

In Boston, buses carrying children to school were repeatedly attacked with stones by mobs, spurred on by racist politicians. After leafletting to oppose the segregationist school committee and the anti-busing thugs, the MLP routed a gang of racists and off-duty policemen styled as the “South Boston Marshals” when they attempted to break up a public meeting. Workers, led by militant, bespectacled women and a middle-aged professor, busted chairs over their heads and ripped off their jackets. They never showed their faces again.

Anti-busing gang’s jackets.