Reckoning with ‘the battering ram of the Anthropocene’

Also posted on Resilience

Is the word right on the tip of your tongue? You know, the word that sums up the ecological effects of more, faster and bigger vehicles, driving along more and wider lanes of roadway, throughout your region and all over the world?

If the word “traffication” comes readily to mind, then you are likely familiar with the work of British scientist Paul Donald. After decades spent studying the decline of many animal species, he realized he – and we – need a simple term summarizing the manifold ways that road traffic impacts natural systems. So he invented the word which serves as the title of his important new book Traffication: How Cars Destroy Nature and What We Can Do About It.

The field of study now known as road ecology got its start in 1925, when Lillian and Dayton Stoner decided to count and categorize the road kill they observed on an auto trip in the US Midwest. The science of road ecology has grown dramatically, especially in the last 30 years. Many road ecologists today recognize that road kill is not the only, and likely not even the most damaging, effect of the steady increase in traffication.

Noise pollution, air and water pollution, and light pollution from cars have now been documented to cause widespread health problems for amphibians, fish, mammals and birds. These effects of traffication spread far beyond the roadways themselves, though the size of “road effect zones” varies widely depending on the species being studied.

Donald is based in the United Kingdom, but he notes there are relatively few studies in road ecology in the UK; far more studies have been done in the US, Canada, and Western Europe. In summarizing this research Donald makes it clear that insights gained from road ecology should get much more attention from conservation biologists, transport planners, and those writing and responding to environmental impact assessments.

While in no way minimizing the impacts of other threats to biodiversity – agricultural intensification and climate change, to name two – Donald writes that the evidence for traffication as a major threat is just as extensive. He cites an apt metaphor coined by author Bryan Appleyard: the car is “the Anthropocene’s battering ram”.

Traffication has important implications for every country under the spell of the automobile – and particular relevance to a controversy in my own region of Ontario, Canada.

A slow but relentless increase

One reason traffication has been understudied, Donald speculates, is that it has crept up on us.

“These increases have been so gradual, a rise in traffic volume of 1 or 2 per cent each year, that most of us have barely noticed them, but the cumulative effect across a human lifetime has been profound.” (All quotes in this article are from the digital version of Traffication.)

“Since the launch of the first Space Shuttle and the introduction of the mobile phone in the early 1980s,” Donald adds, “the volume of traffic on our roads has more than doubled.”

Though on a national or global scale the increase in traffic has been gradual, in some localities traffication, with all its ill effects, can suddenly accelerate.

That will be the case if the government of Ontario follows through with its plan to rapidly urbanize a rural area on the eastern flank of the new Rouge National Urban Park (RNUP), which in turn is on the eastern flank of Toronto.

The area now slated for housing tracts was, until last November, protected by Greenbelt legislation as farmland, wetland and woodland. That suddenly changed when Premier Doug Ford announced the land is to be the site of 30,000 new houses.1 Barring a miracle, those new housing tracts will be car-dependent: the land is distant from employment areas and services, far from major public transit, and the Provincial government places far higher priority on building new highways than on building new transit.

Though the government has made vague promises to protect woodlands and wetlands dotted between the housing tracts, these tiny “nature preserves” would be hemmed in on all sides by new, or newly busy, roads.

As I read through Donald’s catalog of the harms caused by traffication, I thought of the ecological damage that will be caused if traffic suddenly increases exponentially in this area that is home to dozens of threatened species. The same effects are already happening in countless heavily trafficated locales around the world.

“A shattered soundscape”

Donald summarizes the wide array of health problems documented in people who live with constant traffic noise. The effects on animals are no less wide-ranging:

“A huge amount of research, from both the field and the laboratory, has shown that animals exposed to vehicle noise suffer higher stress levels and weakened immune systems, leading to disrupted sleep patterns and a drop in cognitive performance.”

Among birds, he writes, “even low levels of traffic noise results in a drop in the number of eggs laid and the health of the chicks that hatch.” As a result, “Birds raised in the presence of traffic noise are prematurely aged, and their future lifespans already curtailed, before they have even left the nest.”

Disruptions in the natural soundscape are particularly stress-inducing to prey species (and most species, even predators, are at risk of being someone else’s prey), since they have difficulty hearing the alarm signals sent out by members of their own and other species. To compensate, Donald writes, “animals living near roads become more vigilant, spending more of their time looking around for danger and consequently having less time to feed.”

A few species are tolerant of high noise levels, and seldom become road kill; their numbers tend to go up as a result of traffication. Many more species are bothered by the noise, even at a distance of several hundred meters from a busy road. That means their good habitat continues to shrink and their numbers continue to drop. Donald writes that half of the area of the United Kingdom, and three-quarters of the area of England, is within 500 meters of a road, and therefore within the zone where noise pollution drives away or sickens many species.

Six hundred thousand islands

When coming up to a roadway, Donald explains, some animals pay no attention at all, others pause and then dash across, while others seldom or never cross the road. As the road gets wider, or as the traffic gets faster and louder, more and more species become road avoiders.

While the road avoiders do not end up as roadkill, the road’s effect on the long-term prospects of their species is still negative.

When animals – be they insects, amphibians, mammals or birds – refuse to cross the roads that surround their territories, they are effectively marooned on islands. Taking account of major roads only, the land area of the globe is now divided into 600,000 such islands, Donald writes.

Populations confined to small islands gradually become less genetically diverse, which makes them less resilient to diseases, stresses and catastrophes. Local floods, fires, droughts, or heat waves might wipe out a species within such an island – and the population is not likely to be replenished from another island if the barriers (roadways) are too wide or too busy.

The onset of climate change adds another dimension to the harm:

“For a species to keep up as its climate bubble moves across the landscape, it needs to be able to spread into new areas as they become favourable. … In an era of rapid climate change, wildlife needs landscapes to be permeable, allowing each species to adapt to changing conditions in the optimal way. For many species, and particularly for road-avoiders, our dense network of tarmac [paved road] blockades will prove to be a significant problem.”

Escaping traffication

Is traffication a one-way road, destined to get steadily worse each year?

There are solutions, Donald writes, though they require significant changes from society. He makes clear that electrification of the auto fleet is not one of those solutions. It’s obvious that electric cars will not reduce the numbers of animals sacrificed as road kill. Less obvious, perhaps, is that electric cars will make little difference to the noise pollution, light pollution, and local air pollution resulting from traffication.

At speeds over about 20 mph (32 km/hr) most car noise comes from the sound of tires on pavement, so electric cars remain noisy at speed.

And due to concerted efforts to reduce the tailpipe emissions from gas-powered cars, most particulate emissions from cars are now due to tire wear and brake pad wear. Since electric cars are generally heavier, their non-tailpipe emissions tend to be worse than those from gas-powered cars.

One remedy that has been implemented with great success is the provision of wildlife bridges or tunnels across major roadways. In combination with fencing, such crossings have been found to reduce road kill by more than 80 per cent. The crossings are expensive, however, and do nothing to remedy the effects of noise, particulate pollution, and light pollution.

A partial but significant remedy can be achieved wherever there is a concerted program of auto speed reductions:

“Pretty much all the damage caused by road traffic – to the environment, to wildlife and to our health – increases exponentially with vehicle speed. The key word here is exponentially – a drop in speed of a mere 10 mph might halve some of the problems of traffication, such as road noise and particulate pollution.”
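Taken literally, that quote implies an exponential curve in which harm doubles roughly every 10 mph. A tiny illustrative sketch (the doubling-interval model and the 30 mph baseline are my own assumptions for illustration, not figures from the book):

```python
def relative_harm(speed_mph, doubling_interval_mph=10):
    """Illustrative exponential harm curve: harm doubles for every
    `doubling_interval_mph` increase in speed, normalized to 1.0 at 30 mph."""
    return 2 ** ((speed_mph - 30) / doubling_interval_mph)

# Dropping from 40 mph to 30 mph halves the modeled harm.
print(relative_harm(40))  # 2.0
print(relative_harm(30))  # 1.0
print(relative_harm(20))  # 0.5
```

On this curve, every 10 mph of speed reduction halves the modeled harm – which is the sense in which “a drop in speed of a mere 10 mph might halve some of the problems of traffication.”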

Beyond those remedies, though, the key is social reorganization that results in fewer people routinely driving cars – and driving shorter distances when they do. Such changes will take time, but at least in some areas of global society they are beginning.

Donald finds cause for cautious optimism, he says, in that “society is already drifting slowly towards de-traffication, blown by strengthening winds of concern over human health and climate change.”

There’s scant evidence of this trend in my part of Ontario right now,2 but Donald believes “We might at least be approaching the high water mark of motoring, what some writers refer to as ‘peak car’”. Let’s hope he’s right.


1 A scathing report by the Province’s Auditor General found that the zoning change will result in a multi-billion dollar boost to the balance sheets of large land speculators, who also happen to be friends of and donors to the Premier.

2 However, there has been a huge groundswell of protest against Premier Doug Ford’s plan to open up Greenbelt lands for car-dependent suburban sprawl, and it remains unclear if the plan will actually become reality. See Stop Sprawl Durham for more information.


Note to subscribers: the long gap between posts this summer has been due to retina surgery and ensuing complications. It’s too early to tell if I’ll be able to resume and maintain a regular posting schedule, but I do hope to complete a post on transforming car-dependent neighbourhoods as promised in May.

How parking ate North American cities

Also published on Resilience

Forty-odd years ago when I moved from a small village to a big city, I got a lesson in urbanism from a cat who loved to roam. Navigating the streets late at night, he moved mostly under parked cars or in their shadows, intently watching and listening before quickly crossing an open lane of pavement. Parked cars helped him avoid many frightening hazards, including the horrible danger of cars that weren’t parked.

The lesson I learned was simple but naïve: the only good car is a parked car.

Yet as Henry Grabar’s new book makes abundantly clear, parking is far from a benign side-effect of car culture.

The consequences of car parking include the atrophy of many inner-city communities; a crisis of affordable housing; environmental damages including but not limited to greenhouse gas emissions; and the continued incentivization of suburban sprawl.

Paved Paradise is published by Penguin Random House, May 9, 2023

Grabar’s book is titled Paved Paradise: How Parking Explains the World. The subtitle is slightly hyperbolic, but Grabar writes that “I have been reporting on cities for more than a decade, and I have never seen another subject that is simultaneously so integral to the way things work and so overlooked.”

He illustrates his theme with stories from across the US, from New York to Los Angeles, from Chicago to Charlotte to Corvallis.

Paved Paradise is as entertaining as it is enlightening, and it should help ensure that parking starts to get the attention it deserves.

Consider these data points:

  • “By square footage, there is more housing for each car in the United States than there is housing for each person.” (page 71; all quotes in this article are from Paved Paradise)
  • “The parking scholar Todd Litman estimates it costs $4,400 to supply parking for each vehicle for a year, with drivers directly contributing just 20 percent of that – mostly in the form of mortgage payments on a home garage.” (p 81)
  • “Many American downtowns, such as Little Rock, Newport News, Buffalo, and Topeka, have more land devoted to parking than to buildings.” (p 75)
  • Parking scholar Donald Shoup estimated that in 1998, “there existed $12,000 in parking for every one of the country’s 208 million cars. Because of depreciation, the average value of each of those vehicles was just $5,500 …. Therefore, Shoup concluded, the parking stock cost twice as much as the actual vehicles themselves.” (p 150)
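
Shoup’s conclusion in that last point follows from simple arithmetic, which can be checked directly (the figures are the ones quoted above; the script itself is just an illustrative sketch):

```python
# Shoup's 1998 figures, as quoted above
cars = 208_000_000                 # cars in the US
parking_value_per_car = 12_000     # dollars of parking stock per car
avg_vehicle_value = 5_500          # average (depreciated) value of a car

parking_stock = cars * parking_value_per_car
fleet_value = cars * avg_vehicle_value

ratio = parking_stock / fleet_value
print(f"Parking stock is worth about {ratio:.1f}x the vehicles themselves")
# → about 2.2x, i.e. roughly "twice as much"
```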

How did American cities come to devote vast amounts of valuable real estate to car storage? Grabar goes back to basics: “Every trip must begin and end with a parking space ….” A driver needs a parking space at home, and another one at work, another one at the grocery store, and another one at the movie theatre. There are six times as many parking spaces in the US as there are cars, and the multiple is much higher in some cities.

This isn’t a crippling problem in sparsely populated areas – but most Americans live or work or shop in relatively crowded areas. As cars became the dominant mode of transportation the “parking problem” became an obsession. It took another 60 or 70 years for many urban planners to reluctantly conclude that the parking problem cannot be solved by building more parking spaces.

By the dawn of the twenty-first century parking had eaten American cities. (And though Grabar limits his story to the US, parking has eaten Canadian cities too.)

Grabar found that “Just one in five cities zoned for parking in 1950. By 1970, 95 percent of U.S. cities with over twenty-five thousand people had made the parking spot as legally indispensable as the front door.” (p 69)

The Institute of Transportation Engineers theorized that every building “generated traffic”, and therefore every type of building should be required to provide at least a specified number of parking spaces. So-called “parking minimums” became a standard feature of the urban planning rulebook, with wide-ranging and long-lasting consequences.

Previously common building types could no longer be built in most areas of most American cities:

“Parking requirements helped trigger an extinction-level event for bite-size, infill apartment buildings …; the production of buildings with two to four units fell more than 90 percent between 1971 and 2021.” (p 180)

On a small lot, even if a duplex or quadplex was theoretically permitted, the required parking would eat up too much space or require the construction of unaffordable underground parking.

Commercial construction, too, was inexorably bent to the will of the parking god:

“Fast-food architecture – low-slung, compact structures on huge lots – is really the architecture of parking requirements. Buildings that repel each other like magnets of the same pole.” (p 181)

While suburban development was subsidized through vast expenditures on highways and multi-lane arterial roads, parking minimums were hollowing out urban cores. New retail developments and office complexes moved to urban edges where big tracts of land could be affordably devoted to “free” parking.

Coupled with separated land use rules – keeping workplaces away from residential or retail areas – parking minimums resulted in sprawling development. Fewer Americans lived within safe walking or cycling distance from work, school or stores. Since few people had a good alternative to driving, there needed to be lots of parking. Since new developments needed lots of extra land for that parking, they had to be built further apart – making people even more car-dependent.

As Grabar explains, the almost universal application of parking minimums does not indicate that there is no market for real estate with little or no parking. To the contrary, the combination of high demand and minimal supply means that neighbourhoods offering escape from car-dependency are priced out of reach of most Americans:

“The most expensive places to live in the country were, by and large, densely populated and walkable neighborhoods. If the market was sending a signal for more of anything, it was that.” (p 281)

Is the solution the elimination of minimum parking requirements? In some cases that has succeeded – but reversing a 70- or 80-year-old development pattern has proven more difficult in other areas. 

Resident parking on Wellington Street, South End, Boston, Massachusetts. Photo by Billy Wilson, September 2022, licensed through Creative Commons BY-NC 2.0, accessed at Flickr.

The high cost of free parking

Paved Paradise acknowledges an enormous debt to the work of UCLA professor Donald Shoup. Published in 2005, Shoup’s 773-page book The High Cost of Free Parking continues to make waves.

As Grabar explains, Shoup “rode his bicycle to work each day through the streets of Los Angeles,” and he “had the cutting perspective of an anthropologist in a foreign land.” (p 149)

While Americans get exercised about the high price they occasionally pay for parking, in fact most people park most of the time for “free.” Their parking space is paid for by tax dollars, or by store owners, or by landlords. Most of the cost of parking is shared between those who drive all the time and those who seldom or never use a car.

By Shoup’s calculations, “the annual American subsidy to parking was in the hundreds of billions of dollars.” Whether or not you had a car,

“You paid [for the parking subsidy] in the rent, in the check at the restaurant, in the collection box at church. It was hidden on your receipt from Foot Locker and buried in your local tax bill. You paid for parking with every breath of dirty air, in the flood damage from the rain that ran off the fields of asphalt, in the higher electricity bills from running an air conditioner through the urban heat-island effect, in the vanishing natural land on the outskirts of the city. But you almost never paid for it when you parked your car ….” (p 150)

Shoup’s book hit a nerve. Soon passionate “Shoupistas” were addressing city councils across the country. Some cities moved toward charging market prices for the valuable public real estate devoted to private car storage. Many cities also started to remove parking minimums from zoning codes, and some cities established parking maximums – upper limits on the number of parking spaces a developer was allowed to build.

In some cases the removal of parking minimums has had immediate positive effects. Los Angeles became a pioneer in doing away with parking minimums. A 2010 survey looked at downtown LA projects constructed following the removal of parking requirements. Without exception, Grabar writes, these projects “had constructed fewer parking spaces than would have been required by [the old] law. Developers built what buyers and renters wanted ….” (p 193) Projects which simply wouldn’t have been built under old parking rules came to market, offering buyers and tenants a range of more affordable options.

In other cities, though, the long habit of car-dependency was more tenacious. Grabar writes:

“Starting around 2015, parking minimums began to fall in city after city. But for every downtown LA, where parking-free architecture burst forth, there was another place where changing the law hadn’t changed much at all.” (p 213)

In neighbourhoods with few stores or employment prospects within a walking or cycling radius, and in cities with poor public transit, there remains a weak market for buildings with little or no parking. After generations of heavily subsidized, zoning-incentivized car-dependency,

“There were only so many American neighborhoods that even had the bones to support a car-free life …. Parking minimums were not the only thing standing between the status quo and the revival of vibrant, walkable cities.” (p 214)

There are many strands to car culture: streets that are unsafe for people outside a heavy armoured box; an acute shortage of affordable housing except at the far edges of cities; public transit that is non-existent or so infrequent that it can’t compete with driving; residential neighbourhoods that fail to provide work, shopping, or education opportunities close by. All of these factors, along with the historical provision of heavily subsidized parking, must be changed in tandem if we want safe, affordable, environmentally sustainable cities.

Though it is an exaggeration to say “parking explains the world”, Grabar makes it clear that you can’t explain the world of American cities without looking at parking.

In the meantime, sometimes it works to use parked cars to promote car-free ways of getting around. Grabar writes,

“One of [Janette] Sadik-Khan’s first steps as transportation commissioner was taking a trip to Copenhagen, where she borrowed an idea for New York: use the parked cars to protect the bike riders. By putting the bike lanes between the sidewalk and the parking lane, you had an instant wall between cyclists and speeding traffic. Cycling boomed; injuries fell ….” (p 256)

A street-wise cat I knew forty years ago would have understood.


Photo at top of page: Surface parking lot adjacent to Minneapolis Armory, adapted from photo by Zach Korb, August 2006. Licensed via Creative Commons BY-NC-2.0, accessed via Flickr. Part of his 116-photo series “Downtown Minneapolis Parking.”

What we know, and don’t know, about bees

Also published on Resilience

It will be several more weeks before bees start visiting flowers in my part of the world. But while I wait for gardens and meadows to come alive again, it’s been a joy to read Stephen Buchmann’s new book What a Bee Knows. (Island Press, March 2023)

Buchmann sets the scene in his opening chapter, describing how a ground-nesting bee cautiously emerges from her nest after looking and listening for possible predators:

“The female bee briefly shivers the powerful flight muscles within her thorax to warm up. Ready, she launches herself skyward and hovers in midair. Performing an aerial pirouette, she flies left, then back to the center, and then to the right of her nest. She repeats these back-and-forth, ever-wider zigzags, all while facing her nest and flying higher with each pass. In fact, she is memorizing the locations of the physical landmarks around her nest. These could be small stones, live or dead plants, bits of wood, or similar debris. She quickly creates a mental map of her home terrain. In less than a minute, she has memorized all the visual imagery, the spatial geometry, and the smells of her immediate surroundings.” (What a Bee Knows: Exploring the Thoughts, Memories, and Personalities of Bees, page 2)

Bees use a wide range of senses to navigate through the world, sometimes in ways we can scarcely imagine. As a pollination ecologist with decades of research experience, Buchmann is an ideal guide to this world, at once both familiar and alien, in our own backyards.

Let’s start with that word “knows”. Buchmann cites his own experience and the work of many other researchers to make the case that bees form, memorize, and use mental maps; they can count; they feel pain; they can react to changes by enacting new plans, even when the plans will not bear fruit for most of a bee’s lifetime; and they can likely pass some cognitive tests that are beyond the ability of dogs and cats. All very impressive, for a group of insects whose tiny brains have hardly changed in structure for a hundred million years.

That brain must manage a range of sensory inputs. A bee’s eyes – far larger, proportionally, than ours – see in three colours: ultraviolet, green and blue. In some respects a bee’s vision is low-resolution, but it provides high-speed imagery which allows a bee to distinguish flowers, and other insects, while zooming through meadows at 20 kilometers/hour or faster.

Honey Bee at Borage flower. Buchmann writes: “Compared with the size of their heads, bees have immense faceted eyes. Their vision, however, is much coarser than our own; they can recognize the shape of a flower only from a few inches away. Bee color vision is shifted into the ultraviolet (UV) part of the spectrum, but they are blind to red colors. Astonishingly, they can recognize patterns of polarized light across an otherwise uniform blue sky.” (What a Bee Knows, p 47)

Nearly all species of bees are vegans, though they evolved from predatory wasps. These wasps dined on tiny thrips, which tended to come with a tasty dusting of nutritious pollen. Over time, the prevailing theory goes, proto-bees learned to stop chasing thrips and just go straight to flowers for meals of pollen and nectar. Today bees attach a tiny ball of “bee bread” – a mix of nectar and pollen – to each egg, and this supplies all the nutrients a hatching larva needs to develop into an adult flying bee.

Though most flowering plants need bees and/or other pollinators, and bees need flowers, the relationship is complex.

Bee laden with Yellow Salsify pollen. Buchmann writes: “we need to remember that plants and bees have very different evolutionary goals. Bees must collect pollen and nectar to feed their larvae and themselves. … Flowering plants want to minimize pollen wastage.” (p 77)

Flowers need to ensure someone will carry pollen from one flower to another of the same species. That service comes with costs:

“About 3 percent of a flowering plant’s total energy budget is invested in the production of nectar. Pollen, floral oils, resins, and floral scent molecules are even more costly for plants to produce in their strategies for attracting, keeping, and rewarding pollinating bees.” (p 86)

Bees will happily move from flower to flower, picking up and losing pollen along the way. But if a bee takes pollen from a salsify flower, visits a fleabane next, then goes to a dandelion, not many of the pollen grains will make it to the right blossoms to fertilize those flowers. From a flower’s point of view, it’s important that a bee visits mostly flowers of one species on a given day.

Bumblebee on catnip. Buchmann writes: “[Researchers found that] bumblebees had an intermediate level of floral constancy. Bumblebees are considered to be less faithful foragers than honey bees.” (p 137)

Richly attractive scents help flowers keep bees coming back. But how does the bee detect that scent? More to the point, where is a bee’s nose? Buchmann tells us:

“The honey bee’s paired antennae are her nose. Both antennae are covered with thousands of sensory hairs, most of which respond to airborne odors. … Bees’ antennae … provide directional information. Think of smelling in stereo. Their antennae can move independently; therefore, unlike us with our fixed noses, bees can get a three-dimensional impression of an odor field.” (p 59-60)

But if flowers smell so good they keep bees coming back to their species, and only their species, that brings up another problem for bees to solve. How can a bee ensure, before she zooms in for a landing, that another bee hasn’t recently made off with all the pollen?

The answer may be that bees, which pick up a positive electrostatic charge while flying, are able to sense changes in the electrostatic charges of flowers – allowing them to sense which flowers have been recently visited.

Honey Bee on Aster. Buchmann writes: “Plants typically bear flowers at or near their growing tips, and these tips develop the strongest negative charges over an entire plant’s surface. Positively charged flying bumblebees and likely other bees can detect the negative charges on flower surfaces. Across their petals, stamens, and styles, flowers possess fine patterns of differing electrostatic charges.” (p 68)

What a Bee Knows is stuffed with fascinating information. Why does a male bee (drone) have no father, though he does have a grandfather? (It’s because a queen bee lays some fertilized eggs and some unfertilized eggs. All male bees are born from the unfertilized eggs, while all female bees, including queens, are born from fertilized eggs.)

How do honey bees make precisely-engineered, energy- and material-efficient honeycomb cells from beeswax? (Partly through careful teamwork in producing, chewing, and depositing tiny flakes of wax – and partly through the emergent, self-organizing physical properties of beeswax when it is heated to a range of 37°–40°C.)

We might guess that for a scientist with a career in bee research, one of the most satisfying recurring phrases in the book is “we don’t know” – many mysteries remain for bee students to explore. I wish, though, that the book were not so reliant solely on the western scientific tradition, or at least that it had clearly acknowledged that many peoples around the world likely knew things about bees long before any western-trained scientist “discovered” them. Indeed, much knowledge about bees has likely vanished in recent centuries, along with the traditions and languages of many human cultures.

One other question kept coming to my mind as I read through the book: what about the widely reported problem of diminishing pollinator populations, which I can see even in my own back yard? As Buchmann reveals in the Epilogue, he too has been concerned about this problem – for decades. In 1996 he co-authored a book entitled The Forgotten Pollinators, and in the past twenty-seven years, “unfortunately, things have only gotten worse for pollinators.” (p 211)

For 100 million years, bees and their relatives have made the most of their marvelously capable sensory organs, and a relatively simple, efficient brain. They have adapted to changes in ecosystems while also engineering changes in those ecosystems.

The great majority of flowering plants, including those responsible for most human food, depend on bees and other pollinators – but by our actions we are rapidly killing them off.

As Buchmann puts it, “It’s simple: we need bees more than they need us.”

Will some species of bees find ways to survive, either in spite of us or after we are gone? Will we humans carry on with the practices that are driving so many species towards extinction, thereby promoting, also, our own extinction? The answer to those questions, too, is simple.

We don’t know.


Photos used for this review taken by Bart Hawkins Kreps in Port Darlington, Ontario. Image at top of page: Green Metallic Sweat Bee on Echinacea flower (full-screen image here).

A road map that misses some turns

A review of No Miracles Needed

Also published on Resilience

Mark Jacobson’s new book, greeted with hosannas by some leading environmentalists, is full of good ideas – but the whole is less than the sum of its parts.

No Miracles Needed, by Mark Z. Jacobson, published by Cambridge University Press, Feb 2023. 437 pages.

The book is No Miracles Needed: How Today’s Technology Can Save Our Climate and Clean Our Air (Cambridge University Press, Feb 2023).

Jacobson’s argument is both simple and sweeping: We can transition our entire global economy to renewable energy sources, using existing technologies, fast enough to reduce annual carbon dioxide emissions at least 80% by 2030, and 100% by 2050. Furthermore, we can do all this while avoiding any major economic disruption such as a drop in annual GDP growth, a rise in unemployment, or any drop in creature comforts. But wait – there’s more! In so doing, we will also completely eliminate pollution.

Just don’t tell Jacobson that this future sounds miraculous.

The energy transition technologies we need – based on Wind, Water and Solar power, abbreviated to WWS – are already commercially available, Jacobson insists. He contrasts the technologies he favors with “miracle technologies” such as geoengineering, Carbon Capture, Utilization and Storage (CCUS), or Direct Air Capture of carbon dioxide (DAC). These latter technologies, he argues, are unneeded, unproven, expensive, and will take far too long to implement at scale; we shouldn’t waste our time on such schemes.

The final chapter helps the reader understand both the hits and misses of the previous chapters. In “My Journey”, a teenage Jacobson visits the smog-cloaked cities of southern California and quickly becomes aware of the damaging health effects of air pollution:

“I decided then and there, that when I grew up, I wanted to understand and try to solve this avoidable air pollution problem, which affects so many people. I knew what I wanted to do for my career.” (No Miracles Needed, page 342)

His early academic work focused on the damage air pollution does to human health. Over time, he realized that the problem of global warming emissions was closely related. The increasingly sophisticated computer models he developed were designed to elucidate the interplay between greenhouse gas emissions, and the particulate emissions from combustion that cause so much sickness and death.

These modeling efforts won increasing recognition and attracted a range of expert collaborators. Over the past 20 years, Jacobson’s work moved beyond academia into political advocacy. “My Journey” describes the growth of an organization capable of developing detailed energy transition plans for presentation to US governors, senators, and CEOs of major tech companies. Eventually that led to Jacobson’s publication of transition road maps for states, countries, and the globe – road maps that have been widely praised and widely criticized.

In my reading, Jacobson’s personal journey casts light on key features of No Miracles Needed in two ways. First, there is a singular focus on air pollution, to the omission or dismissal of other types of pollution. Second, it’s not likely Jacobson would have received repeat audiences with leading politicians and business people had he challenged the mainstream orthodox view that GDP can and must continue to grow.

Jacobson’s road map, then, is based on the assumption that all consumer products and services will continue to be produced in steadily growing quantities – but they’ll all be WWS based.

Does he prove that a rapid transition is a realistic scenario? Not in this book.

Hits and misses

Jacobson gives us brief but marvelously lucid descriptions of many WWS generating technologies, plus storage technologies that will smooth the intermittent supply of wind- and sun-based energy. He also goes into considerable detail about the chemistry of solar panels, the physics of electricity generation, and the amount of energy loss associated with each type of storage and transmission.

These sections are aimed at a lay readership and they succeed admirably. There is more background detail, however, than is needed to explain the book’s central thesis.

The transition road map, on the other hand, is not explained in much detail. There are many references to the scientific papers in which Jacobson outlines his road maps. Readers of No Miracles Needed can take his word that the model is a suitable representation, or they can find and read his articles in academic journals – but they don’t get the needed details in this book.

Jacobson explains why, at the level of a device such as a car or a heat pump, electric energy is far more efficient in producing motion or heat than is an internal combustion engine or a gas furnace. Less convincingly, he argues that electric technologies are far more energy-efficient than combustion for the production of industrial heat – while nevertheless conceding that some WWS technologies needed for industrial heat are, at best, in prototype stages.

Yet Jacobson expresses serene confidence that hard-to-electrify technologies, including some industrial processes and long-haul aviation, will successfully transition to WWS processes – perhaps including green hydrogen fuel cells, but not hydrogen combustion – by 2035.

The confidence in complex global projections is often jarring. For example, Jacobson tells us repeatedly that the fully WWS energy system of 2050 “reduces end-use energy requirements by 56.4 percent” (pages 271 and 275).1 The expressed precision notwithstanding, nobody yet knows the precise mix of storage types, generation types, and transmission types, which have various degrees of energy efficiency, that will constitute a future WWS global system. What we should take from Jacobson’s statements is that, based on the subset of factors and assumptions – from an almost infinitely complex global energy ecosystem – which Jacobson has included in his model, the calculated outcome is a 56% end-use energy reduction.

Canada’s Premiers visit Muskrat Falls dam construction site, 2015. Photo courtesy of Government of Newfoundland and Labrador; CC BY-NC-ND 2.0 license, via Flickr.

Also jarring is the almost total disregard of any type of pollution other than that which comes from fossil fuel combustion. Jacobson does briefly mention the particles that grind off the tires of all vehicles, including typically heavier EVs. But rather than concede that these particles are toxic and can harm human and ecosystem health, he merely notes that the relatively large particles “do not penetrate so deep into people’s lungs as combustion particles do.” (page 49)

He claims, without elaboration, that “Environmental damage due to lithium mining can be averted almost entirely.” (page 64) Near the end of the book, he states that “In a 2050 100 percent WWS world, WWS energy private costs equal WWS energy social costs because WWS eliminates all health and climate costs associated with energy.” (page 311; emphasis mine)

In a culture which holds continual economic growth to be sacred, it would be convenient to believe that business-as-usual can continue through 2050, with the only change required being a switch to WWS energy.

Imagine, then, that climate-changing emissions were the only critical flaw in the global economic system. Given that assumption, is Jacobson’s timetable for transition plausible?

No. First, Jacobson proposes that “by 2022”, no new power plants be built that use coal, methane, oil or biomass combustion; and that all new appliances for heating, drying and cooking in the residential and commercial sectors “should be powered by electricity, direct heat, and/or district heating.” (page 319) That deadline has passed, and products that rely on combustion continue to be made and sold. It is a mystery why Jacobson or his editors would retain a 2022 transition deadline in a book slated for publication in 2023.

Other sections of the timeline also strain credulity. “By 2023”, the timeline says, all new vehicles in the following categories should be either electric or hydrogen fuel-cell: rail locomotives, buses, nonroad vehicles for construction and agriculture, and light-duty on-road vehicles. This is now possible only in a purely theoretical sense. Batteries adequate for powering heavy-duty locomotives and tractors are not yet in production. Even if they were in production, and that production could be scaled up within a year, the charging infrastructure needed to quickly recharge massive tractor batteries could not be installed, almost overnight, at large farms or remote construction sites around the world.

While electric cars, pick-ups and vans now roll off assembly lines, the global auto industry is not even close to being ready to switch the entire product lineup to EV only. Unless, of course, they were to cut back auto production by 75% or more until production of EV motors, batteries, and charging equipment can scale up. Whether you think that’s a frightening prospect or a great idea, a drastic shrinkage in the auto industry would be a dramatic departure from a business-as-usual scenario.

What’s the harm, though, if Jacobson’s ambitious timeline is merely pushed back by two or three years?

If we were having this discussion in 2000 or 2010, pushing back the timeline by a few years would not be as consequential. But as Jacobson explains effectively in his outline of the climate crisis, we now need both drastic and immediate actions to keep cumulative carbon emissions low enough to avoid global climate catastrophe. His timeline is constructed with the goal of reducing carbon emissions by 80% by 2030, not because those are nice round figures, but because he (and many others) calculate that reductions of that scale and rapidity are truly needed. Even one or two more years of emissions at current rates may make the 1.5°C warming limit an impossible dream.

The picture is further complicated by a factor Jacobson mentions only in passing. He writes,

“During the transition, fossil fuels, bioenergy, and existing WWS technologies are needed to produce the new WWS infrastructure. … [A]s the fraction of WWS energy increases, conventional energy generation used to produce WWS infrastructure decreases, ultimately to zero. … In sum, the time-dependent transition to WWS infrastructure may result in a temporary increase in emissions before such emissions are eliminated.” (page 321; emphasis mine)

Others have explained this “temporary increase in emissions” at greater length. Assuming, as Jacobson does, that a “business-as-usual” economy keeps growing, the vast majority of goods and services will continue, in the short term, to be produced and/or operated using fossil fuels. If we embark on an intensive, global-scale, rapid build-out of WWS infrastructures at the same time, a substantial increment in fossil fuels will be needed to power all the additional mines, smelters, factories, container ships, trucks and cranes which build and install the myriad elements of a new energy infrastructure. If all goes well, that new energy infrastructure will eventually be large enough to power its own further growth, as well as to power production of all other goods and services that now rely on fossil energy.

Unless we accept a substantial decrease in non-transition-related industrial activity, however, the road that takes us to a full WWS destination must route us through a period of increased fossil fuel use and increased greenhouse gas emissions.

It would be great if Jacobson modeled this increase to give us some guidance on how big this emissions bump might be, how long it might last, and therefore how important it might be to cumulative atmospheric carbon concentrations. There is no suggestion in this book that he has done that modeling. What should be clear, however, is that any bump in emissions at this late date increases the danger of moving past a climate tipping point – and this danger increases dramatically with every passing year.


1In a tl;dr version of No Miracles Needed published recently in The Guardian, Jacobson says “Worldwide, in fact, the energy that people use goes down by over 56% with a WWS system.” (“‘No miracles needed’: Prof Mark Jacobson on how wind, sun and water can power the world”, 23 January 2023)

 


Photo at top of page by Romain Guy, 2009; public domain, CC0 1.0 license, via Flickr.

Profits of Utopia

Also published on Resilience

What led to the twentieth century’s rapid economic growth? And what are the prospects for that kind of growth to return?

Slouching Towards Utopia: An Economic History of the Twentieth Century was published by Basic Books, Sept 2022; 605 pages.

Taken together, two new books go a long way toward answering the first of those questions.

Bradford J. DeLong intends his Slouching Towards Utopia to be a “grand narrative” of what he calls “the long twentieth century”.

Mark Stoll summarizes his book Profit as “a history of capitalism that seeks to explain both how capitalism changed the natural world and how the environment shaped capitalism.”

By far the longer of the two books, DeLong’s tome primarily concerns the years from 1870 to 2010. Stoll’s slimmer volume goes back thousands of years, though the bulk of his coverage concerns the past seven centuries.

Both books are well organized and well written. Both make valuable contributions to an understanding of our current situation. In my opinion Stoll casts a clearer light on the key problems we now face.

Although neither book explicitly addresses the prospects for future prosperity, Stoll’s concluding verdict offers a faint hope.

Let’s start with Slouching Towards Utopia. Bradford J. DeLong, a professor of economics at the University of California, Berkeley, describes “the long twentieth century” – from 1870 to 2010 – as “the first century in which the most important historical thread was what anyone would call the economic one, for it was the century that saw us end our near-universal dire material poverty.” (Slouching Towards Utopia, page 2; emphasis mine) Unfortunately that is as close as he gets in this book to defining just what he means by “economics”.

On the other hand he does tell us what “political economics” means:

“There is a big difference between the economic and the political economic. The latter term refers to the methods by which people collectively decide how they are going to organize the rules of the game within which economic life takes place.” (page 85; emphasis in original)

Discussion of the political economics of the long twentieth century, in my opinion, accounts for most of the bulk and most of the value in this book.

DeLong weaves into his narratives frequent – but also clear and concise – explanations of the work of John Maynard Keynes, Friedrich Hayek, and Karl Polanyi. These three very different theorists responded to, and helped bring about, major changes in “the rules of the game within which economic life takes place”.

DeLong uses their work to good effect in explaining how policymakers and economic elites navigated and tried to influence the changing currents of market fundamentalism, authoritarian collectivism, social democracy, the New Deal, and neoliberalism.

With each swing of the political economic pendulum, the industrial, capitalist societies either slowed, or sped up, the advance “towards utopia” – a society in which all people, regardless of class, race, or sex, enjoy prosperity, human rights and a reasonably fair share of the society’s wealth.

DeLong and Stoll present similar perspectives on the “Thirty Glorious Years” from the mid-1940s to the mid-1970s, and a similarly dim view of the widespread turn to neoliberalism since then.

They also agree that while a “market economy” plays an important role in generating prosperity, a “market society” rapidly veers into disaster. That is because the market economy, left to its own devices, exacerbates inequalities so severely that social cohesion falls apart. The market must be governed by social democracy, and not the other way around.

DeLong provides one tragic example:

“With unequal distribution, a market economy will generate extraordinarily cruel outcomes. If my wealth consists entirely of my ability to work with my hands in someone else’s fields, and if the rains do not come, so that my ability to work with my hands has no productive market value, then the market will starve me to death – as it did to millions of people in Bengal in 1942 and 1943.” (Slouching Towards Utopia, p 332)

Profit: An Environmental History was published by Polity Books, January 2023; 280 pages.

In DeLong’s and Stoll’s narratives, during the period following World War II “the rules of the economic game” in industrialized countries were set in a way that promoted widespread prosperity and rising wealth for nearly all classes, without a concomitant rise in inequality.

As a result, economic growth during that period was far higher than it had been from 1870 to 1940, before the widespread influence of social democracy, and far higher than it has been since about 1975 during the neoliberal era.

During the Thirty Glorious Years, incomes from the factory floor to the CEO’s office rose at roughly the same rate. Public funding of advanced education, an income for retired workers, unemployment insurance, strong labor unions, and (in countries more civilized than the US) public health insurance – these social democratic features ensured that a large and growing number of people could continue to buy the ever-increasing output of the consumer economy. High marginal tax rates ensured that government war debts would be retired without cutting off the purchasing power of lower and middle classes.

Stoll explains that long-time General Motors chairman Alfred Sloan played a key role in the transition to a consumer economy. Under his leadership GM pioneered a line-up ranging from economy cars to luxury cars; the practice of regularly introducing new models whose primary features were differences in fashion; heavy spending on advertising to promote the constantly-changing lineup; and auto financing which allowed consumers to buy new cars without first saving up the purchase price.

By then the world’s largest corporation, GM flourished during the social democratic heyday of the Thirty Glorious Years. But in Stoll’s narrative, executives like Alfred Sloan couldn’t resist meddling with the very conditions that had made their version of capitalism so successful:

“There was a worm in the apple of postwar prosperity, growing out of sight until it appeared in triumph in the late 1970s. The regulations and government activism of the New Deal … so alarmed certain wealthy corporate leaders, Alfred Sloan among them, that they began to develop a propaganda network to promote weak government and low taxes.” (Profit, page 176)

This propaganda network achieved hegemony in the 1980s as Ronald Reagan and Margaret Thatcher took the helm in the US and the UK. DeLong and Stoll concur that the victory of neoliberalism resulted in a substantial drop in the economic growth rate, along with a rapid growth in inequality. As DeLong puts it, the previous generation’s swift march towards utopia slowed to a crawl.

DeLong and Stoll, then, share a great deal when it comes to political economics – the political rules that govern how economic wealth is distributed.

On the question of how that economic wealth is generated, however, DeLong is weak and Stoll makes a better guide.

DeLong introduces his discussion of the long twentieth century with the observation that between 1870 and 2010, economic growth far outstripped population growth for the first time in human history. What led to that economic acceleration? There were three key factors, DeLong says:

“Things changed starting around 1870. Then we got the institutions for organization and research and the technologies – we got full globalization, the industrial research laboratory, and the modern corporation. These were the keys. These unlocked the gate that had previously kept humanity in dire poverty.” (Slouching Towards Utopia, p. 3)

Thomas Edison’s research lab in West Orange, New Jersey. Cropped from photo by Anita Gould, 2010, CC BY-SA 2.0 license, via Flickr.

These may have been necessary conditions for a burst of economic growth, but were they sufficient? If they were sufficient, then why should we believe that the long twentieth century is conclusively over? Since DeLong’s three keys are still in place, and if only the misguided leadership of neoliberalism has spoiled the party, would it not be possible that a swing of the political economic pendulum could restore the conditions for rapid economic growth?

Indeed, in one of DeLong’s few remarks directly addressing the future he says “there is every reason to believe prosperity will continue to grow at an exponential rate in the centuries to come.” (page 11)

Stoll, by contrast, deals with the economy as inescapably embedded in the natural environment, and he emphasizes the revolutionary leap forward in energy production in the second half of the 19th century.

Energy and environment

Stoll’s title and subtitle are apt – Profit: An Environmental History. He says that “economic activity has always degraded environments” (p. 6) and he provides examples from ancient history as well as from the present.

Economic development in this presentation is “the long human endeavor to use resources more intensively.” (p. 7) In every era, tapping energy sources has been key.

European civilization reached for the resources of other regions in the late medieval era. Technological developments such as improved ocean-going vessels allowed incipient imperialism, but additional energy sources were also essential. Stoll explains that the Venetian, Genoese and Portuguese traders who pioneered a new stage of capitalism all relied in part on the slave trade:

“By the late fifteenth century, slaves made up over ten percent of the population of Lisbon, Seville, Barcelona, and Valencia and remained common in southern coastal Portugal and Spain for another century or two.” (p. 40)

The slave trade went into high gear after Columbus chanced across the Americas. That is because, even after they had confiscated two huge continents rich in resources, European imperial powers still relied on the consumption of other humans’ lives as an economic input:

“Free-labor colonies all failed to make much profit and most failed altogether. Colonizers resorted to slavery to people colonies and make them pay. For this reason Africans would outnumber Europeans in the Americas until the 1840s.” (p. 47)

While the conditions of slavery in Brazil were “appallingly brutal”, Stoll writes, Northern Europeans made slavery even more severe. As a result “Conditions in slave plantations were so grueling and harsh that birthrates trailed deaths in most European plantation colonies.” (p 49)

‘Shipping Sugar’ from William Clark’s ‘Ten views in the island of Antigua’ (Thomas Clay, London, 1823). Public domain image via Picryl.com.

Clearly, then, huge numbers of enslaved workers played a major and fundamental role in rising European wealth between 1500 and 1800. It is perhaps no coincidence that in the 19th century, as slavery was being outlawed in colonial empires, European industries were learning how to make effective use of a new energy source: coal. By the end of that century, the fossil fuel economy had begun its meteoric climb.

Rapid increases in scientific knowledge, aided by such organizations as modern research laboratories, certainly played a role in commercializing methods of harnessing the energy in coal and oil. Yet this technological knowhow on its own, without abundant quantities of readily-extracted coal and oil, would not have led to an explosion of economic growth.

Where DeLong is content to list “three keys to economic growth” that omit fossil fuels, Stoll adds a fourth key – not merely the technology to use fossil fuels, but the material availability of those fuels.

By 1900, coal-powered engines had transformed factories, mines, ocean transportation via steamships, land transportation via railroads, and the beginnings of electrical grids. The machinery of industry could supply more goods than most people had ever thought they might want, a development Stoll explains as a transition from an industrial economy to a consumer economy.

Coal, however, could not have powered the car culture that swept across North America before World War II, and across the rest of the industrialized world after the War. To shift the consumer economy into overdrive, an even richer and more flexible energy source was needed: petroleum.

By 1972, Stoll notes, the global demand for petroleum was five-and-a-half times as great as in 1949.

Like DeLong, Stoll marks the high point of the economic growth rate at about 1970. And like DeLong, he sees the onset of neoliberalism as one factor slowing and eventually stalling the consumer economy. Unlike DeLong, however, Stoll also emphasizes the importance of energy sources in this trajectory. In the period leading up to 1970 net energy availability was skyrocketing, making rapid economic growth achievable. After 1970 net energy availability grew more slowly, and increasing amounts of energy had to be used up in the process of finding and extracting energy. In other words, the Energy Return on Energy Invested, which increased rapidly between 1870 and 1970, peaked and started to decline over recent decades.

This gradual turnaround in net energy, along with the pervasive influence of neoliberal ideologies, contributed to the faltering of economic growth. The rich got richer at an even faster pace, but most of society gained little or no ground.

Stoll pays close attention to the kind of resources needed to produce economic growth – the inputs. He also emphasizes the anti-goods that our economies turn out on the other end, be they toxic wastes from mining and smelting, petroleum spills, smog, pervasive plastic garbage, or climate-disrupting carbon dioxide emissions.

Stoll writes, 

“The relentless, rising torrent of consumer goods that gives Amazon.com its apt name places unabating demand on extractive industries for resources and energy. Another ‘Amazon River’ of waste flows into the air, water, and land.” (Profit, p. 197)

Can the juggernaut be turned around before it destroys both our society and our ecological life-support systems, and can a fair, sustainable economy take its place? On this question, Stoll’s generally excellent book disappoints.

While he appears to criticize the late-twentieth century environmental movement for not daring to challenge capitalism itself, in Profit’s closing pages he throws cold water on any notion that capitalism could be replaced.

“Capitalism … is rooted in human nature and human history. These deep roots, some of which go back to our remotest ancestors, make capitalism resilient and adaptable to time and circumstances, so that the capitalism of one time and place is not that of another. These roots also make it extraordinarily difficult to replace.” (Profit, p. 253)

He writes that “however much it might spare wildlife and clean the land, water, and air, we stop the machinery of consumer capitalism at our peril.” (p. 254) If we are to avoid terrible social and economic unrest and suffering, we must accept that “we are captives on this accelerating merry-go-round of consumer capitalism.” (p. 256)

It’s essential to curb the power of big corporations and switch to renewable energy sources, he says. But in a concluding hint at the so-far non-existent phenomenon of “absolute decoupling”, he writes,

“The only requirement to keep consumer capitalism running is to keep as much money flowing into as many pockets as possible. The challenge may be to do so with as little demand for resources as possible.” (Profit, p. 256)

Are all these transformations possible, and can they happen in time? Stoll’s final paragraph says “We can only hope it will be possible.” Given the rest of his compelling narrative, that seems a faint hope indeed.

* * *

Coming next: another new book approaches the entanglements of environment and economics with a very different perspective, telling us with cheerful certainty that we can indeed switch the industrial economy to clean, renewable energies, rapidly, fully, and with no miracles needed.



Image at top of page: ‘The Express Train’, by Charles Parsons, 1859, published by Currier and Ives. Image donated to Wikimedia Commons by Metropolitan Museum of Art.

 

“Getting to zero” is a lousy goal

Also published on Resilience

In an alternate reality, gradually moving toward a zero-carbon-emission economy and arriving there in two or three decades would be a laudable accomplishment.

In an alternate reality –   for example, the reality that might result from turning the clock back to 1975 – a twenty-five year process of eliminating all anthropogenic greenhouse gas emissions could avert a climate crisis.

But in our reality in 2022, with far too much carbon dioxide already flowing through the atmosphere and the climate crisis worsening every year, knowingly emitting more greenhouse gases for another two decades is a shockingly cavalier dance with destruction.

This understanding of the climate crisis guides the work of Bruce King and Chris Magwood. Their own field of construction, they write, can, and indeed must, become a net storer of carbon – and not by 2050 but rather by the early 2030s.

Their new book Build Beyond Zero (Island Press, June 2022) puts the focus on so-called “embodied emissions”, also known more clearly as “upfront emissions”. The construction industry accounts for up to 15 percent of global warming emissions, and most of the emissions occur during manufacturing of building materials.

No matter how parsimonious new buildings might be with energy during their operating lifetimes, an upfront burst of carbon emissions has global warming impact when we can least afford it: right away. “A ton of emissions released today,” write King and Magwood, “has far more climate impact than a ton of emissions released a decade from now.”

Emissions released today, they emphasize, push us immediately closer to climate crisis tipping points, and emissions released today will continue to heat the globe throughout the life of a building.

Their goal, then, is to push the construction industry as a whole to grapple with the crucial issue of upfront emissions. The construction industry can, they believe, rapidly transform into a very significant net sequesterer of carbon emissions.

That goal is expressed in their “15 x 50” graph.

By 2050, Bruce King and Chris Magwood say, the construction industry can and should sequester a net 15 gigatonnes of carbon dioxide annually. Graphic from Build Beyond Zero, Island Press, 2022, page 238.

A wide range of building materials are now or can become net storers of carbon – and those that can’t must be rapidly phased out of production or minimized.

The bulk of Build Beyond Zero consists of careful examination of major categories of building materials, plus consideration of different stages including construction, demolition, or disassembly and re-use.

Concrete – by far the largest category of building material by weight and by current emissions – is a major focus of research. King and Magwood outline many methods that are already available to reduce the carbon intensity of concrete production, as well as potential methods that could allow the net storage of carbon within concrete.

Equally important, though, are construction materials that can reduce and in some cases eliminate the use of concrete – for example, adobe and rammed earth walls and floors.

By far the largest share of carbon sequestration in buildings could come from biogenic sources ranging from timber to straw to new materials produced by fungal mycelia or algae.

Harvesting homes

“Tall timber” is a popular buzzphrase for building methods that can sequester carbon within building structures, but King and Magwood are more excited about much smaller plant materials such as wheat straw or rice hulls. Their discussion of the pros and cons of increased use of wood products is enlightening.

“Assessing the degree of carbon storage offered by timber products is not at all straightforward. Far from being the poster child for carbon-storing building, the use of timber in buildings requires a very nuanced understanding of supply chain issues and forest-level carbon stocks in order to be certain we’re not doing harm in the process of trying to do good.” (Build Beyond Zero, page 111)

First, when trees are cut down typically only half of the above-ground biomass makes it into building products; the rest decomposes and otherwise emits its stored carbon back into the atmosphere. Second, particularly where a large stand of trees is clear-cut and the ground is exposed to the elements, much of the below-ground stored carbon also returns to the atmosphere. Third, even once a replacement stand of trees has grown up, a monoculture stand seldom stores as much carbon as the original forest did, and the monoculture is also a big loss for biodiversity.

To the extent that we do harvest trees for construction, then, “We need to take responsibility for ensuring that we are growing forests at a rate that far exceeds our removals from them. Notice that we are talking about growing forests and not just planting trees.” (page 115)

This careful nuance is not always evident in their discussion of agricultural residues, in my opinion. The “15 x 50” goal includes the conversion of huge quantities of so-called “residues” – wheat straw, rice hulls, and sunflower stalk pith, to give a few examples – into long-lasting building materials. But what effects would this have on the long-term health of agricultural soils, if most of these so-called residues are routinely removed from the agricultural cycle rather than being returned to the soil? What level of such total-plant harvesting is truly sustainable?

Yet there is obvious appeal in the use of more fast-growing small plants as building material. Straw can sequester about twice as much carbon per hectare per year as forests do, while “the carbon sequestration and storage efficiency of hemp biomass is an order of magnitude higher than that of trees or straw.” (page 99)

There are many existing methods to turn small plants into building materials, ranging from structural supports to insulation to long-lasting, non-toxic finishes. It is reasonable to hope for the creation of many more such building materials, if industry can develop new carbon-emissions-free adhesives to help shape fibers and particles into a myriad of shapes. King and Magwood note that existing industrial practices are likely to act as hurdles in this quest:

“Nature provides plenty of examples and clues for making nontoxic bioadhesives in species such as mussels and spiders. However, the introduction and scaling of these potentially game-changing materials is so far hampered in the same way as bioplastics: by an extremely risk-averse construction industry and by a petrochemical industry keen to keep and expand market share ….” (page 162) 

Straw-bale construction project in Australia, 2012. Photo by Brett and Sue Coulstock, accessed via Flickr under Creative Commons license.

We don’t have 30 more years

Build Beyond Zero is a comprehensive and clear overview of construction practices and their potential climate impact in the near future. It does not, however, provide any “how-to” lessons for would-be builders or renovators to use in their own projects. For that purpose, both King and Magwood have already published extensively in books such as Essential Hempcrete Construction: The Complete Step-by-Step Guide; Essential Prefab Straw Bale Construction: The Complete Step-by-Step Guide; and Buildings of Earth and Straw: Structural Design for Rammed Earth and Straw Bale Architecture.

In Build Beyond Zero, King and Magwood offer an essential manifesto for anyone involved in commissioning or carrying out construction or renovation, anyone involved in the production of building materials, anyone involved in the establishment or modification of building codes, and anyone involved in construction education.

It’s time for everyone involved with construction to become climate-literate, and to realize that upfront carbon emissions from buildings are as important as, if not more important than, operating emissions during the buildings’ lifetimes. It’s time to realize that construction, perhaps more than most industries, has the capability of going beyond zero to become a significant net storer of carbon.

That opportunity represents an urgent task:

“It has taken more than 30 years for energy efficiency to approach a central role in building sector education …. We can’t wait that long to teach people how to make carbon-storing buildings. If we follow the usual path, the climate will be long past repair by the time enough designers and builders have learned how to fix it.” (page 173)

With global greenhouse gases already at catastrophic levels, we have dug ourselves into a deep hole and it’s nowhere near enough to gradually slow down and then stop digging deeper – we also need to fill that hole, ASAP.

As Build Beyond Zero puts it, “‘Getting to zero,’ to repeat one more time, is a lousy goal, or anyway incomplete. You make a mess, you clean it up, as my mother would say. You don’t just stop messing, you also start cleaning.”


Photo at top of page: Limestone quarry and cement kiln, Bowmanville Ontario, winter 2016.

Segregation, block by block

Also published on Resilience

Is the purpose of zoning to ensure that towns and cities develop according to a rational plan? Does zoning protect the natural environment? Does zoning help promote affordable housing? Does zoning protect residents from the air pollution, noise pollution, and dangers from industrial complexes or busy highways?

To begin to answer these questions, consider this example from M. Nolan Gray’s new book Arbitrary Lines:

“It remains zoning ‘best practice’ that single-family residential districts should be ‘buffered’ from bothersome industrial and commercial districts by multifamily residential districts. This reflects zoning’s modus operandi of protecting single-family houses at all costs, but it makes no sense from a land-use compatibility perspective. While a handful of generally more affluent homeowners may be better off, it comes at the cost of many hundreds more less affluent residents suffering a lower quality of life.” (M. Nolan Gray, page 138)

Arbitrary Lines by M. Nolan Gray is published by Island Press, June 2022.

The intensification of inequality, Gray argues, is not an inadvertent side-effect of zoning, but its central purpose.

If you are interested in affordable housing, housing equity, environmental justice, reduction of carbon emissions, adequate public transit, or streets that are safe for walking and cycling, Arbitrary Lines is an excellent resource in understanding how American cities got the way they are and how they might be changed for the better. (The book doesn’t discuss Canada, but much of Gray’s argument seems readily applicable to Canadian cities and suburbs.)

In part one and part two of this series, we looked at the complex matrix of causes that explain why “accidents”, far from being randomly distributed, happen disproportionately to disadvantaged people. In There Are No Accidents Jessie Singer writes, “Accidents are the predictable result of unequal power in every form – physical and systemic. Across the United States, all the places where a person is most likely to die by accident are poor. America’s safest corners are all wealthy.” (Singer, page 13)

Gray does not deal directly with traffic accidents, or mortality due in whole or part to contaminants from pollution sources close to poor neighbourhoods. His lucid explanation of zoning, however, helps us understand one key mechanism by which disadvantaged people are confined to unhealthy, dangerous, unpleasant places to live.

‘Technocratic apartheid’

Zoning codes in the US today make no mention of race, but Gray traces the history of zoning back to explicitly racist goals. In the early 20th century, he says, zoning laws were adopted most commonly in southern cities for the express purposes of enforcing racial segregation. As courts became less tolerant of open racism, they nonetheless put a stamp of approval on economic segregation. Given the skewed distribution of wealth, economic segregation usually resulted in or preserved de facto racial segregation as well.

The central feature and overriding purpose of zoning was to restrict the best housing districts to affluent people. Zoning accomplishes this in two ways. First, in large areas of cities and especially of suburbs the only housing allowed is single-family housing, one house per lot. Second, minimum lot sizes and minimum floor space sizes ensure that homes are larger and more expensive than they would be if left to the “free market”.

The result, across vast swaths of urban America, is that low-density residential areas have been mandated to remain low-density. People who can’t afford to buy a house, but have the means to rent an apartment, are unsubtly told to look in other parts of town.

Gray terms this segregation “a kind of technocratic apartheid,” and notes that “Combined with other planning initiatives, zoning largely succeeded in preserving segregation where it existed and instituting segregation where it didn’t.” (Gray, page 81) He cites one study that found “over 80 percent of all large metropolitan areas in the US were more racially segregated in 2019 than they were in 1990. Today, racial segregation is most acute not in the South but in the Midwest and mid-Atlantic regions.” (Gray, page 169)

Public transit? The numbers don’t add up.

From an environmental and transportation equity point of view, a major effect of zoning is that it makes good public transit unfeasible in most urban areas. Gray explains:

“There is a reasonable consensus among transportation planners that a city needs densities of at least seven dwelling units per acre to support the absolute baseline of transit: a bus that stops every thirty minutes. To get more reliable service, like bus rapid transit or light-rail service, a city needs … approximately fifteen units per acre. The standard detached single-family residential district—which forms the basis of zoning and remains mapped in the vast majority of most cities—supports a maximum density of approximately five dwelling units per acre. That is to say, zoning makes efficient transit effectively illegal in large swaths of our cities, to say nothing of our suburbs.” (Gray, page 101)
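Gray’s density thresholds amount to a simple rule of thumb. As a minimal sketch, the snippet below encodes them – the cut-offs are the ones Gray cites, while the function name and return strings are illustrative choices of mine:

```python
# Classify transit feasibility from residential density (dwelling units per acre),
# using the thresholds quoted from Gray. The labels are paraphrases, not his words.
def transit_feasibility(units_per_acre: float) -> str:
    if units_per_acre >= 15:
        return "supports bus rapid transit or light rail"
    if units_per_acre >= 7:
        return "supports baseline bus service (every 30 minutes)"
    return "too sparse for efficient transit"

# A standard detached single-family district tops out near 5 units per acre:
print(transit_feasibility(5))  # too sparse for efficient transit
```

Put this way, the point is stark: the most commonly mapped zoning district sits below even the lowest rung of transit viability.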

Coupled with the nearly ubiquitous adoption of rules mandating more parking space than would otherwise be built, the single-family housing and minimum lot size provisions of zoning are a disaster both for affordable housing and for environmentally-friendly housing. Typical American zoning, Gray says, “assumes universal car ownership and prohibits efficient apartment living. But it also just plain wastes space: if you didn’t know any better, you might be forgiven for thinking that your local zoning ordinance was carefully calibrated to use up as much land as possible.” (Gray, page 96)

Zoning regimes came into wide use in the mid-twentieth century and became notably stricter in the 1970s. In Gray’s view the current housing affordability crisis is the result of cities spending “the past fifty years using zoning to prevent new housing supply from meeting demand.” This succeeded in boosting values of properties owned by the already affluent, but eventually housing affordability became a problem not only for those at the bottom of the housing market but for most Americans. That is one impetus, Gray explains, for a recent movement to curb the worst features of zoning. While this movement is a welcome development, Gray argues zoning should be abolished, not merely reformed. Near the end of Arbitrary Lines, he explains many other planning and regulatory frameworks that can do much more good and much less harm than zoning.

There is one part of his argument that I found shaky. He believes that the abolition of zoning will restore economic growth by promoting movement to the “most productive” cities, and that “there is no reason to believe that there is an upper bound to the potential innovation that could come from growing cities.” (Gray, page 72) At root the argument is based on his acceptance that income is “a useful proxy for productivity” – a dubious proposition in my view. That issue aside, Arbitrary Lines is well researched, well illustrated, well reasoned and well written.

The book is detailed and wide-ranging, but unlike a typical big-city zoning document it is never boring or obscure. For environmentalists and urban justice activists Arbitrary Lines is highly recommended.


Image at top of page: detail from Winnipeg zoning map, 1947, accessed via Wikimedia Commons.

The high cost of speed

Also published on Resilience

Imagine that we used a really crazy method to establish speed limits. We could start by recording the speeds of all drivers on a given stretch of roadway. Then, without any clear evidence of what a safe speed might be, we might argue that the great majority of people drive too fast, and therefore the maximum legal speed will be set as that speed exceeded by 85 percent of drivers. Only the slowest 15 percent of drivers, in this scenario, would be considered to be driving within the legal limit.

If you have a passing familiarity with the legal framework of car culture, you will recognize the above as a simple inversion of the common 85th percentile rule used by traffic engineers throughout North America. Following this guideline, driver speeds are recorded, engineers determine the speed exceeded by only 15 percent of the drivers, and that speed is deemed an appropriate speed limit for the given roadway. All the other drivers – 85 percent – will then be driving within the speed limit.
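The arithmetic behind the rule is simple enough to sketch. Here is an illustrative calculation, with invented speed observations; the function name and the nearest-rank percentile method are my assumptions for the example:

```python
import math

# Illustrative sketch of the 85th-percentile rule described above.
# All recorded speeds here are invented for the example.
def percentile_speed(speeds, percentile=85):
    """Speed (nearest-rank method) that `percentile` percent of drivers do not exceed."""
    ordered = sorted(speeds)
    rank = math.ceil(percentile / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical recorded speeds (km/h) on one stretch of road:
observed = [52, 55, 58, 60, 61, 63, 64, 65, 67, 72]
print(percentile_speed(observed))  # 67 – the speed only ~15% of drivers exceed
```

Note what the calculation does not include: no crash data, no pedestrian counts, no roadway geometry – only the behaviour of the drivers themselves.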

Two recent books argue that the 85th percentile guideline is as arbitrary and misguided as it sounds. In There Are No Accidents, (Simon & Schuster, 2022; reviewed here last week), Jessie Singer summarizes the 85th percentile rule this way:

“Most speed limits are not based on physics or crash test expertise but simply the upper limit of what most amateur drivers feel is safe. A speed limit is the perceived safe speed of a road, not the actual risk of traveling that speed on that road.” (Singer, page 95)

Singer draws on the work of Eric Dumbaugh, who has a PhD in civil engineering and teaches urban planning at Florida Atlantic University. Dumbaugh has analyzed tens of thousands of traffic crashes in urban environments in the US. He concluded that the traffic engineering guidelines used for decades are based on false information, are often misapplied, and result in dangerous conditions on urban roadways. Absent physical evidence of what constitutes a safe driving speed, engineers simply assume that most drivers drive at a safe speed. Dumbaugh doesn’t mince words:

“Traffic engineering is a fraud discipline. It presumes knowledge on road safety that it doesn’t have and it educates people generation after generation on information that is incorrect.” (quoted by Singer, page 96)

The dangerous conditions on roadways have contributed to thirty thousand or more deaths in the US every year since 1946. But the engineers who design the roadways cannot be faulted, so long as they have applied the rules passed down to them in standard traffic engineering manuals.

Confessions of a Recovering Engineer was published by Wiley in 2021.

Similar themes are also a major focus in an excellent book by Charles Marohn Jr., Confessions of a Recovering Engineer (Wiley, 2021). Marohn was trained as a civil engineer, and for the first part of his career he worked as a traffic engineer designing what he saw at the time as “improvements” to roadways in small cities. Over time he began to question the ideas he had absorbed in his education and the guidelines that he followed in his engineering practice.

Marohn is now founder and president of Strong Towns. He has emerged as one of the most vociferous critics of the planning principles underlying American suburbia, and the design guidelines used to justify the arterial roads in those suburbs. He writes,

“The injuries and deaths, the destruction of wealth and stagnating of neighborhoods, the unfathomable backlog of maintenance costs with which most American cities struggle, are all a byproduct of the values at the heart of traffic engineering.” (Marohn, page 5)

These values are held so widely and deeply, Marohn says, that they are seldom questioned or even acknowledged. These values include:

“• Faster speeds are better than slower speeds. …
• Access to distant locations by automobile is more important than access to local destinations by walking or biking. …
• At intersections, minimizing delay for automobile traffic is more important than minimizing delay for people walking or biking.” (Marohn, page 12)

Working from his own experience as a traffic engineer, Marohn explains the order in which issues are considered when designing a new or “improved” roadway. First the engineer decides on a “design speed” – a driving speed which the road should facilitate. Next to be established is the traffic volume – all the traffic typically traveling the route at present, plus all the additional traffic the engineer anticipates in the future. At that point the engineer will choose a design based on official guidelines for that design speed and that traffic volume; so long as the guidelines are followed, the design will be deemed “safe”. Finally, the engineer will estimate how much it will cost.

Marohn argues that the questions of whether traffic should move slow or fast, and whether all existing traffic should be accommodated or instead should be restricted, are not technical issues – they are questions of values, questions of public policy. Therefore, he says, issues of the desired traffic speed and desired traffic volume should be dealt with through the democratic process, with public input and with the decisions made by elected officials, not by engineering staff.

Image courtesy of Pixabay.

Some sins are forgiven

In the early days of car culture, traffic casualties happened at a far higher rate per passenger mile than they do in recent decades. Part of the improvement is due to changes in vehicle design – padded surfaces, seat belts, air bags. Part of the improvement can be attributed to what is called “forgiving design”, at least as applied on rural highways. Examples of forgiving design are gradually sloped embankments, which reduce the likelihood of rollovers if a driver veers off the road; wider lanes which lessen the chance of sideswiping; centre barriers which prevent head-on collisions; straightening of curves to improve sightlines; and removal of roadside obstacles such as large trees which an errant driver might hit.

On highways these forgiving design principles make sense, Marohn believes, but on urban arterial roads they are disastrous. He coined the word “stroad” for urban routes that combine the traffic complexity of streets with the high design speeds of inter-city roads. Stroads feature the wide lanes, cleared sightlines and levelized topography of highways, giving drivers the impression that higher speeds are safe. But stroads also have many intersections, turning vehicles, and access points for pedestrians. This means that the higher speeds are not safe, even for the drivers. And vulnerable road users – pedestrians and cyclists – often pay with their lives.

Most stroads should be converted into streets, Marohn says. “Instead of providing drivers with an illusion of safety, designers should ensure the drivers on a street feel uncomfortable when traveling at speeds that are unsafe.” (Marohn, page 43) To ensure that the mistakes of pedestrians and cyclists, and not just drivers, are forgiven, he advocates these guidelines: “Instead of widening lanes, we narrow them. Instead of smoothing curves, we tighten them. Instead of providing clear zones, we create edge friction. Instead of a design speed, we establish a target maximum travel speed.” (Marohn, page 41)

On a typical urban street, with stores, offices, schools, restaurants, and many people moving around outside of cars, that target maximum speed should be low: “Traffic needs to flow at a neighborhood speed (15 mph [24 kph] or less is optimum) to make a human habitat that is safe and productive.” (Marohn, page 56)

In recent years there has been a substantial rise in pedestrian and cyclist fatalities, even as motorist fatalities have continued a long downward trend. The rising death toll among vulnerable road users was particularly noticeable during and following the pandemic. In Marohn’s words we find a good explanation:

“Most [traffic fatalities] happen at nonpeak times and in noncongested areas. … the traffic fatality rate is much higher during periods of low congestion. This is … because the transportation system is designed to be really dangerous, and traffic congestion, along with the slow speeds that result, is masking just how dangerous it is.” (Marohn, page 117)

With many businesses closed and many people working from home, there was much less traffic congestion. And without congestion acting as a brake, people drove faster and more pedestrians were killed. That wasn’t intentional, but it was predictable – it was no accident.

* * *

As Jessie Singer explains, we find an extensive matrix of causes that contributes to “accidents” when we look beyond the individual making a mistake. That matrix very often includes racial and economic inequality, which is why poor people suffer more in nearly every accident category than rich people do.

Both racial and economic factors come into play in the current wave of pedestrian deaths. In the major city closest to me, Toronto, pedestrian deaths occur disproportionately among racialized, poor, and elderly people. These deaths also occur most often on wide arterial roads – stroads – in older suburbs.

Marohn’s words again are enlightening: “as auto-oriented suburbs age and decline … they are becoming home to an increasing number of poor families, including many who do not own automobiles.” (Marohn, page 43) When these residents need to walk across four-, five- or six-lane high-speed arterial roads, the predictable result is pedestrian deaths among the most vulnerable. An obvious, though politically difficult, solution is to redesign these roads to bring speeds down to a safe level.

The inequality that contributes to “accidents” is buttressed in most North American cities by an elaborate legal framework telling people where they are allowed to live and work. That legal framework is zoning. In the next installment of this discussion we’ll look at the history and consequences of zoning.


Image at top of page is in public domain under Creative Commons CC0, from pxhere.

Dangerous roads are no accident

Also published on Resilience

If you watch network television you can see auto companies spending a lot of money making our roads more dangerous. One slick ad after another glorifies massive cars and trucks as they careen around curves, bounce over bumps and potholes, and send up clouds of dust on always-open roads. The message is clear: it’s really cool to buy the biggest, most menacing vehicle you can afford, and drive it as aggressively as you can get away with.

It’s not that the car companies want to cause more serious injuries, but a simple logic is at work. The outsized profits from sales of big SUVs and trucks go to the bank accounts of car companies, while the hospital and funeral expenses of crash victims are charged to someone else.

There Are No Accidents, by Jessie Singer, is published by Simon & Schuster, February 2022

The way to reduce the horrific human cost of crashes, Jessie Singer explains, is simple: make the companies who produce dangerous vehicles accountable for their damages.

Singer’s book There Are No Accidents was spurred by the killing of one pedestrian by a motor vehicle, and traffic violence is one major subject she covers. Yet the book covers so many related subjects, and covers them so well, that one review cannot do the book justice.

What we call “accidents,” Singer says, usually result from a non-intentional act – a mistake – in a dangerous context. When we focus only on the person closest to the accident, who is often the person making the mistake, it’s easy to find one person to blame. But in so doing we typically overlook the more powerful people responsible for the dangerous conditions. These powerful people might be manufacturers of dangerous products, regulators who permit dangerous products or practices, or legislators who set up rules that make it difficult for accident victims to win redress. 

With this basic framework Singer looks at the history of workers’ compensation in the United States:

“By the end of the First World War, in most of the United States, when a worker had an accident, employers were legally required to provide compensation for medical care and lost work. For employers, this was a massive shift in their economic calculus. … The decline in work accidents was dramatic. Over the next two decades, deaths per hour worked would fall by two-thirds.” (all quotes in this article are from There Are No Accidents)

She also examines the rise and fall in prescription and street drug overdoses, and the peculiar laws that conveniently overlook accidental discharge of firearms.

In all these disparate cases, a person making a mistake might pay with their life. But many social actors together set up the dangerous conditions. Economic inequality, racial prejudice and social stigmas act as multipliers of these conditions.

“Accidents are the predictable result of unequal power in every form – physical and systemic,” Singer writes. “Across the United States, all the places where a person is most likely to die by accident are poor. America’s safest corners are all wealthy.”

She also examines why “black people die in accidental fires at more than twice the rate of white people.” And why “Indigenous people are nearly three times as likely as white people to be accidentally killed by a driver while crossing the street.”

A sudden epidemic of traffic violence

About a century ago, a new and very dangerous condition began to kill people in rapidly growing numbers. “While the accidental deaths and injuries of workers generally declined from 1920 onward,” Singer writes, “accidental death in general rose – driven by huge numbers of deaths of car drivers, passengers, and pedestrians.”

Majority opinion did not, at the time, blame the children who played in streets, or “distracted walkers” who dared to stroll while engrossed in conversation. Outraged observers would occasionally pull a driver out of a car and beat him following the killing of a pedestrian, but there was also a clear recognition that the problem went beyond the actions of any individual driver. Thus citizens, editorialists, and city councils responded to the epidemic of traffic violence by calling for mandatory speed regulators in all cars to keep streets safe for people.

It took a concerted publicity campaign by the auto industry to shift the blame to “jaywalkers” or the occasional “nut behind the wheel”, and away from dangerous vehicles and dangerous traffic laws. Within a generation streets had become the precinct of drivers, with the ultimate price often paid by individual victims who still had to walk, because they couldn’t afford to drive dangerous vehicles themselves.

Eventually public demand and legislative requirements resulted in automakers introducing a wide variety of safety improvements to their cars. Notably, though, these improvements were focused almost solely on the safety of the people inside the cars.

And in the past twenty-five years there has been a large increase in the number of pedestrians killed by motorists: “Between 2009 and 2019, the U.S. Department of Transportation (DOT) reported a massive 51 percent rise in the number of pedestrians killed in the United States, from a little over 4,000 a year to more than 6,000.”

The increasing carnage was abetted by simple facts of physics which both automakers and regulators had understood for decades:

“As long ago as 1975, the U.S. DOT itself figured out that three factors most determined whether or not a person was injured in a car accident: how much the vehicle weighed, how high it was off the ground, and how much higher its front end was compared to a pedestrian. By 1997, the department demonstrated that large vehicles such as SUVs and pickup trucks were significantly more likely to kill a pedestrian in a crash than smaller cars.” 

The automakers knew this, but they also knew they could make bigger profits by marketing bigger vehicles while escaping accountability for the greater numbers of pedestrians killed.

It didn’t have to be this way. Some countries took a different course. “Since 1997 in Europe and 2003 in Japan, vehicles have also been tested and rated for how safe they are for pedestrians, too, should the driver hit someone,” Singer writes. The National Highway Traffic Safety Administration proposed similar rules in the US but General Motors objected and the matter was dropped.

During the same period that US pedestrian fatalities were climbing steeply, “Pedestrian fatalities fell by more than a third in a decade in Europe and by more than half since 2000 in Japan.”

Love and rage

Eric James Ng was a middle-school math teacher, a fan of punk music, an activist, and he rode his bike everywhere, every day, through New York City.

Jessie Singer writes, “Eric was sixteen when I met him working at a summer camp. … Eric was magnetic, and I fell in love, right away. I still feel proud to say he loved me, too.

“Eric was killed at age twenty-two.”

He was killed while riding his bike on one of the busiest bike routes in the US, when a drunk driver mistook the paved bike lane for a car route and drove down that lane at high speed. The same type of “accidents” had happened before and would happen again, in spite of safety advocates urging that concrete bollards be installed at potential motor vehicle access points. But those life-saving bollards would not be installed until 2017, after a driver intentionally turned down onto the bike lane and intentionally hit people, killing eight people and injuring eleven others. Then, within a few days, new barricades were installed at dozens of intersections between the bicycle lane and motor vehicle driveways – exactly the type of barricades that would have saved Eric James Ng’s life.

Anger is a natural reaction to lives cut short and deaths that came far too soon, caused in significant part by dangerous conditions that were clearly known but tolerated due to lack of political will. Jessie Singer’s book would be a powerful and enlightening read even if it were a pure expression of anger, but it is so much more than that.

Eric James Ng, she writes, signed his emails with the phrase “love and rage.” That signature would make a fitting tag for her book too.

“In making recommendations after an accident,” she writes, “two goals are central: that we are guided by empathy and that we aim to repair harm.”

That empathy shines through every chapter of There Are No Accidents. Singer wants us to “Remember that the people who die most often by accident are often the most vulnerable – the youngest and the oldest, the most discriminated against and least wealthy – and start there. Start by concerning yourself with vulnerability.”

And if we truly want to change the dangerous conditions that make mistakes deadly, we need to look beyond the individual making a mistake or the individual victim. “Blame is a food chain. Always look to the top. Who has the most power? Who can have the greatest effect? The answer is very rarely the person closest to the accident ….”


In motor vehicle crashes, speed kills and higher speeds kill more. In the next installment we’ll consider how speed limits are set on roads and streets.

The toxic cloud called ‘Internet’

Also posted on Resilience.

The global electronics network is a sort of “bad news, good news” story in Jonathan Crary’s telling.

The bad news is that “the internet complex is the implacable engine of addiction, loneliness, false hopes, cruelty, psychosis, indebtedness, squandered life, the corrosion of memory, and social disintegration”; and that “the speed and ubiquity of digital networks maximize the incontestable priority of getting, having, coveting, resenting, envying; all of which furthers the deterioration of the world – a world operating without pause, without the possibility of renewal or recovery, choking on its heat and waste.”

The good news? The internet complex will soon collapse. 

Scorched Earth, by Jonathan Crary, published by Verso, April 2022.

Crary opens his forthcoming book Scorched Earth: Beyond the Digital Age to a Post-Capitalist World with these words: “If there is to be a livable and shared future on our planet, it will be a future offline, uncoupled from the world-destroying systems and operations of 24/7 capitalism.”

If you’re looking for a careful, thorough, let’s-consider-both-sides sort of discussion, this is not the book you want. “My goal here is not to present a nuanced theoretical analysis,” Crary writes.

Rather, he wants to jar people out of the widespread faith that because we’ve grown accustomed to the internet, and because we’ve allowed it to infiltrate nearly every hour of our lives, and because it may be hard to imagine a future without the internet, therefore the internet should and will endure.

Do some good things happen on and through the Internet? Of course – but Crary is not impressed by arguments that the internet is a liberating, empowering technology for progressive movements:

“Part of the optimistic reception of the internet was the expectation that it would be an indispensable organizing tool for non-mainstream political movements … [I]t should be remembered that broad-based radical movements and far larger mass mobilizations were achieved in the 1960s and early ’70s without any fetishization of the material means used for organizing.” (Scorched Earth, p. 11)

Likewise he comments that the anti-globalization rallies of the late 1990s happened before the pandemic of smart phones, and the huge protests against the US attack on Iraq in 2003 pre-dated the onset of so-called social media. Since then, he laments, the “stupefying” effects of Internet 2.0 have dissipated people’s energies into clicktivism, leaving less time and energy for the building of personal, in-the-flesh networks that might truly challenge the direction of capitalism.

References to material pollution are scattered throughout the brief book, but Crary focuses more of his attention on the pollution of minds, emotions and perceptions. Some parts of his critique are now shared by many, both within and outside the big tech complex. He calls attention, for example, to a pervasive erosion of self-esteem: “Each of us is demeaned by the veneration of statistics – followers, clicks, likes, hits, views, shares, dollars – that, fabricated or not, are an ongoing rebuke to one’s self-belief.” (Scorched Earth, p. 24)

Less widely understood is the immense effort put into data collection, including eye tracking, facilitated by the acquiescence of hundreds of millions of people who make their self-surveillance devices available to trackers at all times:

“We often assume that internet ‘surfing’ means the possibility of following random, uncharted visual itineraries …. From the standpoint of the bored individual, hours spent in this way may seem to be a desultory waste of time, but it is time occupied in a contemporary mode of informal work that produces value as marketable information for corporate and institutional interests.” (Scorched Earth, p. 100)

The value exploited by corporate interests includes finely tuned means to convince people to buy things they don’t need, which neither they nor our ecosystems can afford.

Another section was particularly thought-provoking and sobering to me, as a nature photographer who publishes online. Crary explains that internet researchers collect reams of data on “what colors and combinations of colors and graphics are most or least eye-catching.” That information is in turn funneled back into UXD – User Experience Design – to make screen time as addictive as possible and unmediated experience of nature a fading memory:

“The ubiquity of electroluminescence has crippled our ability or even motivation to see, in any close or sustained way, the colors of physical reality. Habituation to the glare of digital displays has made our perception of color indifferent and insensitive to the delicate evanescence of living environments.” (Scorched Earth, p. 106)

Internet 2.0, in sum, turns us into willing accomplices of corporate consumerism, while undermining our self-esteem, sapping our abilities to appreciate the non-virtual world around us, and sucking up time we might otherwise devote to real community. Facebook, Twitter and their ilk have pulled off one of history’s spectacular cons – getting us to refer to their sociocidal enterprise as “social media” and getting us to believe it is “free”. 

Stockpile of mobile phones for recycling/disposal, September 2017.  Photo from Wikimedia Commons.

‘The Cloud is an ecological force’

In just 124 pages Crary bites off a lot – more, in fact, than he really tries to chew. From the outset, he portrays the internet complex as a final disastrous stage in global capitalism. He notes that “the internet’s financialization is intrinsically reliant on a house-of-cards world economy already tottering and threatened further by the plural impacts of planetary warming and infrastructure collapse.” (Scorched Earth, p. 7)

But what is the physical infrastructure of the internet complex? Crary doesn’t delve into that issue. A recently published article by Steven Gonzalez Monserrate, however, makes an illuminating companion piece to Crary’s book.

Entitled “The Cloud Is Material: Environmental Impacts of Computation and Data Storage”, Monserrate’s research is available here. MIT Press has also published a shorter article adapted from the full paper. Quotes cited here are taken from the full paper.

Monserrate’s central point is that, like a cloud of water molecules, “the Cloud of the digital is also relentlessly material”, and further that “the Cloud is not only material, but is also an ecological force”.

Crary refers to the capitalist industrial system, of which the internet complex is now one major component, as “choking on its heat and waste”. Monserrate helps us to quantify that heat and waste.

Discussing what data center technicians refer to as a “thermal runaway event”, Monserrate writes “The molecular frictions of digital industry … proliferate as unruly heat. … Heat is the waste production of computation, and if left unchecked, it becomes a foil to the workings of digital civilization.”

In most of the data centers that keep the Cloud afloat, he adds, “cooling accounts for greater than 40 percent of electricity usage.”

Can’t the network servers and their air conditioners be switched over to renewable energy in generally cool environments? It’s not so easy, Monserrate tells us. Because of network signal latency issues, large portions of the Cloud are located as close to financial and government centers as possible. The state of Virginia’s “data center alley,” he says, was “the site of 70 percent of the world’s internet traffic in 2019”. That degree of concentrated electricity consumption is difficult if not impossible to service without huge coal, gas or nuclear generators.

The energy demands go far beyond air conditioning:

“The data center is a Russian doll of redundancies: redundant power systems like diesel generators, redundant servers ready to take over computational processes should others become unexpectedly unavailable, and so forth. In some cases, only 6–12 percent of energy consumed is devoted to active computational processes. The remainder is allocated to cooling and maintaining chains upon chains of redundant fail-safes to prevent costly downtime.” (Monserrate, “The Cloud is Material”)
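To make those proportions concrete, here is a back-of-the-envelope sketch in Python. The 100 GWh annual total is an invented figure chosen purely for illustration; the shares come from the figures quoted above (cooling above 40 percent, active computation as low as 6 percent).

```python
# Illustrative arithmetic only: splitting a hypothetical data center's
# annual energy draw using the shares quoted by Monserrate.
total_energy_gwh = 100.0          # invented total for the example

cooling_share = 0.40              # "greater than 40 percent" goes to cooling
compute_share = 0.06              # low end of the "6-12 percent" range
overhead_share = 1.0 - cooling_share - compute_share  # redundancy, power losses, etc.

print(f"Cooling:  {total_energy_gwh * cooling_share:.0f} GWh")
print(f"Compute:  {total_energy_gwh * compute_share:.0f} GWh")
print(f"Overhead: {total_energy_gwh * overhead_share:.0f} GWh")
```

On these assumed numbers, more than half of the facility’s energy goes to neither cooling nor computation – it is consumed by the “chains upon chains of redundant fail-safes.”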

Keeping your cat videos available on demand around the world, keeping Amazon’s gazillion products available for your order at 3 a.m., keeping all of Netflix’s and Hulu’s videos ready for bingeing, and keeping this entire data stream transparent to both commercial and military surveillance – well, that results in a lot of coal and gas going up as carbon dioxide emissions.

One result: “the Cloud now has a greater carbon footprint than the airline industry.”

Like the cell phones that Apple, Google and Samsung encourage you to replace every two or three years, every physical component of the internet complex has to be mined, refined, chemically transformed, assembled, packaged and shipped, before it soon becomes outmoded. Monserrate cites a Greenpeace study estimating that “less than 16 percent of the tons of e-waste generated annually is recycled.” And that recycling is often done by the lowest-paid workforces in the world, in enterprises that don’t respect the health of the workforce or the environment.

“The refuse of the digital is ecologically transformative,” Monserrate concludes.

Life without Internet

So is the Internet destined to be but one brief blip in human history? The answer seems clear to Crary – the internet will collapse along with the industrial complex that supports it:

“The internet complex, now compounded by the Internet of Things, struggles to conceal its fatal dependence on the rapidly deteriorating built world of industrial capitalism. Contrary to all the grand proposals, there never will be significant restoration or replacement of all the now broken infrastructure elements put in place during the twentieth century.” (Scorched Earth, p. 63)

Personally I am cautious about making such firm predictions, though I don’t see how the internet will persist long in its current form. Total disappearance is just one potential outcome, however. The current internet industrial complex, as Monserrate describes, includes a vast amount of redundancy, and perhaps that will make it possible to transition to a still-useful internet with only a fraction of the energy and material throughput.

In a transformed economic system, without the built-in impulsion to sell hardware and software “upgrades” to consumers on an annual basis, and without the created “need” to have every video snippet available anywhere anytime, and without the motive to maintain a vast surveillance and behavior modification apparatus – perhaps a future civilization could retain many of the significant benefits of the internet without paying a soul- and ecosystem-crushing price. (On this subject, see for example the research by Kris De Decker in “How to Build a Low-Tech Internet”.)

But if we don’t redirect our global economic system, and fast, the whole toxic cloud may crash whether we like it or not. And perhaps, on balance, that will be a very good thing.

“If we’re fortunate,” Crary dares to hope, “a short-lived digital age will have been overtaken by a hybrid material culture based on both old and new ways of living and subsisting cooperatively.”


Photo at top of page: A young man burning electrical wires to recover copper at Agbogbloshie, Ghana, as another metal scrap worker arrives with more wires to be burned. September 2019. Photo by Muntaka Chasant, licensed via Creative Commons, accessed through Wikimedia Commons.