DC’s gas utility has promised to transition its business model away from selling gas, a necessary step if the District is to achieve its commitment of carbon neutrality by 2050. As we outlined in our earlier post, ending DC’s reliance on methane gas would reduce emissions of the greenhouse gases that cause climate change. It would also improve indoor air quality and public health.
If DC successfully transitioned away from gas for heating, hot water, and cooking, what would that look like in practice?
Getting rid of gas stoves and the associated health problems means cooking with electric or induction stoves, neither of which burns fossil fuels. Induction cooktops work by creating a magnetic field that generates heat directly in the pot or pan. On a gas stove, only about half the energy is transferred to the pot; the rest warms the air in the kitchen. An induction stove delivers about 90% of its energy directly to the pot.
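The efficiency gap is easy to put in numbers. Here is a rough sketch in Python, using the approximate transfer figures above (about 50% for gas, about 90% for induction); both figures are illustrative assumptions, and real appliances vary:

```python
# Rough comparison of the energy needed to boil one liter of water
# on a gas burner versus an induction cooktop.
# Assumed transfer efficiencies (illustrative, from the text above):
#   gas: ~50% of burner output reaches the pot; induction: ~90%.

SPECIFIC_HEAT_WATER = 4186.0  # joules per kg per degree C

def input_energy_joules(mass_kg, delta_t_c, efficiency):
    """Energy the appliance must supply to deliver the needed heat to the pot."""
    useful = mass_kg * SPECIFIC_HEAT_WATER * delta_t_c
    return useful / efficiency

# Heat 1 kg (about 1 liter) of water from 20 C to 100 C.
gas = input_energy_joules(1.0, 80.0, 0.50)
induction = input_energy_joules(1.0, 80.0, 0.90)

print(f"gas burner: {gas / 1000:.0f} kJ")        # roughly 670 kJ
print(f"induction:  {induction / 1000:.0f} kJ")  # roughly 372 kJ
```

Under these assumptions, the gas burner must supply nearly twice as much energy for the same pot of boiling water, with the difference heating the kitchen instead.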
Cavemen needed fire, we don’t
The share of gas used for cooking is fairly low. The primary use for gas in buildings is heating, either with furnaces or boilers. Electric air-source heat pumps provide heat more efficiently and with less carbon pollution than gas. In recent decades, the efficiency and reliability of heat pumps have dramatically improved, allowing them to provide heat inside even amid frigid temperatures outside.
Heat pumps require only one unit of energy in the form of electricity to generate about three units of energy in the form of heat. The extra, non-electric energy comes from removing heat from the outside air, which is a source of essentially free energy.
Here is how it works. Let’s say the outside air is 25 degrees Fahrenheit. To absorb heat from such cold air and transfer it indoors, the heat pump uses a refrigerant fluid. The refrigerant is even colder than the outside air, say 10 degrees, so it absorbs heat from the relatively warmer outdoor air.
The refrigerant is then compressed, which raises the temperature to between 120 and 140 degrees. The now-hot refrigerant is sent indoors through copper pipes, and the heat is transferred to indoor air while the refrigerant is sent back outside. Outdoors again, the pressure of the refrigerant is reduced and its temperature falls below that of the outdoor air, making it again ready to absorb heat from the outside air.
The cycle repeats over and over, providing exceedingly energy-efficient heating. The process requires electricity, but the electricity itself does not produce heat. It only transfers heat from outdoors to indoors. Heat pumps are about three times more efficient than electric baseboard radiators or gas furnaces.
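The roughly three-to-one advantage described above can be sketched as a back-of-the-envelope calculation. The COP (coefficient of performance) of 3 is the approximate figure from the text, and the 10 kWh daily input is a hypothetical placeholder; real COPs vary with outdoor temperature:

```python
# Back-of-the-envelope heating comparison.
# Assumption (from the text): a heat pump delivers about 3 units of
# heat per unit of electricity (COP ~ 3). Electric resistance heat
# (baseboard radiators) has a COP of exactly 1: every unit of
# electricity becomes exactly one unit of heat.

def heat_delivered_kwh(electricity_kwh, cop):
    """Heat output for a given electrical input and coefficient of performance."""
    return electricity_kwh * cop

daily_input_kwh = 10.0  # hypothetical daily electricity use for heating

heat_pump = heat_delivered_kwh(daily_input_kwh, 3.0)
baseboard = heat_delivered_kwh(daily_input_kwh, 1.0)

print(f"heat pump: {heat_pump:.0f} kWh of heat delivered")  # 30 kWh
print(f"baseboard: {baseboard:.0f} kWh of heat delivered")  # 10 kWh
```

The extra 20 kWh in the heat-pump case is not created by the electricity; it is moved indoors from the outside air, which is why the cycle beats resistance heating.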
The same heat pump that warms a home in the winter can cool it in the summer, using the same process, but in reverse. Having the same system for heating and cooling can save money and allow buildings to move beyond gas and operate entirely on electricity. Indoor heat pump water heaters use the same technology, eliminating the need for gas-fired water heaters.
Homes not fueled by gas avoid the cost of gas lines, gas servicing, and gas metering. A study by the Rocky Mountain Institute found that using electricity for heating, hot water, and air conditioning reduces homeowner costs in new buildings. Electric retrofits of existing homes can save money for homeowners who would otherwise need to replace both a furnace and an air conditioner. Electric retrofits can also save money for those combining rooftop solar and electrification, according to the study.
Electrification necessitates efficiency
Decreased gas use increases reliance on electricity, and with DC’s electric mix moving toward 100% renewable sources because of the clean energy law going into effect this year, replacing gas with electricity means a substantial drop in greenhouse gas emissions. But electrification alone isn’t enough for DC to meet its climate goals.
Additional electricity demand resulting from switching off gas will require energy efficiency measures such as air-sealing and insulation of homes and other buildings. Increased efficiency will save money for utility ratepayers and keep costly upgrades of the electric distribution system to a minimum. While some costs will be borne by owners, governments should provide subsidies as well.
The District’s Clean Energy DC plan calls for a package of incentives targeting energy use reductions in existing buildings, with the program set up by 2020. If successful, it will pave the way for widespread electrification.
Efficiency efforts are already underway for large buildings. The DC Department of Energy and Environment is starting to set up the Building Energy Performance Standards program, which was created by the clean energy law and requires increased efficiency in buildings over 50,000 square feet beginning in 2021. The efficiency requirements will apply to buildings of 25,000 square feet in 2023 and 10,000 square feet in 2026.
Shutting off the pipeline, from California to Britain
In August, the California Public Utilities Commission issued a unanimous decision directing the state’s $1 billion energy efficiency program to start funding gas-to-electricity fuel switching programs. The commission’s order noted that gas is “a barrier to California’s progress on climate and energy goals.”
In March, the Conservative Party-led government in Great Britain announced a prohibition on gas in new residential buildings, moving the country toward heat pumps, increased efficiency and other alternatives to gas.
The Building Decarbonization Coalition concluded in white papers released earlier this year that fuel-switching from gas to electricity “will save consumers billions of dollars compared to other carbon reduction strategies” in part because “electric appliances have lower lifetime costs than fossil fuel appliances, especially considering the avoided costs of gas infrastructure.”
Fossil fuel interests claim ending the addiction to dirty fuels like methane gas is too expensive. In truth, acting now is far cheaper than waiting to address the problem and its devastating consequences in a much warmer future.
How did I not already know the health and environmental impacts of residential gas burning?
One of the things not mentioned in the article is that, IIRC, gas tends to be much cheaper than electricity for the same amount of heat. So even if electricity is more efficient, gas is often still cheaper. At least cheaper when you don’t factor in externalities.
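Whether gas is actually cheaper per unit of delivered heat depends on local rates and appliance efficiency, and it is straightforward to check. Here is a sketch; the $1.20/therm gas price, $0.13/kWh electricity price, 90% furnace efficiency, and COP of 3 are all placeholder assumptions, not actual utility rates:

```python
# Illustrative cost per kWh of *delivered* heat, gas furnace vs. heat
# pump. All prices below are hypothetical placeholders -- substitute
# your own utility rates to run the comparison for your area.

THERM_TO_KWH = 29.3  # 1 therm of gas contains about 29.3 kWh of energy

def gas_cost_per_kwh_heat(price_per_therm, furnace_efficiency):
    """Cost of one kWh of heat delivered by a gas furnace."""
    return price_per_therm / (THERM_TO_KWH * furnace_efficiency)

def electric_cost_per_kwh_heat(price_per_kwh, cop):
    """Cost of one kWh of heat delivered by a heat pump."""
    return price_per_kwh / cop

# Placeholder rates: $1.20/therm gas, 90%-efficient furnace,
# versus $0.13/kWh electricity driving a heat pump with COP 3.
gas = gas_cost_per_kwh_heat(1.20, 0.90)
hp = electric_cost_per_kwh_heat(0.13, 3.0)

print(f"gas furnace: ${gas:.3f} per kWh of heat")
print(f"heat pump:   ${hp:.3f} per kWh of heat")
```

With these placeholder numbers the two come out close, which is the commenter's point: gas is often cheap enough per unit of energy that the heat pump's efficiency advantage is needed just to break even, before counting externalities.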
This explains some of the difficulty finding places in the SF Bay that have gas ranges. That's the one thing that I personally would have a hard time giving up; electric is fine for ovens and pressure cookers, but I can't stand the experience of cooking anything that requires changing heating levels on electric or induction ranges.
> will require energy efficiency measures such as air-sealing and insulation of homes and other buildings.
That’s its own can of worms. Our breathing and general living in a home put a colossal amount of humidity into the air, which the wood and gypsum board absorb to saturation. And with houses no longer built to breathe, but rather hermetically sealed, we get mold problems that didn’t exist before.
It’s been a huge issue here, where a bunch of buildings have "failed," i.e., are full of moisture, creating mold, rot, etc. Now they have to have ventilation fans running the whole time. It seems like a real balancing act. I have not cooked on an induction stove, but I recently switched to gas, and cooking on gas is so much nicer.
The new print issue of the magazine has a short thought-experiment article by me, on what happened after the fall of the Roman Empire. (As I point out, this concerned the Western Empire only—the one based in Italy, and the one Edward Gibbon described in The Decline and Fall. The Eastern Empire, based in Constantinople, had many more centuries to run.)
In a first round of reader responses, historians and others reacted (mainly) to the article’s (intentionally overstated) headline, “The End of the Roman Empire Wasn’t That Bad.” And in a second round, a veteran of governance issues named Eric Schnurer argued that a renewed focus on local-level renewal and innovation was proper, since localities were the only places where innovation ever occurred.
Here is another round, on the point I mainly hoped the article would raise: how Americans, ever optimistic about the rebound capacity of their perpetually self-reinventing system, should think about the possibility that “it’s different this time,” and that national-level governance might finally be strained beyond its rebound abilities. Over to the readers:
1) Civil servants still want to serve. In my article I quoted Philip Zelikow, of the University of Virginia, on the difference between national-level and local officials. At the state, local, and regional level, Zelikow said, elected and career officials had no choice but to work together and actually solve problems. Whereas at the national level, politics was more and more about culture war—“who you like, who you hate, which side you’re on,” as Zelikow put it.
A career official in a national-level agency replies:
In November I will mark 32 years of federal service.
My grandparents came here with nothing. In an age of rising tides, my parents had the grit and good fortune to grant me and my brothers and sisters every reasonable opportunity, and then some.
That’s fundamentally why I entered public service, and that’s fundamentally why I remain in public service. I am grateful and feel a responsibility to give back.
Your essay, comparing our federal state to Rome in its age of decline, strikes a chord, and in doing so fills me with an undeniable melancholy.
I push back against Zelikow’s “which side are you on” fatalism about national governance, even as I admit I see evidence of it all around me.
I’m not tossing in the towel, yet.
2) ‘Optimates’ vs. ‘Populares’: The Battle Goes On. From a history professor, of my own Boomer generation:
I have been thinking about that [Roman] period quite a bit lately, as we see the collapse of societal norms and the failure of many central governments to actually govern.
I see the present as actually more in parallel to the fall of the Republic in the first century BCE.
At that time the empire had begun to take form, with vast amounts of wealth pouring into the center, but mainly enriching the Senatorial oligarchs. The men who had fought the wars were forced off their land, which came to be farmed on vast plantations by slaves. The new global order failed the yeomen, mainly because the rich, who controlled the government, refused to relinquish any of their wealth to help the impoverished citizens.
The society broke into two warring parties, Optimates and Populares (the Best and the People). They engaged in wars with each other, mobilizing personal armies, and violence came to be used as a means of government with leaders of each side being killed by mobs, culminating in the death of Julius Caesar. The society had become so divided that in the end the only way to govern was by autocratic rule: Augustus.
I fear that we are near that point, and that a demagogue will arise who has more shrewdness than our current demagogue-wannabe. Trump has blazed the pathway that others can well follow.
Trump’s party represents the Optimates—the wealthy, but we could just as well see a leader representing the Populares come to power. Think if Huey Long had been successful in the 1930s. Populism can cut both ways, call them National Populism and Social Populism ….
We are seeing the breakdown of Liberal Democracy across the world, as happened in the 1930s. It was finally restored after a decade of slaughter. It may not be restored again. At the least, something new has to take form, and that will not come from our generation.
One interesting parallel to the period that you do discuss in your piece is that the “barbarians” were not invading the Empire to loot and pillage. Mainly they wanted to share in the wealthy and stable Roman society, get a bit of land for their people and be secure from tribes like the Huns on the other side of the border. They knew Rome very well, many of their leaders had been leaders in the Roman armies and many were Roman citizens. The Vandals were not really that vandalous …
In the same way, people are now migrating en masse into Europe and the US in pursuit of better lives, to participate in the wealthy and stable Western societies, to escape poverty and brutality.
Climate change plays a significant role in driving people out of their homelands, and that will only become worse over time. Another factor of course is Western as well as internecine wars (think Iraq and Syria) and Western support of brutal governments (Central America).
But the influx of a mass of outsiders into the Roman Empire (especially the western part) did ultimately lead to the breakdown of the wealth and stability they had come for.
There were many reasons for this, including inter-tribal battling among the newcomers and the disappearance of the Roman legions as a controlling force, but there was a continuing social disintegration and insecurity. The stable Roman civitas crumbled, quickly in some places (Britain) and more slowly in others (Gaul). I am not bringing this up to agree with Trump’s mantra to ‘build the wall’ (which is folly—the Romans tried in some places), but rather to stress that we must have a rational immigration policy and consensus that prevents destabilization. Mass immigration creates nationalist anger which is fuel for nationalist demagogues.
As the Roman society disintegrated, government did become ever more localized. That worked for a while in some places (like France), but in time trade shrank, education declined, government services passed away, and instability increased.
One could imagine some parts of the US doing quite well for a time without a federal government, but other parts might do very poorly. Infrastructure would fall apart, as it did in post-Roman Europe. More people would flow across unpoliced borders, adding to the disruption and to the reactions. This would not play well in a society as well armed as the US.
No one knew that “Rome had Fallen” when Odoacer brushed aside the grandly named Romulus Augustulus in 476, only that the Germans now ruled Italy in name as they had in fact for the past decades. Even in our own long lives, can we know what history might see as having passed in our lifetimes, perhaps that we are now at the transition from the 500-year Modern Age into what-we-do-not-know (as John Lukacs has written)? Life went on, as for the frog in boiling water whom you have analyzed …
Several hundred years after the fall of Rome new forms and new states began to take shape amid the ruins, and by the 12th century Western Europe was again thriving. But it was a long and difficult time between the fall of the Empire and the rise of Europe. I would not wish that on my children and grandchildren, or on theirs.
The long term results of the failure of governance we are living through will be regrettable, though perhaps as necessary as the Dark Ages.
3) The new corporate “nationality.” A Westerner who has lived for years in Japan writes about the local-vs.-national tensions within the United States:
One idea is to reorganize the 50 states into 7 regions that match the baby bells created when AT&T was broken up …. The merit of such a reorganization is to unify many basic services: do we really need 50 DMVs and 50 Medicaid programs and who knows how many other layers of bureaucracy repeated state by state? This could enhance basic services at the subnational level …. on the other hand it may create the equivalent of 7 proconsuls competing among themselves to follow Rome’s decline into empire. …
What seems more likely to me to occur over the next 50 years, and something that I oppose, is a rift, with sovereign-individual stance married to the corporatization of society ….
Instead of citizenship being based on contiguous borders our lives are bounded by what membership card(s) we carry. I can go to an Amazon condominium after buying dinner at Whole Foods paid by my Amazon coins via my Kindle and travel in my Amazon car ad infinitum. And if I am a Sapphire member, better deals as I jump from location to location but stay in the Amazon or Apple or Google or Facebook or whatever bubble. When a person uses an “out-of-service” provider, of course rates go up, and pity the people who cannot afford/are rejected in their membership bids. Blade Runner marries Brave New World.
Finally, on the question of whether this time is different from other times due to change! change! change!: yes and no. I believe that in earlier periods, starting around 1870, the degree of change was much greater than now. No electricity vs. wifi and rechargeable batteries; no telephones/movies/radios vs. watching reality TV on your cell phone, etc., etc.
But the pace of change does seem to be much faster and disconcerting for all generations. This deserves further explanation but who has the time to read, let alone write …
4) Let’s talk about ideology, and class. Another academic writes (in a message I am substantially boiling down):
1. I have spent the past seven years studying the Eastern Roman Empire, which is usually called “Byzantium,” and which Gibbon himself dismissed as basically the thousand-year decline of the Roman Empire.
His is a monstrous oversimplification and it has degraded our understanding of ancient/medieval history ever since Gibbon's own day (1776), just as Adam Smith’s dismissal of the timelessness of mercantilism has degraded our English-speaking understanding of ancient/medieval economics ever since the same time (1776). [JF note: On the Adam Smith point, check out this article by me, from 25+ years ago.]
Given what is already well-known about how the US so-called “founding fathers” (itself an egregious simplification of the revolutionary generation) understood the transition of Republican Rome into the Empire, before we sink our teeth into late antique history, it might be worth remembering that our understanding of the past, especially the more distant past, is ALWAYS (and has always been) subject to the political machinations of the present, and even historians’ own careers aren’t guided so much by how well they interpret the past, but by how well their interpretations suit the sensibilities of the times in which they happen to be writing …
[JF: Leaving out point #2, a long discourse on the difficulty of understanding the real life of peasants in different eras of history.] …
3. Generations are important for understanding deep history. For the past 70 years, young generations of Americans have been told that they ought to be living better than their parents. That was fine for the Boomers and for Gen Xers, but this is clearly not the case for Millennials.
So we were lied to. Big surprise: so were the generations who fought for and against Prohibition, Slavery and Unionization (and for Odoacer as well, arguably). Why else would (according to the 1860 US census) a majority of non-slave-owning Southern Whites sign up to fight for the cause of Confederate slavery at the outbreak of the American Civil War? …
4. Let’s not forget the power of ideology in the present. In the 5th century present, Christianity (and Judaism and the various forms of Paganism) were as much part and parcel of social cornerstones as the ideology of the “American Dream,” “Intersectionality,” and “MAGA” are today ….
The point is that we should never underestimate the power of ideology to bind people to a common cause; whether in the 5th century, the 11th century, or the 21st century. Ultimately, we as historians dismiss the significance of religion (and collective conviction) at our own peril.
5. Finally, class. With the rapid adoption of Christian laws and social structures throughout the Roman Empire during and after the 4th century, the rigid laws fossilized a system of land-owners (fief-holders) and land-workers (peasants).
The road to serfdom is something that ever since Hayek, has been capitalized by the likes of Ayn Rand and her disciples, but it truly begins with the rules that one class lives by and another class lives above.
This may sound quite Marxist, but that’s because it is. Without centralized regulations, we automatically return to a system of land-owners and toilers, whether we call them ancient/medieval sharecroppers or modern bartenders. When ideology is co-opted by the elites to ensure that their children inherit their elite status (whether we call it aristocracy or meritocracy), we return to the so-called “dark ages.”
This is not simply “Marxism,” it is historical materialism. And it is the only actually reliable guide to studying the past that we have ever truly innovated since the time of Marcus Aurelius.
5) “I believe in America.” And, finally, quite a different view of the ever-present, ever-reinterpreted past:
As the famous first line in the movie The Godfather reads, “I believe in America.”
While many of us continue to do so, an alarming number of Americans have fallen victim to the in-vogue critique that “woe is me” and things are awful.
For some, this is a reality. I read stories about the homeless problem in major US cities, how drug addiction and tolerance of theft is literally robbing thriving communities of their once proud fortitude of citizenship. I read daily how big tech companies are continuing to mislead the American public about how they monitor and police speech and content their employees regard as offensive, and God knows what with our personal information ….
But what I mostly don’t see now is pride: pride in how fortunate we are to live in this country. It’s called gratitude ….
Talk to someone middle-aged who grew up in Soviet Eastern Europe, and you’ll find out quickly why they left for America. We now live in a world where we can get anything we want at any time of the day. Nearly all buildings and houses have central air-conditioning. Transportation is readily available for everyone. The economy is currently booming with employment we haven’t seen in three generations. Murder rates are at all-time lows. There hasn’t been a serious threat to the homeland in nineteen years. There’s a new superhero movie out every three months in theaters. Netflix programming has people indulging on their couches more than ever.
Most people who are angry and disheartened have never known a world like the Dark Ages, the Black Plague, Serfdom, Smallpox, the Great Depression or WWII or even the height of the Cold War. And we have room to complain that America sucks?
One of the reasons the Roman Empire fell was not just physical over-extension by the state (which was real), but its people taking for granted what the Empire had done for their world …
Is it any coincidence many of the Founding era sought to emulate Roman law and antiquity as they established the republican virtues and culture of the 1780s-1820s? And what’s more, many of the Founders warned, much like the scholars of latter day Rome, what would likely be the downfall of the continent and our country: indifference and ingratitude from within for what America meant as an idea ….
The truth, in my opinion, is that 9/11 sapped us of our confidence. And the ensuing years of lies, mismanaged wars and bank bailouts, an incoherent foreign policy over multiple administrations, and now the rise of brash and offensive populism in both ideological camps have Americans feeling more anxious than ever ….
Perhaps we should be devoting much more to teaching civics again, and appreciating the separation of powers, appreciating why men like James Madison, George Mason, John Adams, Gouverneur Morris, Benjamin Franklin, Thomas Jefferson, and George Washington matter so much that it is in our individual interest to be informed of who they were and what they did to establish the freedoms we often take for granted.
More so, it’s about time we recognize African American contributions during the Founding era too. In spite of their plight, we should be recognizing Peter Salem, Phillis Wheatley, James Armistead Lafayette, and James Forten. We should be embracing the fact that the Continental Army of 1781 was color-blind; that it was about one-fifth African American at the Siege of Yorktown is extraordinary. Or that women and some African Americans were voting in New Jersey prior to 1807….
When we stop paying attention to all of the noise, and when we regain our focus, the fog will begin to clear, and King’s pronouncement of seeking to reach “the promised land” will once again ring loudly for those of us who are yearning for a more perfect union: one of freedom and liberty for all.
Thanks to all for responding to the thought-experiment with thoughts, evidence, and opinions.
Since the climate change townhall is happening, here’s a piece I wrote for Wired about it last month, based on some ideas of Jeff Colgan, Jessica Green and Thomas Hale.
Last week, CNN announced plans to host a climate crisis town hall with the Democratic presidential candidates on September 4. MSNBC scheduled a multiday climate change forum with the presidential hopefuls later that month.
In both venues, some version of the perpetual question will undoubtedly be raised: “How will you pay for the costs of dealing with climate change?”
Despite its pervasiveness, this is a profoundly wrongheaded line of inquiry. Asking how to pay for the impact of climate change implies that these costs are a matter of choice. The reality is that global warming will impose massive costs regardless of whether policymakers respond. Thus, the real question is not “How would you propose to pay?” but instead “Who is going to pay?” and “How much?”
People are already paying for climate change with their lives. Rising temperatures are killing more than 150,000 people every year. This death toll is estimated to increase to 1.5 million people annually by the turn of the century. Some are confronting the likelihood of failed crops; others have been forced to flee floodplains.
Those currently paying for the effects of climate change are the most vulnerable—people in the developing world, the poor, the sick, the elderly, and the very young. As the world changes, more people are going to suffer the cost of heat waves, rising water, damaged or dying ecosystems, and flooded coastal cities. This will create what political science and public policy experts describe as “existential politics,” in which different groups fight to preserve their entire way of life.
On one side of this existential fight will be those who want things to continue mostly as they are. Oil companies have trillions of dollars worth of petroleum still in the ground. An entire energy infrastructure has been built on the back of fossil fuel extraction. If fossil fuels become “stranded assets”— economic assets that suddenly lose most or all of their value—crucial sectors of today’s economy will be utterly transformed, hurting the interests of the businesses that run them. Unsurprisingly, these businesses are fighting back. So, too, are industrial workers such as coal miners whose way of life is threatened.
Meanwhile, others will suffer the effects of continued inaction. People who live on coasts will face the risks and costs of flooding, while many of those who live inland will have to deal with changing weather patterns, droughts, and unbearable heat waves.
This fight has already started to play out. Fossil fuel interests are rich, politically influential, and well organized. They are able not only to pay for lobbyists in Washington, DC, but to organize an entire political movement at the state level. The Koch-funded “grassroots” organization Americans for Prosperity pushes to protect fossil fuel interests in individual states. The group has become intimately intertwined with the Republican party.
The interests on the other side are broader, less well organized, and less influential. This is in part because everyday Americans don’t really understand that they will be on the hook for many of the costs of climate change unless there is a dramatic change in policy.
If we continue on our current trajectory, the lives of ordinary voters will be fundamentally transformed while fossil fuel companies continue to make vast profits. Any serious policy response to global warming needs to transfer some of the costs from voters to the fossil fuel interests, where they belong.
Some might disagree with this approach, advocating instead for a consensus among all parties. The problem with this rejoinder: The politics of global warming are necessarily divisive, and one side of the divide is already mobilizing to protect its own narrow interests.
To fight global warming, we need to organize a broad public counterweight against the sectoral interests that are trying to block action. Building an effective “Green New Deal” will require financial resources to unite a coalition in favor of climate action, and to split the counter-coalition. Such policy will also need to remake the international political economy to build both cross-national solidarities and domestic alliances.
Yet before all of this can be done, it is crucial to change the terms of debate and acknowledge reality. We are going to have to pay for global warming, one way or another. The key question is who will pay—and how we can distribute those costs fairly.
But many Native advocates, myself included, were not satisfied. Warren still has work to do, and demanding she do what’s left is beyond reasonable. In all of her apologizing, Warren has never let go of her family story. After spending her entire adult life repeating a lie, I simply want Warren to tell the truth.
In 1836, Warren’s great-great-great-grandfather, a white man named William Marsh, enlisted in a Tennessee militia to fight in the “Cherokee War,” an occupation of Cherokee land in the lead-up to the Trail of Tears. Decades later, his grandson John Houston Crawford moved his family onto Indian Territory and squatted on Cherokee land in a move that, with no record of a permit, was almost certainly illegal.
The Crawfords were just some of the tens of thousands of white squatters who outnumbered Cherokees on our own land. While Cherokee Nation beseeched Congress to enforce our treaty rights and kick them out, the squatters pushed Congress to divide up our treaty territory and create a path to white land ownership; the squatters won.
Pauline’s youngest child, Elizabeth, grew up with her mother’s version of the story. And though the family had no evidence or relationship to the tribe, Elizabeth Warren never questioned it, she wrote in her memoir. It was her family story, she would say.
The story of Warren‘s family traces the history of Cherokee Nation, but we sit on opposite sides of that history. Like many other white families, Warren’s ancestors replaced the truth of their complicity in Cherokee dispossession with a tale of being Cherokee. If that’s not wrong, if that’s not racist, I don’t know what is.
The monster I am trying to wrestle to the ground is not one white woman who claimed to be Cherokee. It is the hundreds of thousands of white people claiming to be Cherokee and the broad social acceptance that emboldens them. It threatens the future of my tribe.
I do not fault Warren for believing what she was told as a child. But in 2019, Warren isn’t a kid anymore. She is a United States senator running for president. If she is not in a position that demands accountability and truth, who is?
The center of this controversy is not Warren’s political career; it is Cherokee sovereignty and self-determination. Warren is just the most public example.
I already know what people will say. They will say that many people have Cherokee ancestors but don’t have evidence, falsely believing that Cherokees were too primitive to have a paper trail when our literacy rates were higher than those of white people. They will say their great-grandmother was too proud to sign the Dawes Rolls, falsely believing the U.S. government gave Indians the option when some who refused were arrested. They will say the DNA test proves Warren is Cherokee, falsely believing that Western science knows Indigenous communities better than we know ourselves.
Tribal affiliation and kinship determine Cherokee identity — not race or biology. At a time when the far right is equating Native identity with race to undermine Native rights, the myths that lie in the wake of Warren’s missteps are extremely dangerous. Yes, she apologized, but we are left cleaning up the mess she made.
Warren’s policy platform and admission of harm are a good first step. But a complete apology means working to repair the harm you caused. There is no one in the world who has more power to correct the harmful myths perpetuated by this saga than Elizabeth Warren herself.
She simply needs to state she does not have a Cherokee ancestor and that she was wrong to claim one. Until then, Cherokee people will be left fighting the mountain of confusion she caused. And I am terrified we will lose.
Rebecca Nagle is a writer, advocate and citizen of Cherokee Nation living in Tahlequah, Oklahoma.
Research assistance by Cherokee genealogist Twila Barnes.
Atlanta has some of the worst traffic in the United States. Drivers there average two hours each week mired in gridlock, hung up at countless spots, from the constantly clogged Georgia 400 to a complicated cluster of overpasses at Tom Moreland Interchange, better known as “Spaghetti Junction.” The Downtown Connector — a 12-to-14-lane megahighway that in theory connects the city’s north to its south — regularly has three-mile-long traffic jams that last four hours or more. Commuters might assume they’re stuck there because some city planner made a mistake, but the heavy congestion actually stems from a great success. In Atlanta, as in dozens of cities across America, daily congestion is a direct consequence of a century-long effort to segregate the races.
For much of the nation’s history, the campaign to keep African-Americans “in their place” socially and politically manifested itself in an effort to keep them quite literally in one place or another. Before the Civil War, white masters kept enslaved African-Americans close at hand to coerce their labor and guard against revolts. But with the abolition of slavery, the spatial relationship was reversed. Once they had no need to keep constant watch over African-Americans, whites wanted them out of sight. Civic planners pushed them into ghettos, and the segregation we know today became the rule.
At first the rule was overt, as Southern cities like Baltimore and Louisville enacted laws that mandated residential racial segregation. Such laws were eventually invalidated by the Supreme Court, but later measures achieved the same effect by more subtle means. During the New Deal, federal agencies like the Home Owners’ Loan Corporation and the Federal Housing Administration encouraged redlining practices that explicitly marked minority neighborhoods as risky investments and therefore discouraged bank loans, mortgages and insurance there. Other policies simply targeted black communities for isolation and demolition. The postwar programs for urban renewal, for instance, destroyed black neighborhoods and displaced their residents with such regularity that African-Americans came to believe, in James Baldwin’s memorable phrase, that “urban renewal means Negro removal.”
This intertwined history of infrastructure and racial inequality extended into the 1950s and 1960s with the creation of the Interstate highway system. The federal government shouldered nine-tenths of the cost of the new Interstate highways, but local officials often had a say in selecting the path. As in most American cities in the decades after the Second World War, the new highways in Atlanta — local expressways at first, then Interstates — were steered along routes that bulldozed “blighted” neighborhoods that housed its poorest residents, almost always racial minorities. This was a common practice not just in Southern cities like Jacksonville, Miami, Nashville, New Orleans, Richmond and Tampa, but in countless metropolises across the country, including Chicago, Cincinnati, Denver, Detroit, Indianapolis, Los Angeles, Milwaukee, Pittsburgh, St. Louis, Syracuse and Washington.
While Interstates were regularly used to destroy black neighborhoods, they were also used to keep black and white neighborhoods apart. Today, major roads and highways serve as stark dividing lines between black and white sections in cities like Buffalo, Hartford, Kansas City, Milwaukee, Pittsburgh and St. Louis. In Atlanta, the intent to segregate was crystal clear. Interstate 20, the east-west corridor that connects with I-75 and I-85 in Atlanta’s center, was deliberately plotted along a winding route in the late 1950s to serve, in the words of Mayor Bill Hartsfield, as “the boundary between the white and Negro communities” on the west side of town. Black neighborhoods, he hoped, would be hemmed in on one side of the new expressway, while white neighborhoods on the other side of it would be protected. Racial residential patterns have long since changed, of course, but the awkward path of I-20 remains in place.
By razing impoverished areas downtown and segregating the races in the western section, Atlanta’s leaders hoped to keep downtown and its surroundings a desirable locale for middle-class whites. Articulating a civic vision of racial peace and economic progress, Hartsfield bragged that Atlanta was the “City Too Busy to Hate.” But the so-called urban renewal and the new Interstates only helped speed white flight from Atlanta. Over the 1960s, roughly 60,000 whites left the city, with many of them relocating to the suburbs along the northern rim. When another 100,000 whites left the city in the 1970s, it became a local joke that Atlanta had become “The City Too Busy Moving to Hate.”
As the new suburbs ballooned in size, traffic along the poorly placed highways became worse and worse. The obvious solution was mass transit — buses, light rail and trains that would more efficiently link the suburbs and the city — but that, too, faced opposition, largely for racial reasons. The white suburbanites had purposefully left the problems of the central city behind and worried that mass transit would bring them back.
Accordingly, suburbanites waged a sustained campaign against the Metropolitan Atlanta Rapid Transit Authority (MARTA) from its inception. Residents of the nearly all-white Cobb County resoundingly rejected the system in a 1965 vote. In 1971, Gwinnett and Clayton Counties, which were then also overwhelmingly white, followed suit, voting down a proposal to join MARTA by nearly 4-1 margins, and keeping MARTA out became the default position of many local politicians. (Emmett Burton, a Cobb County commissioner, won praise for promising to “stock the Chattahoochee with piranha” if that were needed to keep MARTA away.) David Chesnut, the white chairman of MARTA, insisted in 1987 that suburban opposition to mass transit had been “90 percent a racial issue.” Because of that resistance, MARTA became a city-only service that did little to relieve commuter traffic. By the mid-1980s, white racists were joking that MARTA, with its heavily black ridership, stood for “Moving Africans Rapidly Through Atlanta.”
Even as the suburbs became more racially diverse, they remained opposed to MARTA. After Gwinnett voted the system down again in 1990, a former Republican legislator later marveled at the arguments given by opponents. “They will come up with 12 different ways of saying they are not racist in public,” he told a reporter. “But you get them alone, behind a closed door, and you see this old blatant racism that we have had here for quite some time.”
Earlier this year, Gwinnett County voted MARTA down for a third time. Proponents had hoped that changes in the county’s racial composition, which was becoming less white, might make a difference. But the March initiative still failed by an eight-point margin. Officials discovered that some nonwhite suburbanites shared the isolationist instincts of earlier white suburbanites. One white property manager in her late 50s told a reporter that she voted against mass transit because it was used by poorer residents and immigrants, whom she called “illegals.” “Why should we pay for it?” she asked. “Why subsidize people who can’t manage their money and save up a dime to buy a car?”
In the end, Atlanta’s traffic is at a standstill because its attitude about transit is at a standstill, too. Fifty years after its Interstates were set down with an eye to segregation and its rapid-transit system was stunted by white flight, the city is still stalled in the past.
It’s a shame because there are some neat things there (K. has a cousin who lives in the city) but that’s been my impression every time I’ve been there: imagine how nice this city would be if the white people hadn’t preferred to gut it rather than share it with black people.
In 2013, federal authorities began investigating shady “body brokers,” businesses that sell donated bodies for research purposes, such as Arizona’s for-profit Biological Resource Center. That company was shuttered following an FBI raid in 2014, but eyewitness details are being made public for the first time—and they’re almost too grisly to believe.
In a sworn statement, former FBI agent Mark Cwynar stated that he saw “various unsettling scenes” at BRC, including “infected heads,” “a bucket of heads, arms and legs,” and a cooler “filled” with male genitals, the Arizona Republic and KTVK report. Additionally, Cwynar says he discovered a macabre wall hanging: a woman’s head sewn onto a man’s torso “in a ‘Frankenstein’ manner.”
According to Reuters, agents ultimately found 1,755 human body parts at the facility, filling 142 body bags weighing 10 tons.
Matthew Parker, another former FBI agent who worked on the BRC case, told Reuters that the experience of moving body bags from the facility led to a PTSD diagnosis. “I couldn’t sleep at night after seeing that,” said Parker. “It looked like a junkyard chop shop where they are just ripping things apart.”
Cwynar’s testimony has come to light as part of a lawsuit by 33 people who say that BRC acquired their family members’ bodies through “false statements.” Plaintiff Troy Harp, who gave his mother’s and grandmother’s bodies to BRC, told KTVK that he believed they would be used for scientific research.
“Cancer, and leukemia and whatever else, using sample cells,” said Harp. “That’s what I was told.”
Reuters, however, reports that at least 21 bodies donated to BRC were later used by the U.S. Army for blast experiments to study the effects of roadside bombs.
The BRC case may be a particularly gruesome example, but it points to larger issues of transparency and regulation in this little-known industry. Often offering free cremation services to grieving families, body brokers sell donated bodies on a research market where a human head can be bought for as little as $300. (Documents show that BRC priced whole cadavers at $5,000 in 2013.)
In almost every state, selling human body parts not intended for transplant is legal, with the exception of fetal remains. In recent years, Arizona and Colorado have passed laws to regulate body brokers, but the vast majority of states do not have explicit rules for how donated cadavers must be stored or sold.
After pleading guilty to illegal control of an enterprise, BRC owner Stephen Gore wrote in a letter to a judge that the business was a “labor of love” that had overwhelmed him. “This was an industry that had no formal regulations to look to for guidance,” he wrote. Gore was ultimately sentenced to one year of deferred jail time and four years of probation.
Harp told KTVK that he wants more federal regulation of the industry.
“This is a horror story,” said Harp. “It’s just unbelievable. This story is unbelievable.”