22 November, 2020

Why I am Bearish on Substack

The big trend in writing and journalism in the year 2020—other than the New York Times’s continued conquest of everything in print—is the flowering of the Substackerati.[1] Hardly a day goes by without some famous figure announcing their hope that you will become a new subscriber to a new newsletter they are writing on this new thing called Substack. This thing’s rise is a glory to behold—but a glory whose shine I am deeply skeptical of.

I will admit at the start of this post that my bearishness on all things Substack may just come down to an obstinate old-fashionedness on my part. I am a child of the old blogosphere. I am nostalgic for the old ways. It is possible that all I am about to write confuses what I wish for with what I see. But my skepticism of the Substack model is rooted in my experience of writing on the web over the last decade and a half, and informed by my research into what made good writers tick in the decades and centuries before that.

I invite you to read some of these investigations (start with “The World Twitter Made.” Also relevant: “Requiem For the Strategy Sphere,” “Public Intellectuals Have Short Shelf Lives,” “Life in the Shadow of the Boomers,” “Book Notes: Strategy, a History,” and “On Adding Phrases to the Language.”) A running theme in all of these essays is the importance of seeing individual authors not as individual authors, but as voices in a chorus. No writer is an island. If a “public voice” is inspired to spend hours massaging paragraphs and digging up references, it is because she has something to prove, and more important still, someone to prove it to. She writes in response to ideas she has heard or read. She feels compelled to add her voice to a larger conversation. The best thinkers speak to more than their immediate contemporaries, but without that contemporary argument in the background few would bother speaking at all.

Substack is the medium of the solo artist. High-rolling soloists at that. Like Patreon, OnlyFans, book publishing generally, or any other medium where creators connect with the masses sans bundled packaging, Substack has (and will continue to have) a power-law distribution. The biggest names will earn in the hundreds of thousands; the median user will scrape together $100-200 a month, at best. If measured in page hits instead of dollars, the same could have been said for the high and low tiers of the old blogosphere as well. Back then the world's most popular independent writers occasionally drove national news cycles. After a few weeks of feeble posting, the vast majority of bloggers in the lower tier gave up writing altogether (by 2009 Technorati was reporting that there were 133 million blogs in the world—and a full 95% of them had been abandoned).[2]

However, the blogosphere allowed for a healthy middle layer of independent writers that existed between nationally prominent blogs and your next-door neighbor's defunct site on Typepad. What allowed this middle tier to thrive? Other middle-tier bloggers! Each writer was embedded in her own little archipelago of other writers all working on the same topics. It might be devoted to climate science, counterinsurgency theory, Black politics, New York fashion, Mormon Mommy blogging, Harry Potter-themed slash fan-fiction, or something else altogether, but the archipelago was there. Other bloggers—along with a few of the long-term commenters shared by the various blogs—were the intended audience of most pieces. Others' pieces were the inspiration for one's own. Bloggers were nodes on a network, and it was the network that sustained them.

The current intellectual sphere (centered on Twitter) makes interaction even easier. Its cost is an eroding sense of community. The borders between different blogging communities were permeable, but they were borders. On Twitter everyone and everything is tossed together in one great jumble. Users are always one bad tweet away from upsetting the entire internet. In this Twitter-driven intellectual scene, conversation is vigorous but vapid. Tweeting favors performance over coherence, anger over insight. The show goes on but is ever less worth the watching.

At some point a correction was due. The driving force behind the correction may simply be fatigue with this state of affairs. It may also be rising Zoomers, "social media natives" who joined Facebook and Instagram well aware that their parents and teachers were peering at what they posted, trained from adolescence to shy away from public eyes. Whatever the cause, the new trend is clear: conversations are moving onto platforms like Slack, Discord, and Substack. In place of the easily searched, permanent records of yesterday, we find conversations behind closed doors, reserved for followers, fans, and fellow travelers. If the old platforms were designed to catalog your best moments and then bounce them across the breadth of the world wide web, this new suite of platforms is intentionally opaque. Even a private bulletin-style message board (of the sort that reached peak popularity c. 2007), just as closed off from the general public as a private Discord or Slack is today, was legible in a way these new chat apps are not. Those old forums were designed so that members could easily locate past discussions of a certain topic and read them in full. Forum etiquette often demanded they do so. Try to do the same on Slack!

What Slack and Discord are to the old forums, Substack newsletters are to the old blogs. All three are closed off from the outside, difficult to navigate, and impermanent. That impermanence is relative—conversations on Discord do not disappear, and anyone who receives a Substack newsletter can save it if they wish. But most Discord messages are buried in the stream, never to be read again. Most Substack send-outs are deleted from inboxes as soon as they are read. None of it is indexed for the search engines. With the rise of these new platforms we see the death of old hopes. The dream of an unbounded internet was realized, and we discovered a nightmare. Scarred, participants in public intellectual life retreat behind the battlements. We revert to something like the internet of the early aughts, but with apps.

The great question is whether this new internet will be able to sustain meaningful intellectual exchange. By default, Substack splits intellectual activity into vertical silos, with readers at the bottom and authors at the top, but no horizontal connections between them. In a world where most content exists behind paywalls and is distributed through private channels, neither the high-tempo conversations driven by Twitter virality nor the blogosphere's slower cycle of post and response will be possible. Both of those systems assume that readers have access to the full conversation taking place. More importantly, both systems assume that writers have full access to the full conversation that prompts them into writing. On Substack, there are too many walls dividing up the garden.

The history of 21st century web publishing is not the rise and fall of individual writers, but the rise and fall of entire communities of writers. This is the central contradiction in Substack's quest to remake intellectual life. It is one thing to pay $10 a month to support your favorite writer. It is harder to pay $10 a month to each of the 10 or 15 other writers who make your favorite writer's writing possible in the first place. Steam rarely rises higher than the source. Those writers are a necessary part of a healthy media ecology. Lose the source, and the steam goes with it.

Prior to blogging, these communities were usually centered on magazines and journals, which gathered the various voices committed to an intellectual project and packaged them together under one masthead. One can imagine something similar happening on Substack: a meta-newsletter that delivers content from the best of various Substackerati. But in making such a pivot, all Substack will have done is recreate a media format that is currently failing: the paywalled online magazine. The start-up costs of a new Substack-based magazine will be substantially lower than hiring web developers to create one's own site. But low enough? Low enough to save a format already dying? Is a mailbox delivery system really enough to distinguish future Substack magazines from existing journals sitting behind paywalls or begging for support on Patreon?

That is a financial take on the problems of a Substack-based epistemic community. But the intellectual problems of such a community may prove just as important. Substack favors those who already have large megaphones. A Substack-based intellectual sphere will be intensely, if unintentionally, hostile towards new blood. Magazines and newspapers solve this problem by packaging new authors that might appeal to their readership in the same issues as big names. The blogosphere solved this problem through comments and trackbacks, which allowed bloggers and their readers to discover other quality writers worth following. There is no mechanism for this sort of thing on Substack. A minor writer on Substack will not grab the attention of a major one; readers will never stumble from the big to the small.

This is a recipe for intellectual sterility. A media ecosystem composed of the New York Times, a few other large newspapers, and a swarm of hungry Substackerati will starve itself out. The big Substack names will continue to rake in subscriptions, of course, but what will they have to talk about? Only the same old ideas they had been playing with for decades. These lone agents will lack a milieu to work against. The dominance of a few big newspapers and a few big newsletters will guarantee that no milieu of new writers will form. The most interesting conversations will be happening in private Slacks and Discords, or on even newer apps like Clubhouse, in all cases available only to the select few who began the game prominent enough to be invited into gilded circles.

I do not think this is sustainable. I admit there is a possibility that I am letting my normative preferences cloud my objective view of the situation. On the other hand, this is an issue in which I have "skin in the game." I am putting my money where my mouth is. I am currently working with a Wordpress development team to move The Scholar's Stage to a more professional domain, complete with a suite of additional features. (Soon I will be releasing some Scholar's Stage polls to discover a bit more about what features and content my readership would most like to see on the revamped site, and what sort of things might induce them to contribute to my Patreon). This decision to not transition to Substack reflects both my skepticism about the new platform and my personal commitment to an intellectual sphere that is both public and healthy. We lose something when intellectual discussion retreats entirely behind the battlements. I do not want to hasten that loss.


If you would like to read more of my musings on the rise and fall of epistemic communities, you will find the essays “The World Twitter Made,” “Requiem For the Strategy Sphere,” “Public Intellectuals Have Short Shelf Lives,” “Life in the Shadow of the Boomers,” “Book Notes: Strategy, a History,” “On the Angst of the American Journalist,” and “On Adding Phrases to the Language” of great interest. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] I take this wonderful neologism from Clio Chang, "The Substackerati," Columbia Journalism Review (Winter 2020).

[2] Douglas Quenqua, "Blogs Falling in an Empty Forest," The New York Times (5 June 2009).

19 November, 2020

Do Not Choose Susan Rice


There is a grand tradition in American politics of bashing the other side's nominees. In the spirit of that tradition, I have a new piece out in the American Conservative that questions whether Susan Rice is fit to be the Biden administration’s nominee for Secretary of State. Rice is a controversial figure for all sorts of reasons. Most go back to her role in the Clinton administration’s response to the Rwandan genocide, or to her handling of the Libyan and Syrian crises in the Obama administration. Most famously, Rice became the Republicans' punching bag of choice in 2013, when the terrorist attack in Benghazi was being used for maximum partisan advantage.

I have always considered the Benghazi stuff to be a bit spurious. Larger questions surrounding the wisdom of American intervention in Syria and Libya have far more merit, and I was disappointed—if not surprised—to find that Rice’s memoir does not honestly grapple with the poisonous legacies of America's decision to intervene in either of these conflicts. But my critique of Rice in the American Conservative is grounded not in her behavior in the Near East, but in the Far.

The Biden administration has talked a great deal about the need to ground China policy in a cooperative framework with America’s allies. But it is precisely Susan Rice's track record on this issue that makes Asian diplomats and officials so concerned about her potential return to power. There are two issues at play here. The first involves Rice’s personal foibles and her consequent dismal reputation across the Indo-Pacific. But also at issue is a broader set of misconceptions about the respective strengths and weaknesses of the Obama and Trump administrations’ approaches to Asia. My hope is that now that the election is over there is room to be a bit more clear-eyed on these questions. My fear, however, is that the Biden administration will succumb to the sore temptation to abandon any genuine good done over the last four years simply because a Trump official was the one who did it. An evenhanded evaluation of the Trump era’s mistakes and accomplishments in this domain is needed, so that the new administration may jettison less helpful schemes without throwing out the policies that increased America's credibility and leverage in the region.

I am sure that today’s piece will be the first of many in this vein. It should be seen as building on work written by James Crabtree and Jeremy Stern earlier this year.[1] Both have pointed out that Trumpish diplomacy was not nearly as subversive in Asia as it was in Europe. I particularly recommend reading Stern’s eloquent Palladium essay, “America’s New Post-Western Foreign Policy,” which is one of the best pieces I have read on American foreign policy in the 2010s. I quote from Stern in my new American Conservative piece, but if you only have enough time today to read one essay, read his, not mine.

Stern observes that while Obama had a declared commitment to Asia and seemed destined to be the harbinger of a new Pacific order, his administration's foreign policy was tradition-bound, wedded by ideology and personal style to the values, institutions, issue sets, and personalities of the transatlantic relationship.

Susan Rice personifies the worst tendencies of this strain of Obama-era diplomacy. This is why so many foreign policy figures in countries like Taiwan, Japan, Australia, India, and the ASEAN states dislike her. I provide several quotations to this effect in AmCon. Some might accuse me of cherry-picking here, and there is some merit to that critique, but if I am cherry-picking—well, Susan Rice is just about the only diplomatic figure I could cherry-pick about. Even if not every diplomat despises Rice, no other American official generates enough discontent that it leaks out into the open press at all (there are a few more common punching bags in private; one day it might be necessary to write about them too).

The one group of people who unequivocally appreciate Susan Rice are the Chinese. I describe why this is so in my piece:
Above all else, Rice has earned a well-deserved reputation as the senior American official most willing to sacrifice the interests of American partners to chase what Rice calls “expanded cooperation” with Beijing. Susan Rice credits herself with a commanding role in the implementation of Obama’s China strategy. In her memoir Rice describes why she, as National Security Advisor, needed to take control of America’s relationship with China, instead of allowing another NSC “Principal” (like the Secretary of State) to take charge: 
China has long preferred dealing directly with the White House on bilateral affairs . . . [and] given the complexity of the relationship, its many economic and strategic facets, and the need to ensure that multiple disparate agencies sing from the same hymnal, strong White House leadership makes sense. As NSA I embraced this responsibility. 
This framing may seem innocuous, but its sentiments raise alarm bells across Asia. By elevating U.S.-China relations as the bilateral relationship in American foreign policy (Rice often describes it as “the most consequential bilateral relationship in the world”), the one realm of cooperation that demands continuous cross-domain coordination from the White House itself, Rice devalues America’s actual partners in the region. Taiwanese, Filipino, Australian, and Indian diplomats (to say nothing of their Singaporean, Thai, or Vietnamese counterparts) know that if all aspects of the China relationship are cross-linked, then their interests will always be traded out for better Chinese behavior with respect to Iran, North Korea, climate change, the cyber realm, or any other domain the Chinese decide to pull a tantrum over that week. This framing is even more humiliating for the Japanese. It forces them into an undeserved second-class spot. Though Tokyo is America’s most important ally, the hub on which U.S. foreign policy depends, and a power more crucial to American-led financial and macroeconomic coordination than Beijing has ever been, Tokyo was not given the same bilateral access to the Obama White House that Rice arranged for the Chinese Communist Party. Susan Rice owned the China relationship; Japanese concerns did not command the attention of any of the Principals. Obama hardly spoke to Putin without first hearing Merkel’s take; the Japanese were given little input into American strategy for managing China.[2]

The Trump administration, in contrast, made a point of seeking Japanese input on all aspects of China and North Korea policy, both at the level of the President and at levels a few rungs below him. In two years Trump had four times as much personal contact with Prime Minister Abe as Obama had during his two terms as President. As I note,
Before Trump, this sort of cooperation and engagement was reserved for the Chinese, favored European allies, and Middle Eastern countries then subject to American counterinsurgency campaigns. Rice’s autobiography reflects this focus. Of its 482 pages, only 13 cover China—most of which are spent celebrating the administration’s 2015 cyber agreement with Beijing (which never had a credible enforcement mechanism, was still not fully implemented at the end of Obama’s tenure, and was abandoned by the Chinese shortly after he left office) and the administration’s unsuccessful efforts to convince the communists to take a harder line against North Korea. But these 13 pages are a mountain compared to her sparse treatment of America’s Asian partners. Across the book, Japan and India are only given a few scattered mentions. The U.S.-Philippines relationship is reduced to a sentence. There is no entry for “Taiwan” in the index. 
Rice boasts in her book that she “understands the interests and the idiosyncrasies” of the Chinese, but never demonstrates similar knowledge or concern with any other power in the region.[3]

There were other examples I could have given in the column had I the space to do so. Rice's declarations on Asia do not inspire confidence. She was the first American official to adopt China's "new type of great power relations" framing as official American policy. Since leaving office she has repeatedly affirmed the supremacy of the U.S.-China bilateral relationship and continued to argue that all aspects of that relationship should be cross-linked to each other. See this interview for a good example of that sort of talk—and see that terrible quote at the end:
Rice also told Rose that although China has been more aggressively building a presence in the South China Sea, she believes the U.S. has “managed” them in that regard.[4]

We managed them! America did not manage the Chinese in the South China Sea; they managed us. That an American official could look at the aftermath of the Scarborough Shoal incident and spin it as some sort of victory in relationship management—well, is it any surprise that she has such a low reputation in the region?

Biden ran on a return to normalcy. He also ran as a friend to the American alliance system. The tension here is that America's Indo-Pacific partners do not want a restoration of the 2016 status quo. They have benefited in too many ways from being at the diplomatic center of America's newfound strategic approach to China. The Trump administration's abrasive, tactless, and at times malicious dealings with our European partners were unnecessary, and the evils done there must be undone. But another factor behind European angst is the larger shift in American attention, resources, and commitments from West to East. This shift must not end. It would be a mistake for Biden to restore the old Obama administration's obsession with European issues and Middle Eastern crises on the one hand, and the privileged position it gave to Chinese perspectives on the other. That is what is actually at stake in Rice's appointment.

If you found this take on American policy in Asia of interest, you might also find the posts "Why I Fear For Taiwan," "Losing Taiwan is Losing Japan," "The Road to Beijing Runs Through Tokyo," and "Give No Heed to the Walking Dead" worth your time. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Jeremy Stern, “America’s New Post-Western Foreign Policy,” Palladium (4 September 2020); James Crabtree, "Asian leaders underestimate the danger of Trump's reelection," Nikkei (27 February 2020); "Biden Has a Serious Credibility Problem in East Asia," Foreign Policy (10 September 2020).

[2] Tanner Greer, "Susan Rice is Asia's Worst Nightmare," American Conservative (19 November 2020).

[3] Ibid.

[4] "Susan Rice: 'We can’t afford to play fast and loose' with China," CBS News (17 January 2017).

02 November, 2020

Plagues of Hate

Samuel Cohn's Epidemics: Hate and Compassion from the Plague of Athens to AIDS is a true doorstop of a book, encyclopedic in ambition, coming in at a full 656 pages of prose and citations. In a new book review over at the Washington Examiner I describe the book's origins:

In the summer of 2009, Samuel Cohn, historian of plague and malady, was contacted by the New York Times. A new strain of deadly flu was then sweeping the land, and the New York Times was concerned that the popular name for the new disease, "Mexican swine flu,” might lead to hate crimes targeting a vulnerable minority. This is what had happened to the Jews during the Black Death, the editors reminded Cohn, and the world needed a man with Cohn’s expertise to warn that it might happen again. The only problem: Cohn did not think it would happen again. In the historical case studies he was most familiar with, epidemics led not to acts of hate but to acts of compassion. Often that compassion crossed class and racial lines.

The New York Times declined to publish Cohn’s more optimistic op-ed. For that, we should be thankful. As Cohn tells us in the introduction to his 656-page tome, Epidemics: Hate and Compassion from the Plague of Athens to AIDS, the New York Times’s rejection launched a decade-long research program to discover who was right. This sort of research would not have been possible in an earlier era: To write his book, Cohn used computer search tools to locate all mentions of plague and sickness in the classical corpus, scan through hundreds of medieval documents, and scour thousands of epidemic reports in databases of newspapers from 19th- and 20th-century Russia, Italy, Germany, England, America, Canada, Australia, and India.[1]

This book review builds on the earlier spate of books I read back in March on epidemics in world history (see the post "Bullet Reviews: A Bunch of Posts on Epidemic and Disaster Response"). I secured a copy of Epidemics: Hate and Compassion for review back in April, but due to its immense size and other writing commitments, only finished it last month. I am glad I read it, however, and wish many more had read it before our current pandemic began.

Cohn's central finding is that the vast majority of epidemics had a positive, "pro-social" effect on the peoples plagued by them. We do not appreciate this, for even the historically minded among us tend to be most aware of the outliers. Thucydides' eloquent description of the disintegration of Athenian society after being ravaged by a still-unidentified disease is well known; the Black Death, with its mass attacks on Jews, is another common historical touchstone. How Europeans reacted to the hundreds of epidemics between those two dates is not often reported. Nor is the social reaction to the diseases of the next millennium especially well represented in our historical memory. This is partially due to a historiographical quirk. Modern historical study of disease owes much to the social impact of the AIDS epidemic of the 1980s, which prompted historians to search the past for examples of disease-related stigma and persecution.

One finds what one looks for. American commentators in the 1980s were quick to connect the American response to AIDS to the Medieval response to the Black Death (a claim Cohn rejects as utterly sophistical).[2] They were also able to find many case studies of hate and discrimination in other epidemics across Western history. Cohn does not reject these comparisons outright (though in some cases, such as his chapter on syphilis in Early Modern Europe, he comes close to doing so), but he does suggest they suffer from a narrow, blinkered view of the human past:

With this picture of disease-fueled hate, little attention was paid to how different social classes might align as the targets or perpetrators of violence. More astoundingly, these views have reflected little, if at all, on the power of past plagues to ignite compassion, bringing volunteers to make sacrifices for complete strangers across class, ethnic, and racial divides. Nor do these accounts recognize epidemics’ political effects that mobilized citizens to combat governmental neglect in medical and social services, dilapidated hygienic infrastructures, and unjust, abusive, and ineffectual controls that stigmatized and persecuted sectors of the population. Yet from the 1990s, writing on AIDS began to shift from a view darkly centered on blame to one forged by compassion and political activism.[3]
As that last sentence hints, even the response to AIDS was not the simple story of discrimination and hate we are familiar with today. More recent historians of the disease describe how in America and Europe AIDS strengthened LGBTQ solidarity and the organizational capacity of LGBTQ organizations, spurred mass voluntarism to aid AIDS patients, led to legislation like the Americans with Disabilities Act, and ultimately improved the relationship between gay patients and citizens on the one hand and medical and government authorities on the other. In Africa, AIDS was a more general crisis, and did not have any special association with homosexuality, but produced similar waves of voluntarism and association-forming on behalf of AIDS patients and orphans.[4]

If the traditional narrative surrounding AIDS tells only half the story, attempts to retcon the hateful half of that story onto epidemics past are even worse practice. With a few exceptions (which we will get to in a minute), these were isolated incidents out of sync with the general social response to the diseases in question. Most epidemics bring out the best in humanity. Shared suffering creates solidarity, compassion, and an admirable determination to sacrifice for the sake of fellow human beings. If you have read very much in the disaster-management literature this will not surprise you. Resilience and selflessness are hallmarks of human behavior in the face of most terrible disasters.[5]

But not every disaster. Cohn identifies three diseases that reliably brought out the violent side of our natures: plague, cholera, and smallpox. With the exception of the Black Death, this violence was all a 19th-century affair. As I write in my review:

Significant disease-related violence would not return to European shores until the 1800s, when large-scale riots, several with tens of thousands of rioters, accompanied breakouts of cholera, and then, later, a new strain of the bubonic plague. Similar disturbances on a slightly smaller scale would occur when these diseases arrived in North America and the Indian subcontinent. None of this violence was targeted at the sick, nor was any of it directed toward a scapegoated minority. In all of these cases, the main target of popular anger was the state. Again and again, poor communities subjected to quarantine, centralized isolation schemes, or invasive surveillance would rise in revolt against government action they believed was discriminatory or oppressive. Over the last two centuries, Cohn found, the main cause of pandemic-related disorder was not disease but government attempts to control diseased citizens.

The only modern case study to partially fit the received understanding of epidemic violence came with the smallpox epidemic of 1880. Cohn chronicles 72 separate episodes in which people murdered smallpox patients or burned down the medical stations, churches, or hospitals that sheltered them in order to keep the infection from spreading to their communities. This happened in the high age of lynch mobs, and it comes as no surprise to learn that a disproportionate percentage of those murdered were Black.[6]

Cohn admits that he does not know what made the reactions to these diseases so much more explosive than reactions to equally virulent pandemics like Yellow Fever, though he does suggest that the 19th-century setting—a time when disease was known to be caused by contagious bacteria, but in which trust in governments and doctors was low—plays a part. I speculate on some possible reasons in my review and encourage you to read my speculations over at the Washington Examiner. Here I would like to raise a few ancillary points that did not fit inside its confines.

First: the encyclopedic, comparative, and quantitative nature of Cohn's study makes it extremely useful, but in some ways painfully limited. Cohn castigates other historians for not checking their generalizations about diseased societies against comparative data. He is right to do so. But reliance on broad yet shallow sweeps means that Cohn is unable to even attempt answers to some of the interesting questions posed by his study. Smallpox did not universally lead to hatred, violence, or abandonment. The peak for this sort of thing was the 1880s, and most specifically, the United States in the 1880s. But why? What about American society in that decade led Americans to treat smallpox victims so viciously, yet respond to a Yellow Fever epidemic that same decade so nobly? Why was the response to smallpox more measured later and earlier in the century? Answering this question would require a deep dive into the era, a dive deeper than Cohn's methods allow. Hopefully some graduate student looking for a dissertation topic might pick up Cohn's book (or my review of it!) and spend a few years trying to figure this out.

Second: the other striking thing about this 1880 wave of violence is that no one has reported on it before. There are no monographs on this particular smallpox epidemic, and no secondary source before Cohn has described the violence that came from it. Findings like these make it difficult for me to take books like Peter Turchin's Ages of Discord seriously. Folks like Turchin try to measure and model cycles of "sociopolitical instability" in American life. The 72 violent attacks that accompanied this smallpox epidemic seem like a necessary data point in any attempt to quantify social violence, but no one had any idea that this wave of violence even happened until Cohn published his book two years ago. How many other waves of violence are out there, real but not yet known?

Third: In the face of disease and disaster, the historical norm is solidarity and selflessness. Most disasters bring societies closer together. I recall William Yang's many tweets from Wuhan at the peak of the outbreak there, each a testament to the bravery and sacrifice of the people of that city. There, at least, the coronavirus was a "pro-social" event. I am disturbed by America's failure to reap the same benefits from disaster. It is true that we have not seen mass panic or violent scapegoating of the sort feared back in March (one can hear Cohn reply in the background, "well of course you haven't; violent, disease-inspired scapegoating of minorities has only happened a few times in all of human history!"). But the great crisis of our times has not brought Americans together. It has not increased our sense of solidarity or occasioned regular, spectacular acts of selflessness that cross over lines of race, class, and creed. We have seen bravery from emergency responders, nurses, and doctors, but those are people we pay to be brave. Their spirit has not spread. Shared suffering has not bound us together. America stands more divided now than at any time in living memory.

These are the hallmarks of a sick society. This is an affliction of the spirit, not the lungs. All the worse for us. There are no vaccines for sickness of the soul.

If you would like to read my past reviews of books on pandemics or resilience, see the posts "A Bunch of Books on Pandemics and Disaster Response" and "On Cultures That Build." If, on the other hand, you want to further explore the implications of this post's last paragraph, you might find "On Days of Disorder" and "On Sparks Before the Prairie Fire" of interest. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Tanner Greer,  "Do Plagues Make us Hate?" Washington Examiner (29 October 2020).

[2] Writes he:

So the annihilation of hundreds of communities down the Rhine in 1348–9 with men, women, and children burnt alive on islands or in their synagogues, ending medieval Jewish civilization in its heartland, was the equivalent of burning one house in Arcadia, Florida, where no one was injured?
Samuel Cohn, Epidemics: Hate and Compassion from the Plague of Athens to AIDS (Oxford: Oxford University Press, 2018), 547.

[3] Ibid., 557.

[4] Ibid., 550–557.

[5] See Norris R. Johnson, “Panic and the Breakdown of Social Order: Popular Myth, Social Theory and Empirical Evidence,” Sociological Focus 20, no. 3 (1985), 171–83; Lee Clarke, “Panic: Myth or Reality?,” Contexts 1, no. 3 (2002), 21–26; Erik Auf der Heide, “Common Misconceptions About Disasters: Panic, the ‘Disaster Syndrome’ and Looting,” in The First 72 Hours: A Community Approach to Disaster Preparedness, ed. Margaret O'Leary (Lincoln: iUniverse Publishing, 2004), 340–80; Anthony R. Mawson, “Understanding Mass Panic and Other Collective Responses to Threat and Disaster,” Psychiatry: Interpersonal and Biological Processes 68, no. 2 (2005), 95–113; Ben Sheppard et al., “Terrorism and Dispelling the Myth of a Panic Prone Public,” Journal of Public Health Policy 27, no. 3 (2006), 219–45; Lee Clarke and Caron Chess, “Elites and Panic: More to Fear than Fear Itself,” Social Forces 87, no. 2 (2008), 993–1014; Chris Cocking, John Drury, and Steve Reicher, “The Psychology of Crowd Behaviour in Emergency Evacuations: Results from Two Interview Studies and Implications for the Fire and Rescue Services,” The Irish Journal of Psychology 30, no. 1–2 (2009), 59–73; Drury, Cocking, and Reicher, “Everyone for Themselves? A Comparative Study of Crowd Solidarity Among Emergency Survivors,” British Journal of Social Psychology 48 (2009): 487–506.

Rebecca Solnit's A Paradise Built in Hell: The Extraordinary Communities That Arise in Disaster (New York: Penguin, 2009) covers many of the same themes from a more historical, and less sociological, perspective.

[6] Greer, "Do Plagues Make us Hate?"

28 October, 2020

On Life in the Shadow of the Boomers


Ideology, which was once the road to action, has become a dead end.

—Daniel Bell (1960)

Yuval Levin's 2017 book The Fractured Republic: Renewing America's Social Contract in the Age of Individualism has several interesting passages inside it, but none so interesting as Levin's meditation on the generational frame that clouds the modern mind. Levin maintains that 21st century Americans largely understand the last decades of the 20th century, and the first decades of the 21st, through the eyes of the Boomers. Many of the associations we have with various decades (say, the fifties with innocence and social conformity, or the sixties with explosive youthful energy), says Levin, had more to do with the life-stage in which Boomers experienced these decades than anything objective about the decades themselves:

Because they were born into a postwar economic expansion, they have been an exceptionally middle-class generation, targeted as consumers from birth. Producers and advertisers have flattered this generation for decades in an effort to shape their tastes and win their dollars. And the boomers’ economic power has only increased with time as they have grown older and wealthier. Today, baby boomers possess about half the consumer purchasing power of the American economy, and roughly three-quarters of all personal financial assets, although they are only about one-quarter of the population. All of this has also made the baby boomers an unusually self-aware generation. Bombarded from childhood with cultural messages about the promise and potential of their own cohort, they have conceived of themselves as a coherent group to a greater degree than any generation of Americans before them. 
Since the middle of the twentieth century they have not only shaped the course of American life through their preferences and choices but also defined the nation’s self-understanding. Indeed, the baby boomers now utterly dominate our understanding of America’s postwar history, and in a very peculiar way. To see how, let us consider an average baby boomer: an American born in, say, 1950, who has spent his life comfortably in the broad middle class. This person experienced the 1950s as a child, and so remembers that era, through those innocent eyes, as a simple time of stability and wholesome values in which all things seemed possible. 
By the mid-1960s, he was a teenager, and he recalls that time through a lens of youthful rebellion and growing cultural awareness—a period of idealism and promise. The music was great, the future was bright, but there were also great problems to tackle in the world, and he had the confidence of a teenager that his generation could do it right. In the 1970s, as a twenty-something entering the workforce and the adult world, he found that confidence shaken. Youthful idealism gave way to some cynicism about the potential for change, recreational drugs served more for distraction than inspiration, everything was unsettled, and the future seemed ominous and ambiguous. His recollection of that decade is drenched in cold sweat.
In the 1980s, in his thirties, he was settling down. His work likely fell into a manageable groove, he was building a family, and concerns about car loans, dentist bills, and the mortgage largely replaced an ambition to transform the world. This was the time when he first began to understand his parents, and he started to value stability, low taxes, and low crime. He looks back on that era as the onset of real adulthood. By the 1990s, in his forties, he was comfortable and confident, building wealth and stability. He worried that his kids were slackers and that the culture was corrupting them, and he began to be concerned about his own health and weight as fifty approached. But on the whole, our baby boomer enjoyed his forties—it was finally his generation’s chance to be in charge, and it looked to be working out.
As the twenty-first century dawned, our boomer turned fifty. He was still at the peak of his powers (and earnings), but he gradually began to peer over the hill toward old age. He started the decade with great confidence, but found it ultimately to be filled with unexpected dangers and unfamiliar forces. The world was becoming less and less his own, and it was hard to avoid the conclusion that he might be past his prime. He turned sixty-five in the middle of this decade, and in the midst of uncertainty and instability. Health and retirement now became prime concerns for him. The culture started to seem a little bewildering, and the economy seemed awfully insecure. He was not without hope. Indeed, in some respects, his outlook on the future has been improving a little as he contemplates retirement. He doesn’t exactly admire his children (that so-called “Generation X”), but they have exceeded his expectations, and his grandchildren (the youngest Millennials and those younger still) seem genuinely promising and special. As he contemplates their future, he does worry that they will be denied the extraordinary blend of circumstances that defined the world of his youth.
The economy, politics, and the culture just don’t work the way they used to, and frankly, it is difficult for him to imagine America two or three decades from now. He rebelled against the world he knew as a young man, but now it stands revealed to him as a paradise lost. How can it be regained? This portrait of changing attitudes is, of course, stylized for effect. But it offers the broad contours of how people tend to look at their world in different stages of life, and it shows how Americans (and, crucially, not just the boomers) tend to understand each of the past seven decades of our national life. This is no coincidence. We see our recent history through the boomers’ eyes. Were the 1950s really simple and wholesome? Were the 1960s really idealistic and rebellious? Were the 1970s aimless and anxious? Did we find our footing in the 1980s? Become comfortable and confident in the 1990s? Or more fearful and disoriented over the past decade and a half? As we shall see in the coming chapters, the answer in each case is not simply yes or no. But it is hard to deny that we all frequently view the postwar era in this way—through the lens of the boomer experience. 
The boomers’ self-image casts a giant shadow over our politics, and it means we are inclined to look backward to find our prime. More liberal-leaning boomers miss the idealism of the flower of their youth, while more conservative ones, as might be expected, are more inclined to miss the stability and confidence of early middle age—so the Left yearns for the 1960s and the Right for the 1980s. But both are telling the same story: a boomer’s story of the America they have known. The trouble is that it is not only the boomers themselves who think this way about America, but all of us, especially in politics. We really have almost no self-understanding of our country in the years since World War II that is not in some fundamental way a baby-boomer narrative. [1]

When I first read this passage in 2018 I experienced it as a sort of revelation that suddenly unlocked many mysteries then turning in my mind. 

To start with: The 1950s did not seem like an age of innocent idyll or bland conformity to the adults who lived through it. It was a decade when intellectual life was still attempting to come to terms with the horrors of World War II and the Holocaust. Consider a few famous book titles: Orwell's 1984 (published 1949), Hersey's The Wall (1950), Arendt's The Origins of Totalitarianism (1951), Chambers' Witness (1952), Miller's The Crucible (1953), Bradbury's Fahrenheit 451 (1953), Golding's Lord of the Flies (1954), Pasternak's Doctor Zhivago (1957), and Shirer's Rise and Fall of the Third Reich (1960) were all intensely preoccupied with the weaknesses of liberalism and the allure of totalitarian solutions. For every optimistic summons to Tomorrowland, there was a Lionel Trilling, Reinhold Niebuhr, or Richard Hofstadter ready to declare Zion forever out of reach, hamstrung by the irony and tragedy of the American condition. Nor was it the wholesome era of memory. An age we associate with childlike obedience saw its children as anything but obedient—witness the anxiety of the age in films like The Wild One (1953), Rebel Without a Cause (1955), and Blackboard Jungle (1955). This age of innocence saw the inaugural issue of Playboy, the books Lolita (1955) and Peyton Place (1956) hitting the New York Times Fiction best seller list, the Kinsey reports topping the Non-fiction best seller list, and Little Richard inaugurating rock 'n roll with the lyrics:

Good Golly Miss Molly, sure like to ball
When you're rocking and rolling
Can't hear your mama call.

And that is all without considering a lost war in Korea, the tension of the larger Cold War, and the tumult of the Civil Rights revolution. We may think of the 1950s as an age of conformity, purity, and stability, but those who lived through it as adults experienced it as an age of fragmentation, permissiveness, and shattered innocence.[2]

Levin explains why our perception of the era differs so much from the perceptions of the adults who lived through it. We see it as an age of innocence because we see it through the eyes of the Boomers, who experienced this age as children. But his account also helps explain something else—that odd feeling I have whenever I watch YouTube clips of a show like What's My Line. Though products of American pop culture, those shows seem like relics from an alien world, an antique past more different in manners and morals from the America of 2020 than many foreign lands today. However, this eerie feeling of an alien world does not descend upon me when I see a television show from the 1970s. The past may be a different country, but the border is not crossed until we hit 1965.

This observation is not mine alone. In his new book, The Decadent Society: How We Became Victims of Our Own Success, Ross Douthat describes it as a more general feeling, one expressed in many corners on the 30th anniversary of the 1985 blockbuster Back to the Future. The plot of that film revolves around a contemporary teenager whisked back via time machine to the high school of his parents, 30 years earlier. When the film's anniversary hit in 2015, many commented that the same plot could not work today. The 1980s simply seemed far too similar to the 2010s for the juxtaposition to entertain. Douthat explains why this might be so:

A small case study: in the original Back to the Future, Marty McFly invaded his father’s sleep dressed as “Darth Vader from the planet Vulcan.” The joke was that the pop culture of the 1960s and 1970s could be passed off as a genuine alien visitation because it would seem so strange to the ears of a 1950s teen. But thirty years after 1985, the year’s biggest blockbuster was a Star Wars movie about Darth Vader’s grandkid… which was directed by a filmmaker, J. J. Abrams, who was coming off rebooting Star Trek… which was part of a wider cinematic landscape dominated by “presold” comic-book properties developed when the baby boomers were young. A Martina McFly visiting the Reagan-era past from the late 2010s wouldn’t have a Vader/ Vulcan prank to play, because her pop culture and her parents’ pop culture are strikingly the same.... 
Even the exceptions to this rule, the still-creative portions of pop cinema, are often tethered to the boomer era. When big-screen science fiction isn’t just a straight-up eighties-vintage franchise movie— a Star Wars or Star Trek or Alien or Predator— it’s usually a strange multilayered exercise in recursion, like Denis Villeneuve’s Blade Runner: 2049, which trades on a peculiar nostalgia for an eighties dystopia that’s tellingly more technologically proficient than our own, or Steven Spielberg’s Ready Player One, in which the hero’s journey of the future takes place inside a virtual world built from the pop culture that the youthful Spielberg helped create. And then there are still-stranger cases, like the succès de scandale of 2019, Todd Phillips’s Joker, which both its fans and detractors treated as something novel and radical— an upending of superhero clichés in the service of a politically engaged or politically dangerous message of despair or revolution. In reality, the movie was just a competent, handsome imitation of Scorsese’s harsh depictions of 1970s New York, embedded in the endlessly rebooting DC Extended Universe to make it marketable, linked to the problems of 2019 primarily by wishful thinking.... 
The reality of recurrence may be slightly harder for progressives to acknowledge than conservatives, because progressivism is more invested in its supposed position at the vanguard of cultural change, pressing boldly on to new frontiers. This makes it difficult for the left to recognize the generational recycling of its ambitions and anxieties: the fact that many progressive “breakthroughs” are just the culture cycling back to something that we did not that long ago— up to and including kick-ass female action heroes such as Wonder Woman (who followed a path blazed by Sigourney Weaver’s Ripley in the Alien movies, or the robot-wrangling Sarah Connor in the Terminator movies, or even the blaster-wielding Princess Leia in Star Wars forty years ago) or the African American heroes in Black Panther. (In truth, black stars were arguably more important in the years of Eddie Murphy and Richard Pryor and The Cosby Show and the young Denzel Washington than in our officially representation-obsessed age.)[3]

 But Douthat does not just think we are stuck recycling the pop-culture of Boomers past. He sees it too in the broader realm of social values and culture wars:

Famous 1970s-era texts such as Christopher Lasch’s The Culture of Narcissism, Tom Wolfe’s essay “The ‘Me’ Decade and the Third Great Awakening,” and Robert Bellah and his coauthors’ sociology of American religion, Habits of the Heart, seem entirely relevant to American culture today, whereas their equivalents from the 1950s— The Lonely Crowd, say, or The Man in the Gray Flannel Suit— feel like dispatches from a lost world. A book like Ta-Nehisi Coates’s Between the World and Me earned frequent comparisons to James Baldwin’s The Fire Next Time because its indictment of American racism could have been written in 1975 as easily as in 2015. A popular problems-of-feminism Atlantic cover story such as Anne-Marie Slaughter’s “Why Women Still Can’t Have It All” essay from 2012 could have its cultural references tweaked and be dropped into 1978 or 1994 without anyone noticing. 
The same goes on on the right, where Jordan Peterson’s popular tracts against the dangers of postmodernism are fresh and shocking only if you don’t remember the 1980s; if you do, they’re mostly a reminder that it’s been almost forty years since postmodernism was actually radical and new. More generally, the conservative critique of academic liberalism was distilled in the three decades between William F. Buckley’s God and Man at Yale in 1955 and Allan Bloom’s The Closing of the American Mind in 1987, and everything in the three decades since Bloom just recycles or reiterates their points. This is somewhat defensible because the academic politics that conservatives are critiquing keep cycling through the same recurring patterns too, with the campus battles of the 1960s giving way to the PC wars of the 1980s giving way to our own social justice struggle sessions... And, of course, all of these battles are happening on the same elite campuses, the same Very Important Schools as held sway over American higher education and high culture sixty years ago. There is no list more decadent in its stagnation and repetition than the U.S. News & World Report college rankings.[4]
Douthat extends this critique to politics proper.  The white-black wage gap and residential segregation have "neither worsened nor improved" since the 1970s; the male-female wage gap has held flat since the 1980s, and public opinion on abortion "has been remarkably stable" since that same decade. Thus

Against these backdrops, the similarities between the Clarence Thomas Supreme Court confirmation fight in 1991 and the 2018 Brett Kavanaugh confirmation fight, between the current Black Lives Matter moment and the O. J. Simpson– and Rodney King– era debates about police brutality in the mid-1990s, between abortion debates in 1990 and the abortion debate today— even between the sexual scandals of Donald Trump and the sexual scandals of Bill Clinton (albeit with the parties supporting the priapist reversed)— are not coincidental. They reflect what Barzun calls the constant “deadlocks of our time”: the persistent controversies that await some new dispensation to be transcended or resolved. 

That pattern extends beyond culture wars to other ideological debates in American politics, where the left-wing and right-wing coalitions have generally been locked in place since the Reagan Revolution, stalemated not only politically but also intellectually, cycling through the same domestic arguments, the same basic range of issues and ideas. The reason that American conservatives are so persistently nostalgic for the Reagan presidency, now more than thirty years gone, and the reason that liberals remain fascinated with their 1960s-era icons is that so little has changed politically since the upheavals that took place between Jack Kennedy’s assassination in ’63 and Reagan’s ’80 victory.

Or to borrow Mark Steyn’s time-travel conceit from an earlier chapter: most of today’s policy arguments, rhetorical frames, constituencies, and interest groups would all be more recognizable to a time traveler from the early 1980s than the debates of the late 1970s would have been to a voyager from the Depression era arriving in the age of Carter. The overall battle lines have shifted, mostly in the more individualistic direction: the right won some economic victories in the 1980s and 1990s, the left won some cultural victories in the 1990s and 2000s. But many, many arguments (over race, abortion, taxes, welfare) look very much as they did two generations ago.[5]

I have no quarrel with the broad contours of Douthat's argument, and see in it an explanation for the eeriness of 1950s pop culture. Films, television shows, and advertisements from that era were neither marketed at Boomers nor created by them. It was truly a different world. Americans of our age live imprisoned in the world of the Boomers. 

Yet Douthat veers off course when he argues that this sort of generational cultural domination is a novel American experience. To portray the pre-Boomer past as a set of successive 20-year cultural revolutions, Douthat must play funny with the chronology. This can be seen in the passages I've excerpted above, but is most obvious in his discussion of literature:

 When high-end literature was being redefined by James Joyce and Virginia Woolf, F. Scott Fitzgerald and Ernest Hemingway, great novels from just 20 years earlier— Henry James’s The Ambassadors, Edith Wharton’s The House of Mirth— seemed like relics of another age. And twenty years after Hemingway published his war novel For Whom the Bell Tolls, a new war novel, Catch-22, made it seem preposterously antique.

Now try to spot the big, obvious, defining differences between 2012 and 1992. Movies and literature and music have never changed less over a twenty-year period. Lady Gaga has replaced Madonna, Adele has replaced Mariah Carey— both distinctions without a real difference— and Jay-Z and Wilco are still Jay-Z and Wilco. Except for certain details (no Google searches, no e-mail, no cell phones), ambitious fiction from 20 years ago (Doug Coupland’s Generation X, Neal Stephenson’s Snow Crash, Martin Amis’s Time’s Arrow) is in no way dated, and the sensibility and style of Joan Didion’s books from even twenty years before that seem plausibly circa 2012. [6]

There is a sleight of hand in that first paragraph, which jumps from Hemingway's first short stories (1921) to For Whom the Bell Tolls (1940), written some twenty years later. Catch-22 (1961) came a full two generations after Hemingway's first foray into war literature. Hemingway's 1954 Nobel Prize specifically cited work he had penned ten years after he wrote For Whom the Bell Tolls. By the time Joseph Heller came on the scene, Hemingway and modernism had ruled America's literary roosts for a full five decades.

Yet it was not just Hemingway. The household "high literature" poets of the 1950s (Wallace Stevens, T.S. Eliot, W.H. Auden, E.E. Cummings, Robert Frost) were members of Hemingway's lost generation. The same was true of the leading lights of the Harlem Renaissance, like Langston Hughes and Richard Wright, who still ruled the world of black arts and letters. In the 1950s, Faulkner and Salinger carried the torch of the self-consciously modernist novel, an innovation of the 1910s, forward into the second half of the 20th century; across the pond, novelists like Robert Graves and "Inklings" like Lewis and Tolkien assured that the artillery of Flanders fields would echo another generation longer. The three largest-circulation magazines in America—the Luce productions Time, Life, and Fortune—were all born in the twenties or thirties, as were the slightly higher-brow but only slightly less famous Vogue and New Yorker. The monthlies Harper's and the Atlantic Monthly were far older than that. But Harper's was edited by Frederick Lewis Allen, perhaps the most popular Harper's writer of the twenties and thirties, while something similar could be said for the Atlantic Monthly's editor Edward Weeks, who began editing that flagship of American life all the way back in 1938. America's most prominent intellectuals (think Reinhold Niebuhr and his circles) all cut their teeth writing responses to the great 'isms' of the early 20th century; the European emigres (Strauss, Arendt, Marcuse, etc.) then colonizing American universities fled these same 'isms' in Europe, and carried their old interwar quarrels with them to America.

But there is more! Douthat questions why we think post-modernism subversive so many decades after its invention; a Ross Douthat of 1959 would wonder why his fellow intellectuals still bandied about Freud as if he were a living danger. The Douthat of 2019 wonders why American political thought is stuck in the 1980s; the Douthat of 1959 would have traced the links that tie the technocratic Hooverism of the twenties with the technocratic liberalism of his own era. The Douthat of 2019 wonders why sexual ethics and fashion styles changed so drastically in the sixties and seventies but have been kept in stasis since; the Douthat of the 1950s would have marveled at the incredible changes in fashion and sentiment that occurred between 1905 and 1925 and wondered where all of the turn-of-the-century energy had disappeared to. 

This Douthat would tell us how women who had worn no make up at all and five layers of dress in the 1910s wore but one or two layers of dress (and several layers of make up) in the twenties—just as their daughters would in the fifties. He would relate how, after hemlines rose for a decade, they finally stabilized in the thirties—and stayed stable through 1960. Short haircuts shocked the world in 1921; how funny that they remained in vogue (if slightly different in style) in 1961. An American of 1955 wearing standard business attire could have walked into an office of 1925 without causing much drama; if a man of 1935 walked with his suit and hat into an office of 1905, he would be received as a visitor from a different country.

My argument, then, is that though "today's policy arguments, rhetorical frames, constituencies, and interest groups would all be more recognizable to a time traveler from the early 1980s than the debates of the late 1970s would have been to a voyager from the Depression era arriving in the age of Carter," a voyager from 1925 would have had little trouble adjusting to the issues of 1955.

Such a traveler would get why Reds caused Scares and why the United Nations brought hope. He might be surprised at the success of the Civil Rights movement or the strength of the new economic orthodoxy, but he would certainly understand both as logical developments of very familiar forces. The controversies swirling around rock'n'roll would remind him of earlier disdain for jazz; the drive towards suburban living would remind him that the first American suburbs were built in the '20s. The election of America's first Catholic president would recall the nomination of America's first Catholic presidential candidate. Chicago, New York, and the Deep South were all still controlled by familiar political machines. The South was still Solid, and New England was still the heartland of the Republican vote. America's political and economic elite were still Waspish Northeasterners. The exceptions—rising Hollywood moguls, Detroit CEOs, and Texas oilmen—all helmed industries that began their upward climb in the 1920s. Our twenties man would sympathize with a public gone crazy with consumerism. He would likely share the fifties obsession with style, glam, and smoking. The emptiness of the company man and the struggle of postwar life would all make great sense to him; he would read The Lonely Crowd and watch The Man in the Gray Flannel Suit and nod along. He would understand the quiet crisis in meaning of the 20th century American boom.

 If the politicians and public intellectuals of the 1950s were a more sober and self-consciously responsible bunch than the leading lights of the twenties, this was only because they had been chastened by experience. The disillusioned and immature avant-garde of the twenties was the disillusioned and mature old guard of the fifties. Maturity had been forced on them by events, but their disillusionment had not changed. Public life still dwelt in the shadow of the Great War. No one believed that the old order that dissolved in 1914 had been reconstituted or replaced. The liberal elite felt, as they had thirty years earlier, that religious faith had been hollowed out by advances in science. That religious passion had been watered down by the demands of pluralism. That politics had been reduced to bland technocracy. That ideology was dead. That individuality was stripped away by specialization and corporate hierarchy. That booze, glamor, and dollar bills were the only salves left to the modern soul. 

The terrible toll of Great Depression and a second World War had taught them to accept that deal. They would defend liberalism despite their disillusion. They would protect the standing order, no matter how hollow they believed it. The alternatives were simply too grim—and the boons of the boom were simply too great. In many ways the 1950s is what you get when you take the uneven affluence of the 1920s and extend the boom out to the rest of the country. It was the prosperous disquiet of the Jazz Age delivered to Tallahassee and Pasadena. Had there been no Depression or war, it is quite possible the youth of the '30s would have created a counterculture to match that of the '60s. Were the story of American culture told by that generation, perhaps we would stereotype the 1920s, not the 1950s, as America's decade of innocence.

But we don't see the 1920s through the eyes of its children. The story of the 1920s is told to us through the eyes of a different generation, the generation that experienced the twenties in their twenties. We see the First World War, the Roaring Twenties, the Great Depression, and the Second World War largely through the experiences and stories of the Lost Generation.[7] Our understanding of the 20th century's first half is just as colored by that generation's preoccupations as our understanding of the century's second half is by the Boomers'.

Not every generation is so lucky. Ross Douthat and Yuval Levin, Gen-Xers both, belong to one of the unlucky cohorts crowded out by the voices of their elders. But in this they are hardly unique. Hear the lament of Daniel Bell (b. 1919) in the 1950s, frustrated at the cultural hegemony of the generation called Lost:

It is difficult for me to know if I am, or am not, of the "young generation." I came to political awareness in the Depression and joined the Young People's Socialist League in 1932, at the precocious age of thirteen. At the age of fifteen I was writing resolutions on the "road to power." At C.C.N.Y., in the late thirties, I was already a veteran of many factional wars. Since graduating, in 1938, I have worked for twenty years, half my life, as a writer or teacher — a respectable period, yet whenever biographical details are printed, I am, almost inescapably, referred to as a young American sociologist, or a young American writer. And so are others of my generation of the same age or slightly older. To take some random examples: Harvey Swados, now thirty-nine, is still called a promising "young" writer although he has published three novels; Richard Hofstadter, who, at the age of forty-two, has published four first-rate historical interpretations, is called a young American scholar; James Wechsler, over forty, a young editor; Saul Bellow, over forty, a young American novelist; Leslie Fiedler, aged forty-three, a young American critic; Alfred Kazin, aged forty-four, a young American critic, etc., etc...
But, beyond the general change in the tone of the culture, there is a more specific reason why the college generation of "the thirties" has been, until now, at bay. This is because those who dominated "the thirties" were young themselves when they became established, and, until recently, have held major sway in the culture. 
The Partisan Review, for example, is twenty-three years old, yet its editors, William Phillips and Philip Rahv, are not "old" men (say, fifty, give or take a year). Our intellectual Nestors—Lionel Trilling, Sidney Hook, Edmund Wilson, Reinhold Niebuhr, John Dos Passos, Newton Arvin, F. W. Dupee, James T. Farrell, Richard Wright, Max Lerner, Elliott Cohen—were in their late twenties and early thirties when they made their mark as a new generation. The reason why there has been no revolt against them, as they, in asserting a radical politics, had ousted their elders, is that they led their own "counter-revolt." They had both Iliad and Odyssey, were iconistic and iconoclastic. They were intense, hortatory, naive, simplistic, and passionate, but, after the Moscow Trials and the Soviet-Nazi pact, disenchanted and reflective; and from them and their experiences we have inherited the key terms which dominate discourse today: irony, paradox, ambiguity, and complexity.
Curiously, though they—and we—are sadder and perhaps wiser than the first political generations of the century, we are not better or greater.[8]

They are not better or greater! How bitter! How blinded! Bell's judgment has aged poorly. Bell identifies a handful of the thinkers and writers of the "first political generation" that he champions. John Dewey, Thorstein Veblen, and Charles Beard are the only men of this handful who are remembered today. Even they have fallen off of college syllabi, studied only by specialists in their era. And the great poets, novelists, essayists, philosophers, economists, and theologians of the Lost Generation? They are still read and treasured. They lasted. It is no accident that we remember their times through their eyes. Bell resents the misfortune of living in their shadow too much to acknowledge their greatness. But their merit was plain to see for those with open eyes.

Patterns past mirror futures coming. One suspects that the critics of future centuries will find more grace in the Boomers than Levin or Douthat grant them. But there is a happy note in repetition. If Daniel Bell could call ideology dead, politics stale, and culture stuck in 1960, then perhaps Douthat's repetition of these same claims marks a similar moment. As the Lost Generation faded from the scene in the 1950s, so the Boomers fade away now. Some other generation will define the terms of the 2020s: and in that reality, there is hope for renewal.

If this post on inter-generational intellectual history caught your interest, you might consider some of my older posts on similar problems: "On the Tolkienic Hero," "Book Notes: On Strategy, a History," "On Adding Phrases to the Language," "On Cultures That Build," "A Tour Through Three Centuries of American Political Culture," and "On Sparks Before the Prairie Fire." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.


[1] Yuval Levin, The Fractured Republic: Renewing America's Social Contract in the Age of Individualism (New York: Basic Books, 2017), 24-28. 

[2] A concise but convincing depiction of this is found in Brink Lindsey, The Age of Abundance: How Prosperity Transformed American Politics and Culture (New York: HarperCollins, 2007), 93-129. The theme is pursued at far greater length in Alan Petigny, The Permissive Society: America, 1941-1965 (Cambridge: Cambridge University Press, 2009), which I have not read. Jesse Walker's review of the book ("Beyond Pleasantville," Reason, January 2010) comments:

The trouble is, the apparently alien influences that gradually infect Pleasantville don't hail from the future. The townspeople encounter J.D. Salinger and D.H. Lawrence, civil rights oratory and modern art; on the soundtrack, we hear the rockabilly of Gene Vincent and Elvis Presley, the jazz of Miles Davis and Dave Brubeck, the soulful blues of Etta James. None of those were imported from the '90s. All were available, and in many cases created, in the '50s and early '60s, the very period that produced the sitcoms lampooned in the film. Pleasantville doesn't contrast the repressed '50s with the liberated '90s. It contrasts the faux '50s of our TV-fueled nostalgia with the social ferment that was actually taking place while those sanitized shows first aired.

Alan Petigny, a reporter turned historian who teaches at the University of Florida, examines how deep that ferment went in The Permissive Society, an important new study of the postwar period. The Truman and Eisenhower eras, he writes, were marked by "an unprecedented challenge to traditional moral restraints." Petigny isn't referring to a bohemian subculture or to rock 'n' roll rebellion: There are only a few scattered references to beatniks in this book, and its discussion of pop music devotes more space to Pat Boone than to Elvis Presley. Petigny is talking about the great American middle, whose values in areas ranging from child rearing to religious piety underwent a rapid and radical change long before the love-ins.

[3]   Ross Douthat, The Decadent Society: How We Became the Victims of Our Own Success (New York: Simon & Schuster, 2020), 90-91, 94.

[4] ibid., 96-98.

[5] ibid., 99-100.

[6] ibid., 92.

[7] This is not even a uniquely 20th century phenomenon. Why do the political struggles and passions of the 1870s, 80s, and 90s get flattened out of our historical narratives today? Why is it when we think of the Gilded Age we think not of the great partisan controversies that divided the country, but only of industrial titans, wealthy excess, grand strikes, and immigration waves? It is not because the politics of the day did not matter; to the voters and politicians of the day they mattered a lot. It is instead because our narrative of these decades is completely contorted by the generation of Progressives who swept the nation in the 1900s and 1910s. That generation justified the expansive federal programs and reforms on the grounds that the politics of decades past had been a distraction, utterly incapable of dealing with the real issues spawned by industrialization, urbanization, and mass immigration. Those "real" issues are what the progressive generation remembered most about the era, even if they were not seen as the most important issues for the majority of the people who lived through it. As with the 1950s, we understand these times through a narrow, generational lens.

[8] Daniel Bell, The End of Ideology: On the Exhaustion of Political Ideas in the Fifties (New York: Free Press, 1960), 299-300.

25 October, 2020

Rethink What You Know About Xi's Belt and Road


Countries of the Belt and Road.

Earlier this month I wrote:

I wish fewer analysts asked, "What did Xi hope to accomplish by creating the Belt and Road?" and instead wondered, "What did Xi hope to accomplish by associating the SOE infrastructure-industrial complex so closely with his personal foreign policy?" [1]

This question follows naturally from the arguments of many smart observers and analysts. My understanding of Belt and Road realities builds directly from work done by Lee Jones, Min Ye, Zeng Jinghan, Baogang He, Matt Schrader, Cristina Constantinescu, Michele Ruta, Andrew Batson, Mark Akpaninyie, Shahar Hameiri, and Yuen Yuen Ang. [2] If the standard image of the Belt and Road is of a highly centralized, carefully orchestrated geoeconomic strategy designed to increase Beijing's strategic leverage and build the foundation of a new Chinese world order, the picture that emerges from these studies is different. The initiative seems to be more marketing strategy than power play. Many of the most prominent projects were announced, financed, or began construction before Xi ever uttered the words "One Belt, One Road"; the Ministry of Foreign Affairs has almost no influence over project selection or development. SOEs and state policy banks (the main players in China's "infrastructure-industrial complex") are in the driver's seat. Researchers who have interviewed SOE project managers conclude that profit, not geopolitical importance, decides investment and construction decisions. Unsurprisingly, there is almost no correlation between the high-level strategic guidance that is supposed to determine outbound investment priorities and actual flows of money. It is quite likely that every single BRI-branded project now under construction would still be under construction had it never been BRI-branded. That branding has less to do with Politburo guidance (much less explicit plans to ensnare poor countries in "debt-traps") than it does with racking up loyalty points with Big Uncle Xi.

I believe the evidence for this picture is unassailable. But if you believe, as I now do, that the decentralized tangle of companies, loans, and projects that we call "the Belt and Road" was not created or directed by Xi Jinping or his Central Committee, you are still left with a question: if the Belt and Road can be boiled down to branding, why did Xi Jinping decide to make this branding the cornerstone of his foreign policy? If it all really is a marketing strategy, just what was Xi trying to market? Maybe the sales pitch is a power play?

I take a stab at these questions in a new essay I have published over at Palladium. I suggest that

There are two apparent rationales for Xi’s decision to claim the globalization of the SOE infrastructure-industrial complex as his personal brainchild. The first is that Xi hoped that this framing might shape the contours of future outbound investment and construction, bending them towards his personal diplomatic priorities.[3]

 These priorities include the "six economic corridors" and other infrastructure in geopolitically important locales. This more or less accords with the standard understanding of the BRI. I note, however, that Xi's directions on this front are limited by the tools he has to do the directing:

Reducing complex political processes down to bland slogans (“belt and road”), numbered lists (“six major economic corridors”), and overly broad policy guidelines (“give priority to projects of strategic importance”) is a strategy of control that Chinese communist leaders often turn to. They lead a party-state whose members number in the tens of millions; most individuals working for it find themselves subject to overlapping, and sometimes conflicting, lines of authority. True centralized direction of this morass is not possible. To form order from chaos, party leaders rely on propaganda and sloganeering to communicate directly to the cadre on the scene. Party leaders dispense with detailed directives that foresee every contingency in hope that cadres will grasp the principles of the party’s guiding ideology and then develop their own path for implementing these principles in their unique situation. When Xi declares that “whether we succeed in our pursuit of peaceful development to a large extent depends on whether we can turn opportunities in the rest of the world into China’s opportunities,” he does so knowing that his statement will be repeated and reprinted in state publications that diplomats, SOE managers, military officers, and party bureaucrats must read. It is their responsibility to turn broad and bland platitudes into individual plans of action.[3]

But there is a catch: this method of control works best for shaping the behavior of cadres who do not face other conflicting incentives. The problem with the BRI is that the "other conflicting incentives" are very clear: striking it rich on the one hand, drowning in a sea of red ink on the other. This goes a long way towards explaining the mismatch between centralized direction and the initiative's unplanned, chaotic reality.

The other rationale behind Xi's marketing strategy is more interesting:

 Loudly calling attention to SOE projects and investment abroad allows Xi to subvert hostile narratives surrounding China’s rise. Christening foreign development projects as the central plank of Xi’s grand strategy was an attempt to legitimize China’s return to superpower status. “Promoting BRI, boosting win-win cooperation between China and other countries, and pursuing common development,” Xi informs his diplomats, means “tell[ing] the world China’s success stories, [and through these means] promot[ing] mutual understanding and friendship between China and other countries.”

The legitimizing mission of the Belt and Road is seen in Xi’s invitation to “welcome others to join China’s express train of development.” This was an intentional bid for prestige: China was openly offering “a new trail for other developing countries to achieve modernization” on terms not set by the West. Xi made this clear in 2017 when he declared that “the banner of socialism with Chinese characteristics is now flying high and proud for all to see,” and that the party was

Blazing a new trail for other developing countries to achieve modernization. [The Chinese example] offers a new option for other countries and nations who want to speed up their development while preserving their independence; and it offers Chinese wisdom and a Chinese approach to solving the problems facing mankind.

This sort of rhetoric changed the terms of SOE engagement with outside clients. What would have been understood as business deals between individual SOEs and governments who purchased their services have been transformed into diplomatic endorsements of the “Chinese approach to solving the problems facing mankind.” Each Memorandum of Understanding signed by a foreign government would legitimize the Chinese development path—and the Chinese Communists who pioneered it. [4]

But this legitimizing aspect of the Belt and Road branding is a double-edged sword. If all the good individual Chinese companies did would now be credited to the Communist Party and its development model... then all the bad they did would be credited to the Party as well! As I argue:

Transforming SOEs’ cash-cow projects into handmaidens of China’s national rejuvenation had unanticipated consequences. Americans who might have dismissed these projects as an ad-hoc series of bilateral investment agreements now saw them as a challenge to America’s global leadership. Reactions in Tokyo and New Delhi were just as hysterical, and from the Chinese perspective, just as preventable. The natural reaction of the world’s dragon-slayers to BRI publicity was to hunt for BRI projects they might discredit—a task made far easier by the consequences of Xi’s branding campaign.

In the words of one economist, Xi’s decision to associate SOE firm strategy with his personal diplomatic brilliance “gave the SOE infrastructure-complex carte blanche to pursue whatever projects they [could] get away with.” Poor investments that would have once drawn criticism, or at least extra scrutiny, by observers in China were now given a free pass, as few Chinese would risk tarring an initiative the General Secretary had invested so much of his personal prestige into. Outside China, in contrast, critics would now credit sloppiness or malfeasance not to the failings of individual SOEs or financial consortiums, but to the malevolence of the Chinese government. Anything that went wrong with any project would now be laid directly at the feet of Xi Jinping.

Thus, the long string of BRI-related incidents that have elevated what were essentially commercial or financial disputes into crises in the diplomatic relationship between China and various BRI host countries....

There is little evidence that Beijing ever intended any of its projects to become debt-traps, nor that it would even have the ability for this level of central strategic action. Yet haphazard project selection was an inevitable outcome of Xi’s decision to make the SOEs and policy banks—domestic actors that face no incentive to take the party’s long-term foreign policy priorities seriously—the foundation of his grand strategy.[5]

For these reasons I judge the Belt and Road a failure, a strategy doomed from conception to cost Chinese diplomacy more than it has gained:

Branding this inheritance a central plank of Chinese foreign relations was his mistake. The actors were too various, and their relationships too complex, for a simple system of centralized control. This left Xi with little choice but to rely on the propaganda and ideology apparatus to try giving the Belt and Road strategic direction. These tactics for guiding cadre behavior are powerful in the absence of other incentives—but when there were billions of dollars to be made, the siren call of those other incentives sounded far louder than the exhortations of Xi Jinping Thought. The irony is that foreigners did pay attention to Xi’s exhortations. Xi then found himself paying costs for a strategy he could proclaim but could not implement. [6]

I have only reproduced a part of my essay here—I encourage you to head over to Palladium and read the full thing.

Readers interested in exploring more of my writing on Xi Jinping or his Belt and Road may find the posts "The Utterly Dysfunctional Belt and Road," "Xi Jinping and the Laws of History," "The World That China Wants," and "Two Case Studies in Communist Insecurity" of interest. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

 [1] Tanner Greer, "Counting Speeches to Understand Xi Jinping," Scholar's Stage (12 October 2020).

[2] Lee Jones and Shahar Hameiri, “Debunking the Myth of ‘Debt-trap Diplomacy’: How Recipient Countries Shape China’s Belt and Road Initiative,” Chatham House Research Paper (August 2020); Min Ye, The Belt Road and Beyond: State-Mobilized Globalization in China 1998–2018 (Cambridge: Cambridge University Press, 2020); Andrew Batson, “The Belt and Road is about domestic interest groups, not development,” Andrew Batson’s Blog (2 May 2019); Mark Akpaninyie, “China’s ‘Debt Diplomacy’ Is a Misnomer. Call It ‘Crony Diplomacy,’” The Diplomat (12 March 2019); Lee Jones and Zeng Jinghan, “Understanding China’s ‘Belt and Road Initiative’: Beyond ‘Grand Strategy’ to a State Transformation Analysis,” Third World Quarterly (2019); Baogang He, “The Domestic Politics of the Belt and Road Initiative and its Implications,” Journal of Contemporary China, vol 28, iss 16 (2019); Yuen Yuen Ang, “Demystifying Belt and Road,” Foreign Affairs (22 May 2019); Cristina Constantinescu and Michele Ruta, “How Old is the Belt and Road Initiative?” MTI Practice Notes No. 6 (December 2018); Matt Schrader, “World Bank Offers Timely, Dubious Praise for Belt and Road,” Jamestown China Brief (20 November 2018).

[3] Tanner Greer, "The Belt and Road Has Backfired on Xi," Palladium (24 October 2020).

[4] ibid.

[5]  ibid.

[6]  ibid.