31 March, 2020

Bullet Reviews: A Bunch of Books on Epidemic and Disaster Response

As February turned to March I realized I needed a better understanding of epidemics and disaster response. It was clear to me then that the coronavirus was going to blow up in my own country, that I was going to be voicing opinions about it, and that in consequence I had a responsibility to inform myself as well as I could within the constraints of my budget and schedule. I wanted a stronger grounding in the history and past examples of American disaster response and the basics of epidemiology. Towards that end I bought about ten books, seven of which I have now finished. I do not have time to review them at any length, but I can provide some capsule reviews for people who are interested in reading more on these topics themselves.

Christian McMillen's Pandemics: A Very Short Introduction and Marta Wayne and Benjamin Bolker's Infectious Disease: A Very Short Introduction are both excellent little primers. I am an unabashed fan of the Very Short Introduction series. Their basic idea is to find a noted expert in topic X and have them write an accessible-yet-intelligent 100-150 page introduction to their topic of expertise. Many journalists and commentators who spend several hours trawling Wikipedia whenever a new topic hits the news cycle would be far better served by picking up a $6 Kindle edition of the relevant VSI volume instead. These two books are oddly complementary: McMillen is a historian, and his Very Short Introduction is focused on the social history of past pandemics. Wayne and Bolker are an ecologist and a geneticist, respectively, and their focus is on modeling the dynamics of disease growth. McMillen devotes chapters to the bubonic plague, smallpox, cholera, malaria, tuberculosis, influenza, and AIDS; Wayne and Bolker also provide 20-page summaries of various diseases, their case studies being influenza, HIV, cholera, malaria, and Bd (the fungal disease wiping out many of the world's amphibian populations). Together the two books provide a solid introduction to how various types of diseases work and the history of human attempts to treat or contain them.

Nancy Bristow's American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic is written in a dry academic style: it is a history written by a historian for other historians. The book goes all-in on the social justice/critical theory framing common to 21st century historians; this will bother some readers, but the underlying material is interesting enough that they should probably soldier through it. My take on this sort of framing is that it is not too different from the outrageous things 19th century historians would scaffold their historical research with, and in neither case should the silliness of the scaffolding detract from the quality of the research underneath. Bristow's book is not a chronological narrative account of the pandemic. Rather, she keys in on select groups and tries to reconstruct what they thought and felt about infectious disease before, during, and after the Spanish Flu blew through. Thus she has one chapter focusing on the way public health authorities understood the disease and their role in treating it, another on the different reactions that nurses and doctors had to the epidemic, and so forth. I do not recommend this one to all readers; I think I will do a longer "passages I highlighted" post for it in a week or two that will present the parts most relevant to the current crisis.

David Randall's Black Death at the Golden Gate is a fantastically readable book that describes the bubonic plague outbreak that swept San Francisco from 1900 to 1907. Randall tells this story from the perspective of the two Public Health Service officers tasked with containing it. Everything about this book is spectacularly well done. Randall is able to capture what the San Francisco of 1900 felt like, provide a fascinating picture of the Public Health Service and bacteriology in the early 1900s, and narrate the course of the plague and its containment in an almost novelistic fashion. There has been a lot of hubbub over whether today's pandemic was the product of autocracy; this book is an antidote to that view, narrating in great detail how California's democratically elected politicians did their utmost to hide the plague in their largest city and derail all attempts to face the problem head on. It is also a useful case study in the art of getting things done that I will be reflecting on for some time. I strongly recommend it to all readers.

Lee Clarke's Worst Cases: Terror and Catastrophe in the Popular Imagination is by far the most disappointing book on this list. Disaster sociology is a thriving subfield, and Lee Clarke is one of its brighter lights. Unfortunately he is a scattered writer and his book is a poorly organized mess. Every chapter is an attempt to summarize an important idea in disaster sociology or risk planning. Even when I agreed with Clarke's take (which was most of the time), I was appalled at how poorly worded and loosely argued it was. You are much better off reading his academic papers, which cover most of the same ground in better prose and with tighter arguments.

Rebecca Solnit's A Paradise Built in Hell: The Extraordinary Communities That Arise in Disaster is another book I strongly recommend, but with caveats. Solnit's central argument is that in the wake of disaster common people do not "panic," nor do they turn into selfish beasts of the Hobbesian sort. Panic, looting, chaos, and violence are rare in almost all disasters; whether the disaster be a pandemic or a plane crash, collective suffering usually turns survivors into a community of the selfless. This is not a thesis unique to Solnit. It is actually the central claim of some 70 years of disaster sociology research. But Solnit wants to use these truths towards very specific political ends. Solnit is a left anarchist, and she sees in disaster communities something like her ideal utopia. Her animus towards central authorities who do dumb and dangerous things to reinstate "order" and control "panic" in the wake of disaster is justified; her unrelenting hatred of capitalist markets is bonkers and detracts from her argument. (E.g., in the 80 pages she devotes to Hurricane Katrina you will not hear much about Walmart's response to the storm, even though Walmart was arguably the most effective actor both before and after the hurricane made landfall.)

As the book was written in the Bush years, Solnit also does not miss a single opportunity to snipe at that administration. This sniping is wonderfully well crafted. Solnit is a writer that writers love to read. For Solnit the essay is an art form; she is as committed to this art as she is to her political beliefs. For some people this will diminish the book's argument. Solnit is allergic to statistics and refuses to reduce events to quantified metrics, even though the sociology research she is building off of is chock full of them. This is probably a stylistic tic (numbers are ugly), but there may be an ideological element to it as well (like capitalist markets, numbers are depersonalizing). If you are familiar with the underlying research, Solnit's numberless approach will not bother you. But if you are a data head who expects a rigorous argument instead of a beautiful one, you may be frustrated with Solnit's style.

Amanda Ripley's The Unthinkable: Who Survives When Disaster Strikes is the least immediately relevant of these books to the pandemic at hand. Ripley focuses on disasters with short time scales where quick action saves lives: terrorist attacks, plane crashes, school shootings, earthquakes, avalanches, and so forth. It is a useful complement to Solnit's book, however, as it reinforces how rare "panic" and other dastardly behavior is at the scene of disaster. Far more common than panic is paralysis. Ripley's question: who keeps moving and who does not? She finds her answer through interviews with hundreds of disaster survivors. This book is not rigorous in the strict scientific sense, but I think the stories Ripley has collected are useful nonetheless. Ripley's book is also quite readable, though this has more to do with the inherently fascinating material she presents than with the literary technique she employs.

If this post on books is your sort of thing, you might also like the posts "Pining for Democracy: A Few Readings" and "Making Sense of Chinese History: A Reading List." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

17 March, 2020

Conservatism's Generational Civil War


I have a new essay out in National Review which extends some of yesterday's thoughts on the limits and attractions of "common good" conservatism to a new topic: the generational divide that now runs through the American right. The Sanders/Biden primary has drawn attention to the parallel phenomenon on the left, and much (probably too much) has been written about the origins of the left-wing generation gap. Far less has been written about the same fracture on the right—even though this generation gap is central to the larger story of American conservatism's current intellectual civil war.

My piece is a formal response to Yuval Levin and Ramesh Ponnuru's recent National Review print piece, "The Next Coalition of the Right." As I explain:
Their essay is a postmortem of sorts: It sets forth an explanation for why the reformocon attempts to redefine the conservative agenda failed, and seeks to draw lessons from this failure for the future. The trouble: Levin and Ponnuru have learned the wrong lessons from the fall of their movement. At the zenith of the reformocon moment, reformocons were fiercely critical of the Republican establishment for mistaking the problems of the present with the problems of the past. Now the mandala has turned: Today it is the reformocons themselves who are trapped in the lens of a generation out of date.[1]
The reformocons—a portmanteau for "reform conservatives" that has been in use for several years now—were the subject of two dozen glossy magazine profiles between 2012 and 2016. Their ideology revolved around four planks:
  1. The need to reorient conservatism around working class interests
  2. An argument for decentralizing American politics, ending culture war controversies and economic wrangles by removing these issues from the purview of the federal government
  3. Policy reforms that unapologetically had an increase in family formation and child-rearing as their goal
  4. A commitment to gradualist, pragmatic, and wonkish policy solutions to the problems of the day.
In the piece I describe these planks and the ideology that holds them together at some length. I encourage you to read the full account there, but for the purposes of this post it is enough to note that the reformocon movement was dead on arrival. How extraordinary! In this moment when conservatives are engaged in the most serious war of ideas they have had since the '60s, the reform conservative movement disappears! And this has happened despite the fact that many of its central concerns (say, working class Americans) are at the center of current conservative debate, and that their diagnosis of what would happen to American society if their policies were not adopted has been absolutely vindicated by actual events! What happened?

I ask this question in my essay; here I will ask an even darker version of it. Why is it that all of the young conservatives I have met in the past three months have read Bronze Age Mindset, but none have read either of Yuval Levin's two books? Why is BAP selling more volumes than the lot of the reformocon wonks combined? Why will the American Mind's round-up of response essays be read more, and prompt more serious conversation, than all of the responses written to Levin and Ponnuru's essay (mine included) ever will? The question is not just "why did the reformocons lose the battle of ideas?" but "why are they losing the battle of ideas to undisguised fascism?"

This should disquiet. If you are an American conservative who believes in statements like "all men are created equal" or "all men are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness" you should be disquieted. These ideas are losing, and their loss is only tenuously related to the election of Donald Trump.

This is Levin and Ponnuru's first problem: they believe that the election of Donald Trump was the event that ended their movement. From the perspective of a millennial or zoomer on the right, a different story can be told:
From the perspective of the young conservative, the defining event of the last decade was not the election of Donald Trump but the revolution in morals and manners now dubbed the “Great Awokening.” This secular revival has blessed its adherents with a scheme of ethics, aesthetics, eschatology, and soteriology all their own. Essayist Wesley Yang’s thumbnail sketch describes the new dogmas well: The awokened “metastasize a complex and rebarbative set of critiques of power into an active parapolitical program seeking to transform the world along a sweepingly utopian line” that overthrows all orders, hierarchies, laws, and norms that stand between the privileged and the justice the awokened believe they deserve. The zeal of the converted has carried these notions far; the system of ritualized language and punitive surveillance pioneered by young leftists has carried it even further. Its reach now extends well past the realm of the fervent faithful. Few in young America are untouched. Even those who have never formally studied the doctrines of the Great Awokening echo its view of truth, virtue, and evil. It is the default ethos of America’s future — and for the young, America’s present.

It is worth emphasizing that the stunning advance of the woke had very little to do with the federal government. Barack Obama was not the author of the Great Awokening; the former president was a liberal of the old sort, a man who believed himself the living incarnation of the American creed. He was left frustrated and mystified by a generation of young progressives who had left behind their — and his — ancestral faith. No government forced them to leave. The agents of the Awokening made their case the old-fashioned way: In lectures, essays, and op-eds, they convinced; in newspaper headlines, music videos, and YouTube montages, they suggested; in campus protests and corporate HR codes, they coerced. But only rarely was this a matter of state coercion. Social pressure, not federal tyranny, keeps the young woke.
Older conservatives are well placed to understand why this has happened. For decades they have been predicting that a people unmoored from tradition and community will throw themselves at the first totalizing ideology to come along that promises to give their lives a shred of purpose and meaning. For decades they have warned that the gradual secularization of American society, slow collapse in American social capital, and incessant attacks on America’s heritage would produce such a people. As they foresaw, so it has been. What these older conservatives struggle to understand is that the future they imagined does not just describe the world the young conservative has inherited — it describes the young conservatives themselves. The young conservative knows enough to reject the woke vision of the common good. But for what? The young conservative has no answer to this question. Indeed, he is not really a “conservative” at all, if by that we mean someone intent on conserving inherited values, traditions, or culture. None of those things are part of his inheritance. He feels their loss. He too is desperate for something that promises to imbue his life with a shred of purpose or meaning.
What could the reformocon platform ever offer such a person?
In the face of Barack Obama’s political program, conservative debates revolved around an urgent, yet very specific, question: “What must be done to keep the federal government from interfering with our way of life?” The reformocon platform was a laudable attempt to answer this question. It provided tools to keep the federal government at bay in a form that a wide swath of Americans — including those who did not consider themselves conservatives — could endorse. But the problem posed by the Great Awokening to the American Right is more urgent and more fundamental. The central question that absorbs the young conservative is not “How do I stop the government from interfering with my way of life?” but “What should my way of life be in the first place?” [2]
Bronze Age Mindset has an answer to that question. So do the Catholic integralists. So do the Benedict Optioners, the Intellectual Dark Web types, the rationalists, and the Thiel-esque techno-futurists. They
are oriented toward resisting not leftist politics but leftist culture. The story of next-generation conservatism, in other words, will be the story of a counterculture. Debates over what shape that counterculture should take cannot be resolved by a more “disciplined” policy environment.
Little wonder then that the reformocon vision of the future struggled to take hold! Reformocons argued for the centrality of community without endorsing any concrete vision of communal life. They described the need to build new institutions without committing themselves to any specific institutions. They authored wonkish proposals to strengthen family formation but painted no picture of families worth forming. The visions of the reformocons were colorless and empty. This was by design: Like a coloring book, every community and family could fill out the pre-printed designs with whatever color palette they treasured most. That worked when conservatives had an organic set of treasured traditions, values, and relationships to fill the blanks in with. Now they do not, and the reformocon platform is found wanting.[3]
Some reformocon types were a bit better about this than others. In the essay I highlight Michael Lotus and James Bennett's America 3.0 as one of the best books written in the reformocon tradition, even though neither author ever identified as a reformocon (I reviewed it here, with high praise). Lotus and Bennett open their book by imagining what a community of Americans in 2040 would look like if their vision were implemented. But that vision took thirty pages to describe. It is very hard to boil those thirty pages down to a slogan, a chant, or a meme. But of course this is exactly what a successful political movement requires.

The movements I described earlier—those popular with millennial and zoomer intellectual types—have done more than this. They have created an entire aesthetic language to paint their vision of the good. This is why trad accounts spend so much time retweeting images of old French villages and classical architecture; they understand that ideals are felt, not argued for. They do not promote ideas. They promote an ethos. If defenders of America's heritage cannot develop a vision of the good, an ethos, as compelling as those on offer elsewhere, and find concrete ways to show others this vision at the emotional level, then they too will dwindle into obscurity.

If this post on American conservatism and the millennial crisis in meaning was worth reading, you might consider some of my older posts on the problem: "Questing for Transcendence," "On the American Football Game," and "Tradition is Smarter Than You Are." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Tanner Greer, "Learning the Wrong Lessons From Reform Conservatism," National Review (17 March 2020).

[2] Ibid.

[3] Ibid.

16 March, 2020

Porn Restriction for Realists

A screenshot of a "Girls Do Porn" video uploaded by a pseudonymous user. The producers of "Girls Do Porn" were charged with sex trafficking in an American court, but these videos are still uploaded to Pornhub by anonymous users every day (I found this video after about 30 seconds of googling and took a screenshot of it on 15 March 2020).
Since returning to the United States several months ago I have spent a great deal of time interacting with American conservatives, especially young American conservatives. They are at the center of the intellectual civil war that now divides the American right. These divides and debates are important. I wager that what the right is going through now parallels what it went through in the '60s and '70s. How these debates are decided will shape the future of the right as a political and cultural force for decades to come. I have spent a great deal of time thinking about what these things might mean for the future.

Over the next few months my thoughts will be laid out in a series of different essays and articles. Most will be published not on this blog but in various magazines and dailies. One of the over-arching themes of this series is that the debates conservatives are having are not really about the things they claim to be debating. One of the most obvious examples of this has been the four-month-long spat over banning porn. I am not at all convinced that any of the major participants are actually interested in using the suite of social, economic, and political tools available to lessen pornography's grip on American society. There is a level of unreality to almost all of the things written about the problem, especially on twitter. Sohrab Ahmari's declaration that "the Founding generation would likely have reacted to Pornhub not with high-libertarian nostrums, but with tar and feathers" is a perfect example of the phenomenon.[1] These anti-porn manifestos are not about taking Big Porn down; they are about racking up cheap rhetorical points against other factions of the right. The porn debate is fundamentally about this sort of symbolism; when all is said and done, nothing is at stake here but abstract philosophical ideas. Debates like this are pantomimes designed to showcase each party's vision of "the common good." They are very far removed from actually achieving any good, common or otherwise.

Why so many conservatives have latched onto this "common good" frame will be a topic for a future piece. Today it is enough to note that in the American system no vision of the "common good" will go anywhere unless said vision is actually common. 45% of the American public does not believe that viewing porn is wrong in any way whatsoever. There is no evidence a significant percentage of the remainder want to ban it entirely.

A second difficulty is that banning porn is simply hard to do. Consider the examples of various Asian nations that have tried their hand at the art. Japanese pornographers get around Japanese obscenity laws by filming their scenes with select organs blurred; South Korea's anti-porn legal regime simply shifted the focus of the industry from hardcore shorts to film-length softcore erotica (think Basic Instinct). I am sure South Korean legislators did not intend to create the world's premier soft-erotica media complex when they instituted these laws, but nevertheless this is what happened! The Chinese example is little better. There the draconian power of active state censorship has been unleashed on everyone on their side of the Great Firewall, but it has had little practical effect. There is hardly a Chinese man under 40 who is not a habitual viewer of pornography, and plenty of it is in Chinese.

So an outright ban on pornography is neither politically feasible nor practically possible. The best we can hope to do is try to create a more restrictive online environment than currently exists. This is the thesis of a piece I have out at the Washington Examiner, which explains what such an environment might look like:
A successful anti-pornography campaign will have to operate within these constraints. That will mean attempting to create a more restrictive online environment than now exists, one where it is substantially more difficult for children to get ahold of pornography, where only adults willing to pay for it can access it, and the industry is held liable for the abuses that it profits from. 
Not too long ago, such an environment was easy to imagine. In the late '90s and early aughts, the adult film industry was very different than it is today. The industry made most of its money through DVD sales or gated websites that required users to pay to access content. Individual performers could create their own sites and make millions through subscriptions, and free pornography was difficult to get ahold of. That world is no more. Performers labor as peons to an unethical global monopoly, and free pornography is everywhere. 
The change was largely the result of technology, and specifically, the ability of websites to host and stream video cheaply. The same developments that made YouTube possible made a host of “tube” pornography sites possible as well. Like YouTube, these websites host free, user-generated content, although in reality, much of their content is pirated from the gated sites.

The company responsible for this state of affairs is Mindgeek, which owns Pornhub and a host of other popular tube sites. In the late aughts, Mindgeek created a half-dozen tube sites and then used the ad profits it made from pirated material to buy out large but struggling studios. By the mid-2010s, the company had a vice grip on the entire industry. Performers now work at a fraction of the wages that they would have earned in 2000. They are filmed by Mindgeek-owned studios, have their performances released by Mindgeek-owned distributors, and then have the same films pirated and uploaded onto Mindgeek-owned tube sites.

The sheer evil of this entire process was put on display in 2019 by a court case filed against a Pornhub content channel named "Girls Do Porn." The channel's producers lied to the women in their videos, asking them to sign complicated, fine-print-filled contracts (some while drunk, others while still legally minors) that gave the producers the right to upload the finished product on “tube” sites, even as they told the girls involved that the scenes they were about to film would only appear in “DVDs for ‘private collectors’ in Australia and New Zealand.” The videos were instead uploaded to a channel that has racked up some 677 million total views. Mindgeek knew about the problem for months but would not remove the channel until the producers were indicted on sex trafficking charges. And although Mindgeek eventually took the “official” videos down, it still presides over a media ecosystem in which pirated copies of them will live on forever. Our task is not to ban adult material but to ban the business model that allows companies like Mindgeek to prosper. [2]

I think there is a lot to gain from shifting our attack from an industry to a business model. There are several reasons for this. For one thing, a world where the tube sites are gone and people must go back to paying for their porn is a significant improvement over the world we live in now. This world is possible: it existed two decades ago. Technological change is part of what happened, but only part. Just as important in the creation of the new, porn-flooded world we live in are the legal protections given to websites like Pornhub and xHamster, which allow them to dodge liability for the theft their business model is based on. These protections also allow them to dodge liability for much worse sins.

I submitted this piece to editors long before the BBC reported the story of Rose Kalemba, who was raped as a 14-year-old girl, and whose rapists uploaded the video to Pornhub with titles like "teen crying and getting slapped around" and "teen getting destroyed" for all the world to see. The most popular video had 400,000 hits—including hundreds from the poor girl's own school. It took more than a year of protest before Pornhub would remove the video (and even then only after being threatened with litigation).[3]

This story has outraged every person I have told it to. As it should! Our task is to help the broader world understand that incidents like this are an inevitable result of the 'tube' site business model and the legal protections corporations engaging in this model have been granted. Mindgeek faces no costs for hosting revenge porn and rape videos. Nor do they have the capability to keep such material off of their websites even if they wanted to—it would require policing millions of user-posted videos every month. But what if they were held legally liable for the pirated material on their sites? What if they had to pay damages for every instance of revenge porn, every rape video? One questions whether the free-porn model could survive.

It will be hard to build a coalition against porn. But a coalition against an exploitative group of businesses that are financially viable only because of theft, and whose manner of business enables revenge porn and turns a blind eye towards rape videos? That is another matter entirely. The allies we would find in such a fight might surprise you: one of the groups most damaged by the rise of the Mindgeek empire has been porn stars themselves. In a world where pornography was hidden behind paywalls, porn studios and porn stars made a lot more money than they do currently. They would prefer a more restrictive pornography regime—though most are afraid to say this openly, as their future in the Mindgeek ecosystem relies on staying on that company's good side.

For the rest of my thoughts on how to unseat Mindgeek and upend the existing porn ecosystem, I encourage you to read my full piece. It may be unpalatable for a certain sort of conservative to link arms with feminist activists and porn stars, but if done right we actually have a chance to limit porn's reach into our society.

If you would like to read some of my other takes on American political affairs, you might try "Questing for Transcendence," "The Title IX-ification of American Childhood," "The Problem Isn't the 'Merit,' It's the 'Ocracy'," or "On the Angst of American Journalists." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Sohrab Ahmari, "Porn Isn't Free Speech," New York Post (9 December 2019). This declaration was originally posted to twitter, but Ahmari has since deleted his entire twitter account.

[2] Tanner Greer, "Pornography Restriction For Realists," Washington Examiner (12 December 2019).

[3] Megha Mohan, "I was Raped at 14, and the Video Ended Up on a Porn Site," BBC (10 February 2020).

29 February, 2020

The Education China Hands Need, But Most Do Not Get

Today I came across an article three decades old, penned by Simon Leys in 1990. Leys is reviewing Laszlo Ladany's The Communist Party of China and Marxism, 1921-1985: A Self Portrait, a book I have not read but will pick up now. Ladany made his name publishing a newsletter that analyzed the goings-on of Communist China during a period in which China was largely closed off from the world. Leys explains Ladany's approach:
What made China News Analysis so infuriatingly indispensable was the very simple and original principle on which it was run (true originality is usually simple): all the information selected and examined in China News Analysis was drawn exclusively from official Chinese sources (press and radio). This austere rule sometimes deprived Ladany’s newsletter of the life and color that could have been provided by less orthodox sources, but it enabled him to build his devastating conclusions on unimpeachable grounds.

What inspired his method was the observation that even the most mendacious propaganda must necessarily entertain some sort of relation with the truth; even as it manipulates and distorts the truth, it still needs originally to feed on it...The analyst who wishes to gather information through such a process must negotiate three hurdles of thickening thorniness. First, he needs to have a fluent command of the Chinese language. To the man-in-the-street, such a prerequisite may appear like elementary common sense, but once you leave the street level, and enter the loftier spheres of academe, common sense is not so common any longer, and it remains an interesting fact that, during the Maoist era, a majority of leading “China Experts” hardly knew any Chinese. (I hasten to add that this is largely a phenomenon of the past; nowadays, fortunately, young scholars are much better educated.)

Secondly, in the course of his exhaustive surveys of Chinese official documentation, the analyst must absorb industrial quantities of the most indigestible stuff; reading Communist literature is akin to munching rhinoceros sausage, or to swallowing sawdust by the bucketful. Furthermore, while subjecting himself to this punishment, the analyst cannot allow his attention to wander, or his mind to become numb; he must keep his wits sharp and keen; with the eye of an eagle that can spot a lone rabbit in the middle of a desert, he must scan the arid wastes of the small print in the pages of the People’s Daily, and pounce upon those rare items of significance that lie buried under mountains of clichés. He must know how to milk substance and meaning out of flaccid speeches, hollow slogans, and fanciful statistics; he must scavenge for needles in Himalayan-size haystacks; he must combine the nose of a hunting hound, the concentration and patience of an angler, and the intuition and encyclopedic knowledge of a Sherlock Holmes.

Thirdly—and this is his greatest challenge—he must crack the code of the Communist political jargon and translate into ordinary speech this secret language full of symbols, riddles, cryptograms, hints, traps, dark allusions, and red herrings. Like wise old peasants who can forecast tomorrow’s weather by noting how deep the moles dig and how high the swallows fly, he must be able to decipher the premonitory signs of political storms and thaws, and know how to interpret a wide range of quaint warnings—sometimes the Supreme Leader takes a swim in the Yangtze River, or suddenly writes a new poem, or sponsors a ping-pong game: such events all have momentous implications. He must carefully watch the celebration of anniversaries, the non-celebration of anniversaries, and the celebration of non-anniversaries; he must check the lists of guests at official functions, and note the order in which their names appear. In the press, the size, type, and color of headlines, as well as the position and composition of photos and illustrations are all matters of considerable import; actually they obey complex laws, as precise and strict as the iconographic rules that govern the location, garb, color, and symbolic attributes of the figures of angels, archangels, saints, and patriarchs in the decoration of a Byzantine basilica.

To find one’s way in this maze, ingenuity and astuteness are not enough; one also needs a vast amount of experience. Communist Chinese politics are a lugubrious merry-go-round (as I have pointed out many times already), and in order to appreciate fully the déjà-vu quality of its latest convolutions, you would need to have watched it revolve for half a century. The main problem with many of our politicians and pundits is that their memories are too short, thus forever preventing them from putting events and personalities in a true historical perspective. For instance, when, in 1979, the “People’s Republic” began to revise its criminal law, there were good souls in the West who applauded this initiative, as they thought that it heralded China’s move toward a genuine rule of law. What they failed to note, however—and which should have provided a crucial hint regarding the actual nature and meaning of the move in question—was that the new law was being introduced by Peng Zhen, one of the most notorious butchers of the regime, a man who, thirty years earlier, had organized the ferocious mass accusations, lynchings, and public executions of the land reform programs.[1] (emphasis added).
Leys identifies four competencies needed to analyze the intentions and actions of the Communist regime:
  1. Fluency in Chinese 
  2. Stamina sufficient to plow through one jargon-filled document after another 
  3. The ability to 'decode' jargon into sensible, real-world meaning
  4. Sufficient historical knowledge to put official actions, campaigns, and the jargon associated with them in proper context
These same four skills are still the most important an analyst of Communist politics can possess. They empower an analyst to unearth the goals and intentions of the regime as they are communicated to the regime's own cadres. This is the only sure way to understand what the leadership of the Communist Party actually wants done and what measures it advocates to get it done. Yet with the exception of the first of these skills, there are very few ways for a budding "China hand" to become competent in any of them.

I will reserve a full defense of this text-based approach to the study of Communist Party politics for another day. [2] My main point today is this: even if you believe this approach is the right approach, self-study is about the only way to master it. Historians of modern China who focus on high-level politics learn something of the art, though they are chary to extend their analysis to the modern day. Those who come to the field through political science are even worse off. Political scientists are trained to model and hypothesis test. They search for general rules and underlying patterns; they face immense pressure to present findings relevant to the broader literature of their field (say, on "authoritarian resilience," "state building," or "nationalism"), and tend to favor what can be easily quantified over what cannot. What I advocate is different: simply reading Party documents and telling the rest of the world what they mean. Exegesis is not social science. It is unreasonable to expect social scientists to teach their students how to do this.

Yet it is still a skill that must be taught. Who will teach it?

I can identify about 12 people in the English speaking world who are very good at doing what I have described (for the curious: I am not one of them). Almost all of them work outside of academia. This makes sense: the art of figuring out what the Communists are talking about is a very practical craft, one with high demand in the world of practical affairs. Most of these people either came up in the old times when propaganda documents were the only way to understand anything that was happening inside China, or they learned their craft through a difficult course of self-directed trial and error. But even in the world of practice, these people are vastly outnumbered by those without this training. Thus we have dozens of China experts who can deliver a statistical analysis of the universe of 20th century authoritarian regimes but cannot tell you what Party leaders mean when they say the regime is threatened by "political gaps" and "high-level blacks," or who can model US-China relations with dollops of game theory but cannot explain the significance of a fiery People's Daily editorial written under the byline "Zhong Sheng."

The people who have never heard the phrase "high-level black" or "Zhong Sheng" are not stupid. They could certainly learn these things if they knew they should: it would only take ten minutes or so of reading to understand the significance of both terms.[3] I suspect that the greater part of what you need to know to interpret Communist documents could be taught in a single graduate-level course or a month-long intensive methods training program. But those do not exist. Neither do the books or glossaries that would make it easy to learn these things through self-study. This is an unfortunate reality. It is perhaps the chief bottleneck that keeps America from attaining a true and thorough understanding of the Communist Party of China.

If you are interested in other things I have written about China's Communists, you might also find the posts "A Note on Historical Nihilism," "Xi Jinping Explains His Political Philosophy," and "Reflections on China's Stalinist Heritage, Parts I and II" of interest. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Simon Leys, "The Art of Interpreting Nonexistent Inscriptions Written in Invisible Ink on a Blank Page," China File (Or. published in the October 11, 1990 issue of The New York Review of Books).

[2] For convincing defenses see Peter Mattis, "The Party Congress Test: A Minimum Standard For Analyzing China's Intentions," War on the Rocks (8 January 2019); Geremie R. Barmé, “New China Newspeak,” China Heritage, http://chinaheritage.net/archive/academician-archive/geremie-barme/grb-essays/china-story/new-china-newspeak-新华文体; Timothy Heath, "What Does China Want? Discerning the PRC's National Strategy," Asian Security Vol 8, Iss 1 (2012), esp pp. 55-58; Paul Godwin and Alice Miller, China’s Forbearance Has Limits: Chinese Threat and Retaliation Signaling and Its Implications for a Sino-American Military Confrontation (Washington DC: National Defense University Press, 2013), 29-37; John Garver, China’s Quest: The History of the Foreign Relations of the People’s Republic of China (Oxford: Oxford University Press, 2016), passim.

[3] Graham Webster, "Who Speaks for the Chinese Government," SupChina (20 January 2017); Guan Hai and Wei Lu, "'Low-Level Red' and Other Concerns," China Media Project (11 March 2019).

28 February, 2020

The New England Colonies: A History Decided by Culture, or by Ecology?

Over at Gene Expression, Razib Khan has up an interesting post that compares and contrasts the genetics of South Africa's Afrikaner population with New England whites:
Afrikaner ancestry is overwhelmingly Northern European. But as you see in the PCA above they are notably African and Asian shifted when compared to their potential ancestral populations (I used Dutch and German individuals above). For me this is the part [of the study] that is important, if not surprising:

The individual with the most non-European admixture had 24.9% non-European admixture, and only a single Afrikaner individual (out of 77) had no evidence of non-European admixture…Amongst the 77 Afrikaners investigated, 6.5% had above 10% non-European admixture, 27.3% between 5 and 10%, 59.7% between 1 and 5% and 6.5% below 1%.

So about 87% of Afrikaners in their sample had between 1 to 10 percent non-European ancestry. As suggested by genealogical evidence, genetics indicates this is a relatively recent admixture, occurring during the 17th and 18th-century. The early decades of the Cape Colony. It’s a mix of diverse Asian and African components. In some ways, it seems that the non-European ancestry in modern Afrikaners is just the same phenomenon which gave rise to the Cape Coloured population, which is a mix of European, Asian (Indian and Austronesian) and African (Bantu and Khoisan).

But, this result is more interesting in light of how it contrasts with another case. Also in the 17th-century, there emerged another European settler society on the edge of a vast ocean rooted in a deeply Calvinist faith. By this, I mean the colonies of New England. Though New England has been reshaped by later migrations, between 1640 and 1790 30,000 English settlers expanded and grew into a region with 750,000 Americans. In the early 19th-century, New England spilled out over much of the northern swath of the United States of America, in part due to the fact that the fertility of New Englanders was quite high (the early Mormons were fundamentally a New England-derived subculture).

And yet unlike the Afrikaners or the whites of Latin America, the scions of New England have no non-European ancestry. One might argue here that this is due to the lack of opportunity, as the number of slaves in New England was always very low, and there were no native peoples. King Philip’s War falsifies the latter contention. There were numerous native people. At least initially. But the New Englanders were very efficient and effective at marginalizing and exterminating the native peoples of the region. To a far greater extent than occurred in the South.[1]

Razib suggests that the almost nonexistent level of admixture between the English settlers in New England and their counterparts in Africa can be attributed to two things:

  1. The New Englanders came over from England as families and congregations, with a favorable male to female ratio
  2. The New Englanders had a confident culture ready to assert itself as itself from the beginning (or as Razib puts it, "by the latter portion of the 18th-century New England was unique because it was beginning to see itself as not just a complement of the metropole, but a potential rival").[2]
While Razib takes an angle informed by population genetics, the general observation that the English colonists in general and the New England settlers in particular had much lower intermarriage rates with indigenous peoples than colonists from other places is not new. One of the most interesting hypotheses for why this is so is put forward in Alfred Crosby's Ecological Imperialism: The Biological Expansion of Europe, 900-1900. If you have never read this book before I strongly recommend it—Crosby was a talented prose writer and a thoughtful historian. I blame the lack of attention the book has received outside of historical circles on its wordy and political sounding title. This is a shame. Most of the good ideas you find in Jared Diamond's Guns, Germs, and Steel actually come from Crosby, though Crosby is both the better historian and the finer writer.

The question Crosby sets out to answer in Ecological Imperialism is this: why did European colonists almost completely displace native cultures and populations in Australia, New Zealand, North America, and Argentina, create mestizo populations and cultures in Central America, the Caribbean, the Andes, and coastal Brazil, but fail to even make a dent in most of Africa, interior Brazil, or Asia? For Crosby, the answer to all of these questions can be found by looking at the weeds. 

A wild fact: the majority of the wild flora in the San Joaquin Valley did not exist in California two centuries ago. For most of temperate Canada and the United States, Australia, New Zealand, and Argentina the ratio of invasive to indigenous flora is similar, ranging from 30-60% of the flora in the places most of the population lives. The spread of disease from the Old World to the New is now quite famous: less famous is the spread of European weeds (dandelions, Kentucky bluegrass, nettles, white clover, etc.), pests (rats, roaches, houseflies, etc.), and other disruptive species (earthworms, pigs, peach trees, and so forth) that transformed the landscape into something decidedly more European. Most interesting of all for the original question: the displacement of entire local biomes by European biota happened only in those places where the colonists displaced the local people.

Crosby suggests the first caused the second. The Europeans' biological footprint in Africa—outside of the Mediterranean climate of the Cape—is nearly nonexistent. African diseases killed European crops, animals, and colonists. European weeds, which had an advantage in North American biomes whose plants had not faced grazing herds since the ice age, were outclassed by African grasses that evolved next to an even larger contingent of grazers than found in Europe. In Mexico and the Andes the environment was not entirely hostile to the Europeans—but the climate was different enough from Europe, and the local biota resilient enough in the face of European imports, that Spanish colonists could not transplant their entire mode of agriculture to Mexico wholesale (as they did in Argentina). The colonists in these areas needed indigenous crops and styles of farming to survive. They depended on the indigenous farmers for those crops. Eventually, they would intermarry with them. [3]

Of all the "New Europe" zones discussed, the New England colonists had the fewest hiccups in setting up an Old World society in the New. The New Englanders famously faced far lower rates of disease than settlers in other parts of the Americas, and within a generation of the founding were living generally healthier lives than the English at home.[4] They were also among the most resistant to adopting local sources of food. Explains David Hackett Fischer:
"The Puritans of Massachusetts created one of the more austere food ways in the Western world. For three centuries, New England families gave thanks to their Calvinist God for cold baked beans and stale brown bread, while lobsters abounded in the waters of Massachusetts Bay and succulent game birds orbited slowly overhead... the coastal waters of New England teemed with mussels, oysters, lobsters, and clams. The rivers were choked with salmon and shad. Wild fowl flourished in abundance. Native delicacies such as glasswort sprouted along the seashore and fiddleheads carpeted the woodlands. 

The Puritans showed little interest in these delights except when driven by hunger to consume them. Shellfish was regarded with grave suspicion. Shad roe, a gourmet's delight, was used as a fertilizer. In the first year John Winthrop complained when he was compelled to eat oysters and wild duck instead of the staples of old England. "My dear wife," he wrote, "we are here in paradise though we have not beef and mutton."[5]
That was true in 1630, when Winthrop wrote those words. It was not true thirty years later, when the New Englanders had successfully transplanted the entirety of their agricultural system to their new home. Their success here is my favored explanation for why their marriages with non-colonists were so few.

If thoughts on "deep history" are your thing, you might also like the posts "Notes on the Dynamics of Human Civilization," "Geography and Chinese History," "China Was Never an Empire of the Mind," "History is Written by the Losers," "Vengeance as Justice" and "A Tour Through Three Centuries of American Political Culture." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Razib Khan, "Afrikaner Genetics Show How Unique New England Culture Is," Gene Expression (27 February 2020).

[2] Ibid.

[3] This question is posed most clearly in Alfred Crosby, Ecological Imperialism: The Biological Expansion of Europe, 900-1900 (Cambridge: Cambridge University Press, 1986), 145-148; for his answers see 148-195; 269-304. Another famous book that I have not read is also relevant here: William Cronon, Changes in the Land: Indians, Colonists, and the Ecology of New England (New York: Hill and Wang, 1985).

[4] Bernard Bailyn, The Barbarous Years: The Peopling of British North America: The Conflict of Civilizations, 1600-1675 (New York: Random House, 2012), 425-426.

[5] David Hackett Fischer, Albion's Seed: Four British Folkways in America (Oxford: Oxford University Press, 1989), 135-137.

26 February, 2020

Losing Taiwan Means Losing Japan

Image Source
The United States could bounce back from the fall of Taiwan to Communist rule. It would have far more dire consequences for Japan. Consider this post a short, informal primer on why this is so.

Ian Easton explains the PLA's view:
The Course Book on the Taiwan Strait's Military Geography is a restricted-access PLA manual, used to teach senior officer seminars in Beijing… This source [informs] readers that Taiwan is a chokepoint of great utility for blockading Japan. The Taiwan Strait, it notes, is a Japanese maritime lifeline that runs from Europe and the Middle East, and based on PLA studies, Japan receives 90 percent of its oil imports, 99 percent of its mineral resources, and 100 percent of its nuclear fuel needs from ships that travel across these sea lanes. In total, 500 million tons of Japanese imports pass by Taiwanese waters each year, with 80 percent of all Japan’s container ships traveling right through the Strait, the equivalent of one Japanese cargo ship every ten minutes. Consequently, these waters will, “directly affect Japan’s life or death, its survival or demise.”

PLA intentions and plans for a conquered Taiwan are made plain in another internal document, The Japanese Air Self Defense Force, a handbook studied by mid-career officers at the PLA Air Force Command College in Beijing. The stated purpose of the text is to help Chinese pilots and staff officers understand the strengths and weaknesses of their Japanese adversaries. Buried amidst hundreds of pages of detailed maps, target coordinates, organizational charts, weapons data, and jet fighter images are the following lines:

As soon as Taiwan is reunified with Mainland China, Japan's maritime lines of communication will fall completely within the striking ranges of China's fighters and bombers...Our analysis shows that, by using blockades, if we can reduce Japan's raw imports by 15-20%, it will be a heavy blow to Japan's economy. After imports have been reduced by 30%, Japan's economic activity and war-making potential will be basically destroyed. After imports have been reduced by 50%, even if they use rationing to limit consumption, Japan's national economy and war-making potential will collapse entirely...blockades can cause sea shipments to decrease and can even create a famine within the Japanese islands. [1]
The first PLA document Easton quotes here has the statistics slightly wrong: the larger part of Japan's energy imports travel to the south of Taiwan through the Bashi Channel, in the Luzon Strait.[2] To get a sense for what those shipping lanes look like, here is a map of A.P. Moller-Maersk, Mediterranean Shipping Co., and CMA CGM SA's Japan-bound shipping routes:

Image Source
The Luzon Strait, you will notice, also runs directly adjacent to Taiwan. Chinese control of Taiwan would—in the event of conflict—force Japanese shipping out of the South China Sea entirely. This in itself is not a death blow: at some cost, sea traffic that now passes through Malacca and runs adjacent to Taiwan could be rerouted through the Sunda Strait and up the east coast of Mindanao. I am sure someone in Japan must have calculated the likely economic costs of rerouting Japan-bound traffic this way (or in a more extreme circumstance, replacing Middle Eastern energy supplies with North American ones) but I have not yet seen any actual numbers. But given alternate sea lane possibilities, I doubt clearing Japanese shipping out of Taiwanese waters entirely would be enough to threaten Japan with "famine."

But the problem posed by Chinese control of Taiwan is not really limited to the shipping that passes through the Taiwan and Luzon Straits. Navalists like to talk about what they call the "First Island Chain," a group of islands that keeps the PLA Navy and PLA Air Force hemmed into the East and South China Seas. These islands include the Philippine Archipelago, Taiwan and the Pescadores, the Japanese Archipelago, and the Ryukyu Islands, which are Japanese territory. Here is a map of that last group:

Image Source
 In times of peace there is little to stop Chinese naval and air forces from crossing out into the Pacific as they wish, but in times of war things will be different. Over the last few years the Japanese have been quietly stocking these islands with anti-ship and anti-air missile units; were war imminent these deployments would grow. It is very difficult to imagine a significant number of Chinese commerce raiders slipping out to prey on Japanese shipping outside the Taiwan Strait as long as they have to slip between hostile Japanese and Taiwanese island bastions.  It is very easy to imagine this if the Taiwanese side of the equation is no longer hostile to Chinese forces.

This is true for several reasons. One of the more interesting ones involves submarines. Look again at the image at the top of this post; that is a seafloor depth map of the West Pacific. You will notice that the water east of the first island chain is much deeper than the water west of it. This has very practical implications for submarine warfare. The prime reason the Chinese built their most important submarine base in Sanya is that it allows the submarines harbored there to slip into the deeper waters of the South China Sea, where detection is far more difficult. For China, this is the cornerstone of a credible seaborne nuclear "second strike." If the Chinese had direct access to the western Pacific—the kind of access possession of Taiwan would give them—their nuclear armed submarines could roam freely across the globe. It would also make detecting and tracking submarines tasked with commerce raiding far more difficult.

The loss of Taiwan would also call into question Japan's ability to hold and defend the Ryukyu Islands altogether. Yonaguni, at the tail end of the Ryukyu chain, is less than 70 miles away from Taiwan's east coast. That is almost one fourth the distance between the island and the Chinese coast (approx. 250 miles), and one fifth the distance between the island and Okinawa (330 miles). Okinawa itself is closer to Taiwan's north coast (approx. 370 miles) than it is to the Japanese Archipelago proper (approx. 480). If Taiwan were in hostile hands, Japan would be fatally vulnerable to an island hopping campaign that would rob it of the ability to control its near sea lanes.[3]

Taiwan is the keystone of China's naval containment. Lose Taiwan, and Japan loses the ability to keep the PLA Navy hemmed up against their own coast line. Lose Taiwan, and Japan loses control of its most important supply lanes. Lose Taiwan, and Japan loses the extended island chain defense system that protects its home waters. 

Japanese naval leaders understand this. They always have. It is why the Imperial Japanese Navy insisted upon Taiwan's annexation in 1895, and it is why Taiwan contingencies have been an important part of the Self Defense Force's thinking since the 1950s.[4] They understand—even if most Japanese civilians do not—that the loss of Taiwan would give the Chinese incredible leverage over Japan.

There are some who believe that America could retreat from the defense of Taiwan while keeping the rest of its alliance system in the Far East intact. This is a fantasy. An argument to retreat from Taiwan is an argument to fatally undermine the defense of Japan. In truth, it is an argument to retreat from East Asia. That argument can be made, but I would prefer to see it made openly. 

As a final note: while preparing this post I came across a map of all of the currently existing submarine cables that run next to Taiwan. You will notice the great number that run through the Luzon Strait:

A decade ago an undersea earthquake in the strait knocked out the internet in Taiwan, Japan, South Korea, and Eastern China. I will admit that I do not know how easy it would be to isolate the cables headed towards Japan and knock them out of service, but I am interested in finding out. If you work in that industry or have expertise in undersea infrastructure, please sound off in the comments!

If you found this analysis of Taiwanese and Japanese military affairs of  interest, you might also like the posts "Taiwan Can Win a War With China," "Why Taiwanese Leaders Put Political Symbolism Above Military Power," "Taiwan Will Be Defended by the Bullet or Not At All," and "At What Point is Defending Japan No Longer Worth It." To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] Ian Easton, The China Invasion Threat: Taiwan's Defense and American Strategy in East Asia (Washington DC: Project 2049 Institute, 2017), 27-28.

[2] Euan Graham, Japan's Sea Lane Security: A Matter of Life and Death? (Nissan Institute/Routledge Japanese Studies, 2006), pp. 23-26.

[3] The Senkaku Islands are also easier to assault from Taiwan than anywhere else. However, I tend to agree with Todd Hall's assessment that the Senkaku island dispute is more about the symbolics of honor than strict military utility. See Hall, "Why the Senkaku/Diaoyu Islands Are Like a Toothpaste Tube," War on the Rocks (4 September 2019).

[4] E. I. Chen, "Japan's Decision to Annex Taiwan: A Study of Ito-Mutsu Diplomacy, 1894-95," The Journal of Asian Studies (1977), vol 37, iss. 1, pp. 65-67; Graham, Japan's Sea Lane Security, passim.

07 February, 2020

A Note on the Romney Vote

image source.
But if Greatness be so blind
As to trust in towers of air
Then let it be with goodness lined
That at least the fall be fair.
–Sir Henry Wotton

Senator Mitt Romney's vote to convict President Trump has the chattering classes all in a titter. Romney is being called a man of magnificent character, a profile in courage, a 21st century reincarnation of Sir Thomas More. This is all a bit overblown. For Romney, the costs of integrity are small. Mitt Romney is a 72-year-old man. He does not face voters again until 2024. He is rich. He has five children who love him dearly, and they have borne him almost thirty grandchildren. Romney could retire tomorrow with full knowledge that his life of service has created friends and followers who are truly devoted to him, regardless of his (or their) political position. His funeral will be full. Had his GOP enemies the power to strip Romney of his senatorial office, they would be doing him a favor. He would then be allowed to spend the twilight of his mortal life as old men ought: in the warm embraces of fellowship and family. What more could Romney ask for?

Nothing more.

Romney knows this. Romney has been around politics long enough to know false friends from true; he knows that the accolades and acclaim directed his way today come from poisoned pens. The sweet words of talking heads hold little weight; they are given by the same men and women who undeservedly savaged him in years past, and who will just as viciously attack him when the next confirmation vote for supreme court justice rolls around (this man voted for Brett Kavanaugh, you will remember, with the same clear conscience with which he voted against Donald Trump). Romney's impeachment vote made permanent enemies, but only flighty, fair-weather friends.

But why should Romney care?

Mitt Romney is not in the friends-making business. The Senator already has those. I served as a missionary in the same Massachusetts congregations that Mitt Romney once presided over as bishop and stake president. From members there I heard stories of Romney's past. Some told tales of incredible generosity on Romney's part. The gratitude and loyalty these people felt towards their old leader ran deep. These people could not give a flying flip for Romney's politics. He could sign up as a card-carrying member of the Democratic coalition tomorrow and they would still love him. Their love transcends political squabbling.

In recent years the concept of "FU Money" has gained some currency. Mitt Romney's vote provides us with an alternate conception: the FU Community. Mitt Romney can afford to burn his bridges with his party friends in Washington because those friends in Washington are not the only friends he has got. In times of crisis or need he has other, stronger, less mercenary networks to fall back on. Let CPAC spurn him. Let them say what they will: at the end of the day he still has 30-some adoring grandchildren to dote on, and a community of fellows and followers that only a life of charity could create. 

This is the lesson we should be taking away from all this. Romney's vote was not especially courageous. If anything, given what Romney believes and the privileges he enjoys, it would have been cowardly for him to vote any other way. Mitt Romney's vote was a product of Mitt Romney's life. Romney is a man with no political principles but sterling personal ones. He prizes people over programs; his conduct is guided by personal kindness, not political platforms. This sort of leadership has its weaknesses, but this week we saw its strengths. This is the neat thing about a huge family and a lifetime of service: it empowers you to look at the world, face its pressures, and say, “Nah, this time I will follow my conscience after all.”

04 February, 2020

Washington DC Meet-Up

A few announcements for the blog readership.

First of all, my latest "Notes From All Over" round-up post (basically, a round-up of the best reports, studies, essays, etc. I read over the last month) has been posted to Patreon. For the last six months or so I have moved these posts over to Patreon as a reward for those whose support makes The Scholar's Stage's continued existence possible.

Second, I am planning on having a meet-up for blog readers who live in the Washington DC metro area later this month. This meet-up is open to all blog readers, be they Patreon supporters or not. It will be held on February 28th, 2020 at 5:00 PM. As with past meet-ups it will probably last several hours; coming late or leaving early is fine. I have not chosen a venue as I do not yet know how many people will be coming. If you are interested in attending, please send me an e-mail with your contact information. Once I have a solid head count, I will inform everyone who has e-mailed me with information on the venue. Of course, if you have a suggestion for a good venue for the meet-up, I encourage you to recommend it to me.

29 January, 2020

Public Intellectuals Have Short Shelf Lives—But Why?

Several months ago someone on twitter asked the following question: which public thinker did you idolize ten or fifteen years ago but have little intellectual respect for today? [1] A surprising number of people responded with "all of them." These tweeters maintained that no one who was a prominent writer and thinker in the aughts has aged well through the 2010s.

I am not so harsh in my judgments. There are a few people from the last decade that I am still fond of. But the problem is inevitable. This is not a special pathology of the 21st century: when you read intellectuals of the 1910s talking about the most famous voices of the 1890s and early 1900s you get the same impression. You even get this feeling, in more diluted form, when you look at the public writing of the Song Dynasty or Elizabethan England, though the sourcing for those eras is spottier and there was no 'public' in the modern sense for an individual living then to intellectualize to. But the general pattern is clear. Public intellectuals have a shelf life. They reign supreme in the public eye for about seven years or so. Most who loiter around longer reveal themselves as oafish, old-fashioned, or ridiculous.

To give you a sense of what I mean by this, consider the career of a public intellectual who peaked in the early aughts. Thomas Friedman is now the butt of a thousand jokes. He maintains his current position at the New York Times mostly through force of inertia, and secondarily through his excellent connections within the Davos class and his sterling reputation among those who think as that class does. But this was not always so. Let us review Friedman's climb to prominence:

Thomas Friedman earned his BA in Mediterranean Studies in 1975; a few years later he obtained a prestigious Marshall scholarship to study at Oxford, where he earned a Master's in Middle Eastern Studies. By age 26 he was a reporter in Beirut, and at age 29 he had won his first Pulitzer (for up-close reporting on a war massacre). He would win another Pulitzer as the New York Times' bureau chief in Jerusalem, and at age 36 would write his first award-winning book, From Beirut to Jerusalem, a recapitulation of his years of reporting in those two cities. This put Friedman at the top of the "Middle East hand" pack. That is a nice place to be, but it is still far away from the position of household public intellectual.

To get there Friedman would first transition to reporting from Washington DC as a White House correspondent. A few years later (now at age 41) he would be given a foreign affairs column at the New York Times, moving him a step further into the opinion business. I attribute his transformation from minor public commentator to Voice of the Zeitgeist to two events: first, the publication of The Lexus and the Olive Tree in 1999 (when he was 46 years old), the first of several books that would lay out his theory of globalization; second, the terrorist attacks of September 11th, which allowed him to write columns that drew on both his long personal experience in the Middle East and his newer interest in globalization. These were the columns that won him his Pulitzer for commentary in 2002 and made him a central voice in the debates over America's response to the terrorist attacks and the invasion of Iraq. I place Friedman's peak in his 52nd year, when his most famous book, The World is Flat, was published. It was also around this time that opposition to Friedman was at its peak, with bloggers and columnists alike writing long diatribes against him.

Friedman would close out the decade with another book and three documentaries. These were mostly restatements of his columns (which in turn drew heavily from ideas he first introduced and developed between Lexus and The World is Flat). Friedman was still a part of the national conversation, but his perspective had lost its originality. His columns began to bleed together. This is the era when "Friedman Op-Ed Generators" went viral. Increasingly, Friedman was not argued against so much as joked about. By 2013 or so (just as he was turning 60) Thomas Friedman was done. Not technically so—between then and now he would rack up two more books, hundreds of columns, and heaven knows how many appearances at idea festival panels and business school stages. But intellectually Friedman was a spent force. His writing has been reduced to rehashing old rehashes, his columns the rewarmed leftovers of ideas grown old a decade ago. It is hard to find anything in his more recent books or columns that has mattered. He is able to sell enough books to live comfortably, but you will have difficulty finding anyone under 50 who admits they have read them. Friedman lingers still as a public figure, but not as a public intellectual. His thinking inspires no one. The well has run dry.

But why?

The easy answer is that the world of 2019 is not the world of 2002. What seemed compelling at the turn of the millennium is not compelling now. A man whose worldview has not budged in two decades has nothing to say to a world that has changed tremendously in that same time. But this answer is not really sufficient. It is hard to remember now, but there was once a time when the insights of Thomas Friedman seemed fresh and strikingly original. That his ideas seem so banal and obvious today is in many ways a measure of how successful he was at popularizing them in the early 2000s. The real question to answer is this: why are so many public intellectuals capable of generating insight, originality, or brilliance at the beginning of their careers, but utterly incapable of fresh thinking a decade later?

Let me offer two hypotheses. One is psychological, the other sociological.

Analytic brilliance is not constant over the course of life. Both general intelligence and more nebulous measures of creativity have clear peaks over the course of a lifespan. Here is how one textbook describes research on this question (I've taken out the parenthetical references to various source studies for ease of reading):
In most fields creative production increases steadily from the 20s to the late 30s and early 40s, then gradually declines thereafter, although not to the same low levels that characterized early adulthood. Peak times of creative achievement also vary from field to field. The productivity of scholars in the humanities (for example, that of philosophers or historians) continues well into old age and peaks in the 60s, possibly because creative work in these fields often involves integrating knowledge that has crystallized over the years. By contrast, productivity in the arts (for example, music or drama) peaks in the 30s and 40s and declines steeply thereafter, because artistic creativity depends on a more fluid or innovative kind of thinking. Scientists seem to be intermediate, peaking in their 40s and declining only in their 70s. Even within the same general field, differences in peak times have been noted. For example, poets reach their peak before novelists do, and mathematicians peak before other scientists do.

Still, in many fields (including psychology) creative production rises to a peak in the late 30s and early 40s, and both the total number of works and the number of high quality works decline thereafter. This same pattern can be detected across different cultures and historical periods....

What about mere mortals? Here researchers have fallen back on tests designed to measure creativity. In one study, scores on a test of divergent thinking abilities decreased at least modestly after about age 40 and decreased more steeply starting around 70. It seems that elderly adults do not differ much from young adults in the originality of their ideas; the main difference is that they generate fewer of them. Generally then, these studies agree with the studies of eminent achievers: creative behavior becomes less frequent in later life, but it remains possible throughout the adult years.[2]
I suspect the underlying mechanism behind this pattern is brain cell loss. Neuroscientists estimate that the average adult loses around 150,000 brain cells a day; in the fifty years that follow the end of brain maturation (ca. years 25-75), the average brain will lose somewhere between 5-10% of its neurons.[3] Fluid intelligence begins declining in a person's 30s.[4] This implies that most humans reach their peak analytic power before 40. Crystallized intelligence holds out quite a bit longer, usually not declining until a person's 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts—a lifetime of knowledge—into one sweeping narrative.

Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person's 30s: these are the years in which they have already gained the training necessary to make a real contribution to their chosen field but have not lost enough of their fluid intelligence to slow down creative work. By a person's mid-40s this period is more or less over. The brain does not shut down creativity altogether once you hit 45, but originality slows down. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.

Recognizing this helps us make sense of many interesting aspects of human social life. I think often about Vaisey et al.'s 2016 study, which demonstrated that most shifts in social attitudes occur not through change in attitudes at the individual level, but through intergenerational churn.[5] Old attitudes die because the generations that hold them literally die off. Such is the stuff of progress and disaster.

Such is also the problem of the public intellectual. A public intellectual's formative insights were developed to explain the world he or she encountered during a specific era. Eras pass away; times change. It is difficult for the brain to keep up with the changes.

Not impossible, just hard. And this brings my second, sociological explanation into play. There are things that a mind past its peak can do to make the most of what analytic and creative power it still has. But once a great writer has reached the top of their world, they face few incentives to do any of these things.

Consider: Thomas Friedman's career began as a beat reporter in a war zone. He spent his time on Lebanese streets talking to real people in the thick of civil war. He was thrown into the deep end and forced to swim. The experiences and insights he gained doing so led directly to many of the ideas that would make him famous a decade later.

In what deeps does Friedman now swim?

We all know the answer to this question. Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?

More importantly: What incentive does he have to live any other way?

I have noticed that historians who transition from the role of academic scribbler to famed public voice follow a sort of pattern. Their first published work might be a monograph, perhaps a PhD thesis turned book. It will be on some narrow topic no sane person cares about, the product of months spent in one archive in one location. U.S.-British trade relations in the 1890s, perhaps, or state-led cultural imperialism in Japanese Manchuria. They may repeat this feat again, but at some point they transition to something broader—now they are writing a global history of trade regimes under the gold standard, or of empire building in the whole Greater East Asia Co-Prosperity Sphere. This work will be a brilliant, field-defining piece of scholarship, lauded (or resented) by other luminaries of their sub-discipline, read by scholars and interested laymen alike. That book will be published by an academic press; the next will be aimed at popular audiences. Our historian has now graduated fully to the role of public thinker: her next book will be on the dangers posed by trade wars writ large, or on the nature of modern imperialism. This title will be reviewed in all the famous magazines; people who have never read it will argue about it on twitter. And then everything starts to fall apart.

The trouble is that just as our historian reaches her full stature as a public name, her well of insight begins to run dry. A true fan of her works might trace elements of their name-making title back to the very first monograph she published as a baby academic. She was able to take all of the ideas and observations from her early years of concentrated study and spin them out over a decade of high-profile book writing. But what happens when the fruits of that study have been spent? What does she have to write about when she has already applied her unique form of insight to the problems of the day?

Nothing at all, really. Historians like this have nothing left to fall back on except the conventional opinions common to their class. So they go about repackaging those, echoing the same hollow shibboleths you could find in the work of any mediocrity.

You see this pattern recur again and again in the op-eds of our nation. A once-bold foreign correspondent whose former days of derring-do have already been milked for more than they are worth, a Nobel laureate two decades removed from the economic papers that gave him acclaim, a nationally known historian who has not stepped into an archive since graduate school—the details change but the general pattern is the same. In each case the intellectual in question is years removed from not just the insights that delivered fame, but the activities that delivered insight.

The tricky thing is that it is hard to go back to the rap and scrabble of real research when you have climbed so high above it. Penguin will pay you a hefty advance for your next two hundred pages of banal boilerplate; they will not pay you for two or three years of archival research on some narrow topic no one cares about. No matter that the process of writing on that narrow topic refills the well, imbuing you with the ideas needed to fill out another two decades of productive writing. The world is impatient. It does not have time to wait for you to reinvent yourself.

There are practical implications for all this. If you are an intellectual, the sort of person whose work consists of generating and implementing ideas, then understand you are working against time. Figure out the most important intellectual problem you think you can help solve and make sure you spend your thirties doing that. Your fifties and sixties are for teaching, judging, managing, leading, and dispensing wisdom. Your teens and twenties are for gaining skills and locating the problems that matter to you. Your thirties are for solving them.

Public intellectuals who do not wish to transition in their forties from the role of thinker to that of mentor or manager are going to have a harder time of it. Optimizing for long term success means turning away from victory at its most intoxicating. When you have reached the summit, the time has come to descend, and start again on a different mountain. There are plenty of examples of this—Francis Fukuyama comes to mind as a contemporary one—but it is the harder path. For some, this will be a path worth taking. For others, wisdom is found in ceding the role of public intellectual to younger upstarts and moving to more rewarding positions guiding the next generation of intellectual lights.

If you would like to read some of my other jottings on psychology, you may find the posts "Historians, Fear Not the Psychologist," "Public Opinion in Authoritarian States," and "Taking Cross Cultural Psychology Seriously" of interest. If writing on intellectual life is more up your alley, consider "Questing for Transcendence," "Book Notes--Strategy, a History," "I Choose Hannah Arendt," and "On the Angst of American Journalists" instead. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.

[1] I've forgotten who, and did not bother saving the tweet—if you know who it is, sound off in the comments.

[2] Carol Sigelman and Elizabeth Rider, Lifespan Human Development, 6th ed (Belmont, CA: Wadsworth Learning, 2009).

[3] John E Dowling, Understanding the Brain: From Cells to Behavior to Cognition (New York: W. W. Norton & Company, 2018).

[4] John Horn and Raymond Cattell, "Age differences in fluid and crystallized intelligence," Acta Psychologica (1967), vol 26, 107-129. For a very strong counter-statement that argues this fluid v. crystallized distinction does not match the complexity of the data, see Joshua Hartshorne and Laura Germine, "When Does Cognitive Functioning Peak? The Asynchronous Rise and Fall of Different Cognitive Abilities Across the Life Span," Psychological Science (2015), vol 26, iss. 4, 433-443.

[5] Stephen Vaisey and Omar Lizardo, "Cultural Fragmentation or Acquired Dispositions? A New Approach to Accounting for Patterns of Cultural Change," Socius: Sociological Research for a Dynamic World (2016), vol 2.