Several months ago someone on twitter asked the following question: which public thinker did you idolize ten or fifteen years ago but have little intellectual respect for today?  A surprising number of people responded with "all of them." These tweeters maintained that no one who was a prominent writer and thinker in the aughts has aged well through the 2010s.
I am not so harsh in my judgments. There are a few people from the last decade that I am still fond of. But the problem is inevitable. This is not a special pathology of the 21st century: when you read intellectuals of the 1910s talking about the most famous voices of the 1890s and early 1900s, you get the same impression. You even get this feeling, in more diluted form, when you look at the public writing of the Song Dynasty or Elizabethan England, though the sourcing for those eras is spottier and there was no 'public' in the modern sense for an individual living then to intellectualize to. But the general pattern is clear. Public intellectuals have a shelf life. They reign supreme in the public eye for seven years or so. Most who loiter around longer reveal themselves to be oafish, old-fashioned, or ridiculous.
To give you a sense of what I mean by this, consider the career of a public intellectual who peaked in the early aughts. Thomas Friedman is now the butt of a thousand jokes. He maintains his current position at the New York Times mostly through force of inertia, and secondarily through his excellent connections within the Davos class and his sterling reputation among those who think as that class does. But this was not always so. Let us review Friedman's climb to prominence:
Thomas Friedman earned his BA in Mediterranean Studies in 1975; a few years later he obtained a prestigious Marshall Scholarship to study at Oxford, where he earned a master's in Middle Eastern Studies. By age 26 he was a reporter in Beirut, and at age 29 he had won his first Pulitzer (for up-close reporting on a war massacre). He would win another Pulitzer as the New York Times' bureau chief in Jerusalem, and at age 36 would write his first award-winning book, From Beirut to Jerusalem, a recapitulation of his years of reporting in those two cities. This put Friedman at the top of the "Middle East hand" pack. That is a nice place to be, but it is still far away from the position of household-name public intellectual.
To get there Friedman would first transition to reporting from Washington, DC as a White House correspondent. A few years later (now at age 41) he would be given a foreign affairs column at the New York Times, moving him a step further into the opinion business. I attribute his transformation from minor public commentator to Voice of the Zeitgeist to two events: first, the publication of The Lexus and the Olive Tree in 1999 (when he was 46 years old), the first of several books that would lay out his theory of globalization; second, the terrorist attacks of September 11th, which allowed him to write columns that drew on both his long personal experience in the Middle East and his newer interest in globalization. These were the columns that won him his Pulitzer for commentary in 2002 and made him a central voice in the debates over America's response to the terrorist attacks and the invasion of Iraq. I place Friedman's peak in his 52nd year, when his most famous book, The World is Flat, was published. It was also around this time that opposition to Friedman was at its peak, with bloggers and columnists alike writing long diatribes against him.
Friedman would close out the decade with another book and three documentaries. These were mostly restatements of his columns (which in turn drew heavily from ideas he first introduced and developed between Lexus and The World is Flat). Friedman was still a part of the national conversation, but his perspective had lost its originality. His columns began to bleed together. This is the era when "Friedman Op-Ed Generators" went viral. Increasingly, Friedman was not argued against so much as joked about. By 2013 or so (just as he was turning 60) Thomas Friedman was done. Not technically so: between then and now he would rack up two more books, hundreds of columns, and heaven knows how many appearances at idea festival panels and business school stages. But intellectually Friedman was a spent force. His writing has been reduced to rehashing old rehashes, his columns the rewarmed leftovers of ideas grown old a decade ago. It is hard to find anything in his more recent books or columns that has mattered. He is able to sell enough books to live comfortably, but you will have difficulty finding anyone under 50 who admits they have read them. Friedman lingers still as a public figure, but not as a public intellectual. His thinking inspires no one. The well has run dry.
The easy answer is that the world of 2019 is not the world of 2002. What seemed compelling at the turn of the millennium is not compelling now. A man whose worldview has not budged in two decades has nothing to say to a world that has changed tremendously in that same time. But this answer is not really sufficient. It is hard to remember now, but there was once a time when the insights of Thomas Friedman read as fresh and strikingly original. That his ideas seem so banal and obvious today is in many ways a measure of how successful he was at popularizing them in the early 2000s. The real question to answer is this: why are so many public intellectuals capable of generating insight, originality, or brilliance at the beginning of their careers, yet utterly incapable of fresh thinking a decade later?
Let me offer two hypotheses. One is psychological, the other sociological.
Analytic brilliance is not constant over the course of a life. Both general intelligence and more nebulous measures of creativity show clear peaks across the lifespan. Here is how one textbook describes research on this question (I've taken out the parenthetical references to various source studies for ease of reading):
In most fields creative production increases steadily from the 20s to the late 30s and early 40s then gradually declines thereafter, although not to the same low levels that characterized early adulthood. Peak times of creative achievement also vary from field to field. The productivity of scholars in the humanities (for example, that of philosophers or historians) continues well into old age and peaks in the 60s, possibly because creative work in these fields often involves integrating knowledge that has crystallized over the years. By contrast, productivity in the arts (for example, music or drama) peaks in the 30s and 40s and declines steeply thereafter, because artistic creativity depends on a more fluid or innovative kind of thinking. Scientists seem to be intermediate, peaking in their 40s and declining only in their 70s. Even within the same general field, differences in peak times have been noted. For example, poets reach their peak before novelists do, and mathematicians peak before other scientists do.
Still, in many fields (including psychology) creative production rises to a peak in the late 30s and early 40s, and both the total number of works and the number of high-quality works decline thereafter. This same pattern can be detected across different cultures and historical periods....
What about mere mortals? Here researchers have fallen back on tests designed to measure creativity. In one study, scores on a test of divergent thinking abilities decreased at least modestly after about age 40 and decreased more steeply starting around 70. It seems that elderly adults do not differ much from young adults in the originality of their ideas; the main difference is that they generate fewer of them. Generally then, these studies agree with the studies of eminent achievers: creative behavior becomes less frequent in later life, but it remains possible throughout the adult years.

I suspect the underlying mechanism behind this pattern is brain cell loss. Neuroscientists estimate that the average adult loses around 150,000 brain cells a day; in the fifty years that follow the end of brain maturation (ca. ages 25-75), the average brain will lose somewhere between 5-10% of its neurons. Fluid intelligence begins declining in a person's 30s. This implies that most humans reach their peak analytic power before 40. Crystallized intelligence holds out quite a bit longer, usually not declining until a person's 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts, a lifetime of knowledge, into one sweeping narrative.
Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person's 30s: these are the years in which they have already gained the training necessary to make a real contribution to their chosen field, but have not yet lost so much fluid intelligence that creative work slows down. By a person's mid-40s this period is more or less over. The brain does not shut down creativity altogether once you hit 45, but originality slows. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.
Recognizing this helps us make sense of many interesting aspects of human social life. I think often about Vaisey and Lizardo's 2016 study, which demonstrated that most shifts in social attitudes occur not through changes in attitudes at the individual level, but through intergenerational churn. Old attitudes die because the generations that hold them literally die off. Such is the stuff of progress and disaster.
Such is also the problem of the public intellectual. A public intellectual's formative insights were developed to explain the world he or she encountered during a specific era. Eras pass away; times change. It is difficult for the brain to keep up with the changes.
Not impossible, just hard. And this brings my second, sociological explanation into play. There are things that a mind past its optimum can do to make the most of what analytic and creative power it still has. But once a great writer has reached the top of their world, they face few incentives to do any of these things.
Consider: Thomas Friedman began his career as a beat reporter in a war zone. He spent his time on Lebanese streets talking to real people in the thick of civil war. He was thrown into the deep and forced to swim. The experiences and insights he gained doing so led directly to many of the ideas that would make him famous a decade later.
In what deeps does Friedman now swim?
We all know the answer to this question. Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?
More importantly: What incentive does he have to live any other way?
I have noticed that historians who transition from the role of academic scribbler to famed public voice follow a sort of pattern. Their first published work might be a monograph, perhaps a PhD thesis turned book. It will be on some narrow topic no sane person cares about, the product of months spent in one archive in one location. U.S.-British trade relations in the 1890s, perhaps, or state-led cultural imperialism in Japanese Manchuria. They may repeat this feat, but at some point they transition to something broader: now they are writing a global history of trade regimes under the gold standard, or of empire building across the whole Greater East Asia Co-Prosperity Sphere. This work will be a brilliant, field-defining piece of scholarship, lauded (or resented) by other luminaries of their sub-discipline, read by scholars and interested laymen alike. That book will be published by an academic press; the next will be aimed at popular audiences. Our historian has now graduated fully to the role of public thinker: her next book will be on the dangers posed by trade wars writ large, or on the nature of modern imperialism. This title will be reviewed in all the famous magazines; people who have never read it will argue about it on twitter. And then everything starts to fall apart.
The trouble is that just as our historian reaches her full stature as a public name, her well of insight begins to run dry. A true fan of her works might trace elements of her name-making title back to the very first monograph she published as a baby academic. She was able to take all of the ideas and observations from her early years of concentrated study and spin them out over a decade of high-profile book writing. But what happens when the fruits of that study have been spent? What does she have to write about once she has already applied her unique form of insight to the problems of the day?
Nothing at all, really. Historians like this have nothing left to fall back on except the conventional opinions common to their class. So they go about repackaging those, echoing the same hollow shibboleths you could find in the work of any mediocrity.
You see this pattern recur again and again in the op-eds of our nation. A once-bold foreign correspondent whose days of derring-do have already been milked for more than they are worth, a Nobel laureate two decades removed from the economic papers that gave him acclaim, a nationally known historian who has not stepped into an archive since graduate school: the details change, but the general pattern is the same. In each case the intellectual in question is years removed from not just the insights that delivered fame, but the activities that delivered insight.
The tricky thing is that it is hard to go back to the rap and scrabble of real research when you have climbed so high above it. Penguin will pay you a hefty advance for your next two hundred pages of banal boilerplate; they will not pay you for two or three years of archival research on some narrow topic no one cares about. No matter that the process of writing on that narrow topic refills the well, imbuing you with the ideas needed to fill out another two decades of productive writing. The world is impatient. It does not have time to wait for you to reinvent yourself.
There are practical implications to all this. If you are an intellectual, the sort of person whose work consists of generating and implementing ideas, then understand that you are working against time. Figure out the most important intellectual problem you think you can help solve and make sure you spend your thirties doing that. Your fifties and sixties are for teaching, judging, managing, leading, and dispensing wisdom. Your teens and twenties are for gaining skills and locating the problems that matter to you. Your thirties are for solving them.
Public intellectuals who do not wish to transition in their forties from the role of thinker to that of mentor or manager are going to have a harder time of it. Optimizing for long-term success means turning away from victory at its most intoxicating. When you have reached the summit, the time has come to descend and start again on a different mountain. There are plenty of examples of this (Francis Fukuyama comes to mind as a contemporary one), but it is the harder path. For some, this will be a path worth taking. For others, wisdom is found in ceding the role of public intellectual to younger upstarts and moving on to more rewarding positions guiding the next generation of intellectual lights.
If you would like to read some of my other jottings on psychology, you may find the posts "Historians, Fear Not the Psychologist," "Public Opinion in Authoritarian States," and "Taking Cross Cultural Psychology Seriously" of interest. If writing on intellectual life is more up your alley, consider "Questing for Transcendence," "Book Notes--Strategy, a History," "I Choose Hannah Arendt," and "On the Angst of American Journalists" instead. To get updates on new posts published at the Scholar's Stage, you can join the Scholar's Stage mailing list, follow my twitter feed, or support my writing through Patreon. Your support makes this blog possible.
(I've forgotten who, and did not bother saving the tweet; if you know who it is, sound off in the comments.)
Carol Sigelman and Elizabeth Rider, Lifespan Human Development, 6th ed. (Belmont, CA: Wadsworth Learning, 2009).
John E. Dowling, Understanding the Brain: From Cells to Behavior to Cognition (New York: W. W. Norton & Company, 2018).
John Horn and Raymond Cattell, "Age differences in fluid and crystallized intelligence," Acta Psychologica (1967), vol 26, 107-129. For a very strong counter-statement that argues this fluid v. crystal distinction does not match the complexity of the data, see Joshua Hartshorne and Laura Germine, "When Does Cognitive Functioning Peak? The Asynchronous Rise and Fall of Different Cognitive Abilities Across the Life Span," Psychological Science (2015), vol 26, iss. 4, 433-443.
Stephen Vaisey and Omar Lizardo, "Cultural Fragmentation or Acquired Dispositions? A New Approach to Accounting for Patterns of Cultural Change," Socius: Sociological Research for a Dynamic World (2016), vol 2.