Although The Bell Curve represented Murray’s first published discussion of genes and intelligence—as tactical support for his preferred policies ending assistance to the poor—his co-author had been writing on the topic for more than two decades. Herrnstein’s initial interest in intelligence marked a radical departure from his previous work. As a Harvard graduate student in the early 1950s, he had studied with the famous behaviorist B.F. Skinner, specializing in operant conditioning with pigeons. Appointed to a junior faculty position at Harvard in 1958, he received promotion and tenure only three years later, after formulating the “matching law,” an important theoretical result predicting that, when an organism is offered two response alternatives, the ratio of responses between them will match the ratio of reinforcements associated with each alternative. His reputation well established as one of the world’s experts on pigeon behavior, Herrnstein went on to occupy an endowed chair at Harvard.
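
In symbols, the matching law can be stated quite compactly; in a standard formulation (the notation here is the conventional one, not necessarily Herrnstein’s original), the response rates B1 and B2 on two alternatives match the reinforcement rates R1 and R2 obtained from them:

\[
\frac{B_1}{B_2} = \frac{R_1}{R_2},
\qquad \text{equivalently} \qquad
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}.
\]

A pigeon receiving twice as many food deliveries on the left key as on the right will, over time, allocate roughly twice as many pecks to the left key.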

In 1965 Herrnstein chose The Atlantic Monthly (later known simply as The Atlantic), a mass periodical for educated laypersons, to share with the public the wondrous practical applications of his research expected to occur in the near future. Birds and other animals, he predicted, would eventually replace human labor in industry whenever a simple sensory task, as opposed to an exercise of judgment, was involved. He noted, for example, that pigeons could detect defective parts with greater accuracy than humans and could work longer with no sign of fatigue or the deterioration in standards displayed by their human counterparts. The full commercial exploitation of these capabilities, Herrnstein explained, was being “held back only by negative attitudes that oppose the dictates of good business”—perhaps an allusion to possible unemployment created by the use of animals. (In fact, when, in summer 2017, Western Michigan University rented a group of goats to clear some brush on campus, the local AFSCME chapter filed a grievance, contending that the goats were taking jobs from laid-off union members.) Herrnstein also hinted at important uses for pigeons in scanning reconnaissance photographs but was unable to provide details because of “security restrictions,” a project probably of more significance during the Vietnam War.1 These exciting applications failed to materialize, however, and the field of animal learning, in place of the glamor and enthusiasm it had once promised, became a small backwater of experimental psychology.

Half a dozen years later Herrnstein returned to The Atlantic, though this time at the opposite end of the theoretical spectrum, no longer focusing on principles of learning and conditioning but now concerned with genes and intelligence. In an article titled simply “I.Q.” Herrnstein sounded all the themes that would emerge in substantially greater detail two decades later in The Bell Curve; widely discussed and controversial, the article was soon expanded into a book. As he explained in the latter publication, after being “submerged for twenty years in the depths of environmentalistic [sic] behaviorism,” Herrnstein’s “confidence in the environmentalist doctrine” had finally broken down when his “study of the subject of intelligence testing (or more broadly mental testing) persuaded [him] that the facts about people point to the role of genes in human society.” However, an additional factor in this dramatic change of perspective was the firestorm that had occurred two years earlier in response to an article in the Harvard Educational Review by Berkeley Professor of Education Arthur Jensen (also a Pioneer grantee), titled “How Much Can We Boost IQ and Scholastic Achievement?”

Jensen’s article appeared not long after Lyndon Johnson’s War on Poverty had included educational resources on the grounds that an increase in cognitive abilities would better enable the children of the poor to improve their socioeconomic condition; its first sentence dismissed any such hopes, bluntly declaring that “Compensatory education has been tried and it apparently has failed.” Jensen went on to explain the reason for failure: these programs had been based on the inaccurate belief that the poor academic performance of minority children stemmed from “social, economic and educational deprivation and discrimination” and thus that they would benefit from the same kind of cultural enrichment and additional instruction in basic skills enjoyed by middle-class children. The real disadvantage for poor and minority children, he maintained, came not from their conditions but from their biology; they were just genetically less intelligent. Much of the remainder of this lengthy article—at 123 pages it consumed almost the entire issue of the journal—presented a discussion of the concept of heritability, a technical term from behavior genetics indicating what proportion of the variation in a trait is associated with variation in genotypes, followed by the suggestion that, rather than material assistance, Blacks would benefit most from eugenic measures to discourage their least intelligent elements from reproducing.2 For a publication in an academic journal, Jensen’s article produced an unprecedented degree of outrage. One social scientist accused him of having done “injury to children,” and other well-known psychologists called his work “academic manure,” “obscene,” and “abominable.”3 Student activists organized against Jensen, urging boycotts of his classes, interrupting his lectures, and demanding that he be fired.

Herrnstein felt strongly about what he regarded as Jensen’s mistreatment—and not just from the civil libertarian point of view that the Berkeley professor should not have endured such harassment merely for expressing his opinion; he also believed that Jensen had made a compelling case. Thus, along with reflecting his own newly developed interest in intelligence, the Atlantic article also served to provide intellectual support for Herrnstein’s beleaguered disciplinary colleague, whose controversial Harvard Educational Review article he described as “cautious and detailed, far from extreme in position or tone.” Although Herrnstein had never conducted any research on intelligence, nor, a fortiori, on its genetic basis—indeed, he had never published anything on the topic in a professional journal—he was an effective popularizer. In readable prose for a mass audience, his Atlantic article described the development of the concept of intelligence, the measurement of which Herrnstein called “psychology’s most telling accomplishment to date,” and its importance in determining life outcomes. Turning to “the inherited factor in I.Q.,” Herrnstein explained the meaning of heritability and described the most straightforward method for its estimation—the similarity between the IQs of identical twins raised in separate homes, pairs of individuals sharing identical genotypes but different environments—concluding that Jensen and “most of the other experts in the field” were right: “the genetic factor is worth about 80 percent and … only 20 percent is left to everything else,” a result he considered “psychology’s best proved socially significant empirical finding.” On the most inflammatory issue of a genetic component to racial differences, Herrnstein declared that “the case is simply not settled,” but he certainly thought that an answer was possible and found it “irritating” for inquiry to be “shut off because someone thinks society is best left in ignorance.”4
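
In the standard notation of behavior genetics (not used in the Atlantic article itself), heritability is the share of total phenotypic variance attributable to genetic variance, and the logic of the twin method is that identical twins reared apart share all their genes but none of their rearing environment, so the correlation between their IQ scores serves as a direct estimate of that share:

\[
h^2 = \frac{V_G}{V_P} = \frac{V_G}{V_G + V_E},
\qquad
\widehat{h^2} \approx r_{\mathrm{MZA}},
\]

where V_G and V_E are the genetic and environmental components of the phenotypic variance V_P, and r_MZA is the IQ correlation among monozygotic twins reared apart. Herrnstein’s “80 percent” thus amounts to reading a reported twin correlation of roughly 0.8 as h² ≈ 0.8, an inference that assumes, among other things, that the separated twins’ environments are uncorrelated.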

Little of this discussion was particularly controversial until, in the last two pages, Herrnstein described the effect of hereditary factors on “social standing,” concluding that the society was heading toward a genetic caste system with a biologically superior upper class—essentially a genetic aristocracy. This effect was so clear, Herrnstein wrote, that he could express it in the form of a syllogism:

1. If differences in mental abilities are inherited, and
2. If success requires those abilities, and
3. If earning and prestige depend on success,
4. Then social standing (which reflects earning and prestige) will be based to some extent on inherited differences among people.

However, after the syllogism’s modest conclusion Herrnstein went on to describe a future in which social standing was not just “to some extent” related to heritable traits, envisioning instead a biologically stratified society with little possibility for mobility in either direction. At the nadir of this genetic hierarchy Herrnstein foresaw “precipitated out of the mass of humanity a low capacity … residue that may be unable to master the common occupations, cannot compete for success and achievement, and are most likely to be born to parents who have similarly failed”—a metaphor that Francis Galton, founder of the eugenics movement, also had in mind when he referred to the lower classes as “the residuum.” As technological advancement created new jobs demanding, in Herrnstein’s analysis, higher IQs, these hereditary defectives would be the most adversely affected, so that in his future society “the tendency to be unemployed may run in the genes of a family about as certainly as bad teeth do now.” At the other end of the socio-genetic ladder would lie a new aristocracy, a class with greater wealth, power, and privilege, but unlike aristocracies of the past, which, Herrnstein emphasized, “were probably not much superior biologically to the downtrodden,” this new privileged class would be entitled to its prerogatives because “when people can freely take their natural level in society, the upper classes will, virtually by definition, have greater capacity than the lower.”5

According to Herrnstein’s analysis, this scenario was inevitable. Although he agreed with Jensen’s estimate that the heritability of IQ was around 0.80—i.e., that 80 percent of the differences in IQ between people were associated with differences in their genes—Herrnstein emphasized that the exact value of this statistic was not necessary to his argument. The more that improvements occurred in the society—resulting in more equitable legal, social, and educational conditions—the more heritability would increase; when environmental differences were minimized, only genes remained to explain the differences in outcomes. Thus, he insisted, biological stratification was the direct and inevitable consequence of maximal equality of opportunity, because the removal of arbitrary barriers and unfair advantages would only increase the significance of genetic factors in both IQ and its correlate, socioeconomic success; the “successful realization of contemporary political goals,” he predicted, would result in “the growth of a virtually hereditary meritocracy.” This was Herrnstein’s most important point, what he most wanted the public to recognize: that “their political goals are fighting the nature of the beast.” The egalitarian objective of a more equitable distribution of society’s resources was not only exposed as an impossible fantasy in this view, but those who pursued it by calling for equal opportunity would only create, to their own dismay, an even greater separation between classes: “Actual social mobility is blocked by innate human differences,” he explained, “after the social and legal impediments are removed.”6
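
The mechanism behind this claim is a direct consequence of the variance ratio given earlier: hold genetic variance fixed, shrink environmental variance, and heritability must rise. As an illustrative calculation with hypothetical round numbers:

\[
h^2 = \frac{V_G}{V_G + V_E}:
\qquad
\frac{80}{80+20} = 0.80,
\qquad
\frac{80}{80+10} \approx 0.89,
\qquad
\frac{80}{80+0} = 1.00.
\]

In the limit of perfectly equalized environments, whatever differences remain are, by construction, associated entirely with genes, which is why Herrnstein could insist that the exact value of the heritability estimate was not necessary to his argument.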

At its core Herrnstein’s argument sought to remove questions of resource distribution generally considered to be moral or political decisions and present them instead as biologically ineluctable. This view placed humane aspirations for greater socioeconomic equality on a collision course with science; social systems intended to reduce inequality, whatever their name—economic democracy, democratic socialism, etc.—were thus proved to be hopeless. The fault was not in our stars but in our genes, and as long as equality of opportunity was guaranteed, no amount of tinkering with social organization could avoid the inevitable: biostratification. Inequality in the social order reflected inequality in the natural order.

Thus, in this view the society was heading inevitably toward what Herrnstein called a “meritocracy,” a word that he acknowledged taking from the British sociologist Michael Young’s novel, The Rise of the Meritocracy, describing how, well into the twenty-first century, the principles of genetics had combined with the measurement of intelligence to create an intergenerational ruling elite, whose membership was determined by test score; Young coined the hybrid term by joining the Greek suffix for “rule” or “authority” to the Latin “meritus,” the past participle of the verb meaning “to earn” or “deserve.” Herrnstein praised the book as a “prescient” account of what to expect, already “catching the attention of alert social scientists,” apparently oblivious to the fact that the novel was intended as a withering satire of a dystopian future, in which an insufferably smug and arrogant ruling class, secure in its scientifically demonstrated superiority and lacking any sense of social responsibility since its position on top was due entirely to its own genes, presides over a lower class, the members of which—the “technicians”—are forced to recognize the truth of their inferiority and the consequent fact that their position on the bottom is both inevitable and appropriate. Any sense of political community in such a society has been completely lost. Herrnstein converted “meritocracy” from an intended pejorative into a positive, even titling his own book based on the Atlantic article I.Q. in the Meritocracy. (In 2001 Young complained that his neologism “has gone into general circulation, especially in the United States,” pointing out that “the book was a satire meant to be a warning.”)7

Although no one prior to Herrnstein had provided as detailed a genetic argument, the notion that some people possess inborn qualities justifying their superior position in society has roots as old as antiquity. Aristotle believed in government by hoi aristoi—“the best,” those with exceptional natural ability—and maintained that, because of differences in the power of reason, “just as some are by nature free, so others are by nature slaves, and for these latter the condition of slavery is both just and beneficial.” And in The Republic Plato described Socrates’s “myth of the metals,” in which a citizen’s value to and position in the city are determined by which of three metals characterizes his soul: gold for those best fit to rule, silver for those who assist the rulers, and iron and bronze for those—farmers and craftsmen—whose place is to obey.8 But not until the twentieth century did some philosophers and social scientists suggest the premise underlying Herrnstein’s analysis: that only egalitarian societies would ensure that these innate characteristics suiting people to specific roles did in fact exercise such a determinative effect. In 1903 the Scottish philosopher David G. Ritchie anticipated Herrnstein’s argument, declaring that “the result of … equality of opportunity will clearly be the very reverse of equality of social condition,” since “the abolition of legal restrictions on free competition allows the natural inequalities of human beings … to assert themselves”; even under “a socialistic regime, which fell short of a complete communism,” Ritchie expected these “inequalities of condition” to emerge. In 1923 the prominent sociologist F.H. Hankins, later to become president of the American Sociological Association, maintained that the whole purpose of equal opportunity, and especially education for all, was to serve as the “principal means whereby the natural aristocracy of the country can be discovered and trained for the superior responsibilities it is to fill.” Acknowledging that the goal was a Platonic society, in which each person “is fitted into the social order at a level corresponding to his innate powers,” Hankins foresaw “an enormous difference between those at the top and those at the bottom in social value, in power and in financial rewards”; the elimination of “artificial handicaps,” he wrote, in a conclusion offering Herrnstein’s rationale as support for Aristotle’s pronouncement, would reveal “those born to rule” and those “born to be ruled.”9

However, whatever the role of equal opportunity, for many social scientists it was the creation of the “mental test” that converted this notion of organized genetic determinism from sociological speculation to practical possibility. Ecstatic at the thought that a seemingly objective measure taking a mere 40 minutes to administer could furnish the basis for a Platonic paradise, early intelligence testers were eager to create a society in which each person could be assigned a genetically appropriate place. Lewis Madison Terman, for example—a member of the National Academy of Sciences, probably the best-known educational psychologist in the first half of the twentieth century, and described by one historian as the scientist most “responsible for making the IQ a household word”—called for testing to begin in the earliest grades so that those children destined to be “the world’s hewers of wood and drawers of water” could be removed from the usual curriculum and “segregated in special classes … given instruction which is concrete and practical” in order to make them “efficient workers.” For all other children Terman urged that “vocational guidance” should begin no later than fifth or sixth grade, directing each student toward an intellectually “compatible” occupation by comparing the IQ score with the minimum necessary for success in that field. Such a procedure, he explained, would not only avert “selection of a vocation … requir[ing] a higher grade of ability than the individual possesses” but also ensure that bright students did not “waste” their abilities in an occupation requiring “mediocre intelligence.” Any IQ score above 85 for a barber, for example, was “so much dead waste.”10

Sir Cyril Burt, the internationally eminent British psychologist and first member of his profession to be knighted (whose research on the heritability of intelligence was posthumously exposed as worthless and probably fraudulent), likewise believed that it was “the duty of the state through its school service” to provide a child “the education most appropriate to his powers, and … to place him in the particular type of education for which nature has marked him out.” Thus, Burt proposed that each child be classified according to test score into one of eight IQ ranges, each range then corresponding to an educational category leading to a specific set of vocational possibilities. In addition, Burt specified the portion of the population that was expected to fall into each classification: only the highest range, for example, encompassing a mere tenth of a percent of the population, would enjoy a university education leading to a career in the professions or to a “higher administrative” position, while roughly 11 percent would fall into the next two ranges channeling them into “higher grade schools” and eventually technical positions; the overwhelming bulk of the population in the lower ranges would receive the appropriate education for their destiny as workers, whether “skilled,” “semi-skilled,” “unskilled,” or “casual.” Education would furnish what Burt called “the key … to social efficiency … a place for every man and every man in his place.”11

While Terman, Burt, and other prominent psychologists emphasized the efficiency of using IQ scores to determine one’s course in life, Charles Spearman—the British psychologist who first posited the notion of a general intelligence factor (“g”) and pioneered the statistical technique of factor analysis—offered an additional and even more grandiose justification: harmony. Not only would the measurement of intelligence ensure that “each can be given an appropriate education, and therefore a fitting place in the state–just that which he or she demonstrably deserves,” Spearman predicted, but he was certain that, as a result of testing, “Class hatred, nourished upon preferences that are believed to be unmerited, would seem at last within reach of eradication; perfect justice is about to combine with maximum efficiency.”12 This last sentence is remarkable for its utter cluelessness about human nature, implying as it did that, after learning of their test scores, the poor would accept their unenviable station if not cheerfully, then at least without resentment toward their betters, knowing that it was merely the rational, social reflection of their genetic inadequacy; once the members of the lower class appreciated that their inferior position was not unmerited, social harmony would prevail.

Michael Young’s novel provided a much more realistic account of the likely reaction from actual human beings confronted with evidence of their mediocrity. In an unjust society, one lacking equal opportunity for advancement, “the workers,” Young observed, “could altogether dissociate their own judgments of themselves from the judgment of society.” Those on the lower rungs of the social ladder could console themselves with the thought that, but for circumstances, their life would have turned out much different: “Had I a proper chance I would have shown the world,” was their perspective. Thus, as Young put it, “Educational injustice enabled people to preserve their illusions, inequality of opportunity fostered the myth of human equality.” But in a meritocracy, he explained, it becomes harder for the poor to bear their allotted position: “all persons, however humble, know that they have had every chance,” and as a consequence, “if they have been labelled ‘dunce’ repeatedly they cannot any longer pretend. … Are they not bound to recognize that they have an inferior status–not as in the past because they were denied opportunity; but because they are inferior? For the first time in human history the inferior man has no ready buttress for his self-regard,” and the result, Young observed, was detrimental both to the individual and to the society: those “who have lost their self-respect are liable to lose their inner-vitality … and may only too easily cease to be either good citizens or good technicians.”13

At the time of the Atlantic article, Herrnstein’s interest in intelligence and meritocracy was informed primarily by the efficiency principle. Though not as rigidly deterministic as his predecessors—more inclined to allow the intellectual demands of different vocations to create a natural sorting mechanism—Herrnstein was especially concerned with the tails at each end of the intelligence spectrum. Like Jensen, he too believed that the Great Society programs, designed to assist poor and minority children, had been a “failure,” calling it “imperative that the government stop throwing its money down a bottomless hole.” But unlike Jensen, Herrnstein emphasized educational resources as a zero-sum game, in which well-meaning efforts to provide low achieving students with conditions similar to those enjoyed by high achievers amounted to “withholding educational advantages from gifted people and lavishing them on the less well endowed.” Thus, he argued, the vain attempt to reduce educational inequities threatened to create not compensatory education but a system of “compensatory deprivation,” which might “reduce individual differences, but do so at the expense of those who are fortunate enough to have been well endowed to begin with”—an approach that Herrnstein compared to “depriving healthier people of some part of their medical care and diverting it to the unhealthy.” This sort of “selective deprivation,” he concluded, was not only “unfair” to the genetically advantaged but “a waste our society can ill afford.”14

The Bell Curve continued this focus on the importance of appropriate treatment for the extreme IQ scores, high and low. Both Terman and Burt had despaired of educating the duller students for any purpose other than to become unskilled labor. Herrnstein and Murray were even blunter: “People in the bottom quartile of intelligence,” they declared, “are becoming not just increasingly expendable in economic terms; they will sometime in the not-too-distant future become a net drag … unable to perform that function so basic to human dignity: putting more into the world than they take out.” And as a result, they concluded, “For many people, there is nothing they can learn that will repay the cost of the teaching.”15 From a cost-effectiveness perspective, there was no sense even attempting to educate a substantial portion of the population.

While resources were thus supposedly being squandered on a misguided attempt to educate the lower tail of the IQ distribution, the top five percent—the group dubbed the “cognitive elite” by The Bell Curve—was being deprived of its appropriate share, according to Herrnstein and Murray. Again as in Herrnstein’s earlier work, The Bell Curve urged a reallocation, shifting federal aid from programs for the disadvantaged to programs for the gifted—an unsurprising recommendation given the book’s opinion of the former initiative as futile and the latter as essential. The cognitive elite—or, as Herrnstein and Murray candidly termed them, “the people who count in business, law, politics and our universities”—by virtue of their genetic advantage would inevitably grow up “segregated from the rest of society,” attending “the elite colleges,” enjoying “successful careers,” and “eventually lead[ing] the institutions of this country, no matter what.” Thus destined for positions of power and influence, they needed “education of a particular kind,” one not only with higher standards but emphasizing “how to think about their problems in complex, rigorous modes” and “bring to their thinking depth of judgment and, in the language of Aristotle, virtue.” Yet their actual treatment in school, Herrnstein and Murray complained, represented the “one clear and enduring failure of contemporary American education.” It was essential, they insisted, that this “natural aristocracy” be prepared for its genetically appropriate role to govern; such an emphasis on critical thinking, crucial to the formation of a polity capable of democratic self-governance, would nevertheless be unbefitting for the rest of the population, which presumably needed education of a more practical kind. (Similarly, one provision of “The Charlottesville Statement,” the white supremacist manifesto issued by Richard Spencer, one of the leaders of the neo-Nazi “Unite the Right” rally in Charlottesville, Virginia, in 2017, maintained that higher education was “only appropriate for a cognitive elite dedicated to truth” and “improper, even detrimental” for the great majority, for whom “practical education—trade schools and apprenticeships—should be … the norm.”) Transferring resources from the disadvantaged to the cognitive elite was thus necessary for “the welfare of the nation, including the welfare of the disadvantaged.”16

But efficiency was not The Bell Curve’s only concern. Like Spearman, Herrnstein and Murray also believed that acknowledgment of the fact of genetic inequality and its ineluctable social consequences was essential if people were to “live together harmoniously despite fundamental individual differences.” Not as naïve as Spearman, however, they hoped that the less intelligent would find it in their best interest to accept their biologically determined lot in life but feared that, as a result of their failure to do so, the society was heading in an unfortunate direction. A widespread “egalitarian” political ideal had fostered illusory hopes for improvement in the abilities of the cognitive underclass and correspondingly unrealistic expectations about their place in society. In response, Herrnstein and Murray predicted, “Over the next few decades, it will become broadly accepted by the cognitive elite that the people we now refer to as the underclass are in that condition through no fault of their own but because of inherent shortcomings about which little can be done.” To protect themselves from this low-IQ group, the cognitive elite—whose interests were increasingly converging with those of the affluent, producing “an unprecedented coalition of the smart and the rich”—would gravitate toward “a new kind of conservatism,” one “along Latin American lines, where to be conservative has often meant doing whatever is necessary to preserve the mansions on the hills from the menace of the slums below.” Thus, to keep the underclass “out from underfoot” the cognitive elite would implement a “custodial state … a high tech and more lavish version of the Indian reservation for some substantial minority of the nation’s population,” making it “difficult to imagine the United States preserving its heritage of individualism, equal rights before the law, free people running their own lives.”17

The only hope for avoiding this dismal scenario, according to Herrnstein and Murray, was to adopt a social policy informed by a “wiser tradition,” one derived from the great political thinkers, who, for thousands of years, had appreciated that people differed from each other in fundamental and important ways, fitting them to play specific roles. This was true from ancient philosophers, in both the East and West, who understood that “society was to be ruled by the virtuous and wise few,” to the nation’s founding fathers—Jefferson, Madison, Adams—who believed not in a democracy allowing an equal voice to all but in a republic ruled by the “natural aristoi.” The point to describing the views of these men, Herrnstein and Murray emphasized, was not to appeal “to their historical eminence, but to their wisdom. We think they were right.” Indeed, they noted, the “main purpose of education,” according to Jefferson, was “to prepare the natural aristocracy to govern”; the “people who count” had to be groomed for their appropriate role in the society.18

Thus, The Bell Curve concluded with a cautionary tale about the contemporary risks of failing to adopt a Platonic social model out of a reluctance to accept the importance of genetic differences in intelligence. Spearman had assumed that, faced with the objective evidence of their inferiority, the less well endowed would accept their station, knowing that it was warranted; Herrnstein and Murray feared the dire consequences of their reluctance to do so. Instead of vain and misguided attempts to overcome genetic disadvantage, the real need was to find “valued places” for everyone, especially those at the lower end of the intelligence spectrum. And the major obstacle to doing so, according to Herrnstein and Murray, was society’s rules “that are congenial to people with high IQs and that make life more difficult for everyone else.” What was needed, therefore, was a simplification, creating rules that were clear and comprehensible to “just about everybody who is not part of the cognitive and economic elites”: less government regulation; swift administration of criminal justice in which trial and punishment follow arrest “within a matter of days or weeks”; the elimination of Head Start, compensatory education, affirmative action, and government assistance for low income women who bore children; and the limitation of parental rights only to married couples, so that an unmarried mother had no legal basis for demanding child support from the father.19 More than 500 pages of statistical analysis of test scores thus culminated in a call for a set of policies that Murray had pursued long before he and Herrnstein had crunched their first set of IQ data, though now presented not as a political choice but as a scientific necessity.

Economic Inequality: The Gradient of Gain

Writing in the early 1970s, Herrnstein made one prediction that turned out to be remarkably prescient: economic inequality would increase dramatically in the coming years. Two decades later, in the National Review, Murray too predicted that “the price for first-rate cognitive skills will skyrocket,” producing an “American caste system,” and The Bell Curve foresaw a similar trend, though by that time it was hardly surprising since inequality had become well entrenched as a feature of the economy. In contrast Herrnstein’s earlier analysis occurred during a period when gains were still spread fairly equally across the economic spectrum. From the beginning of what economists have labeled the “Great Compression” in the 1940s to the end of the 1970s, a bar graph plotting change in income against economic quintile looks almost like a picket fence, with each quintile enjoying approximately the same percentage increase; during those decades, the share of the nation’s wealth held by the richest 1 percent fell from 48 percent of the total to just above 20 percent, as a combination of strong unions, progressive taxation, and social norms produced an unprecedented downward distribution of income, leading to probably the most generalized material prosperity in history. Indeed, writing around the same time as Herrnstein, the eminent sociologist Daniel Bell noted “the steady decrease in income disparity among persons,” which he attributed to technological advance, implying that the trend could be expected to continue.20

Yet, as Herrnstein predicted, exactly the opposite occurred. From the end of the 1970s until the present, the economy has experienced the Great Divergence, again one of the largest redistributions of wealth ever, but this time upward. Of course, there was substantial overall growth during this period, but instead of the rising economic tide lifting all boats, only the luxury yachts rode the waves, while small craft found themselves stuck in shallow water. The same bar graph displaying income gain by quintile now looks like an irregular staircase beginning with a very small step up—the increase in income for the first quintile—and an increasingly larger step for each subsequent quintile; the larger the income, the larger the percentage increase. But if the very highest earners are broken out separately from the top quintile, they enjoyed such a substantial increase that the trendline over the income groups changes from linear with a large positive slope to dramatically exponential; the higher up the distribution, the steeper the rise in income. Adjusted for inflation, at the beginning of 2019 the lower half of the income distribution had seen no increase in income since 1980, and the average hourly wage for the working class had actually declined at the same time that, according to the Berkeley economists Emmanuel Saez and Gabriel Zucman, “for the highest 0.1 percent of earners, incomes have grown more than 300 percent; for the top 0.01 percent incomes have grown by as much as 450 percent; and for the tippy-top 0.001 percent—the 2300 richest Americans—incomes have grown by more than 600 percent.” As a result the proportional difference between the top 1 percent and the bottom 99 percent is replicated by the difference between the top 0.01 percent—the 1 percent of the 1 percent—and the top 0.99 percent and yet again by the difference between the “tippy-top” 0.001 percent and the top 0.009 percent. In a particularly striking indication of the disparity in growth, a journalist specializing in finance reported that from 1990 to 2000, for every additional dollar earned by the bottom 90 percent of taxpayers, those in the top 0.01 percent earned an additional 18,000 dollars; the same figure for the period between 1950 and 1970 had been 162 dollars. Concerned that such an “extreme concentration of wealth means an extreme concentration of economic and political power,” Saez and Zucman concluded that “just as we have a climate crisis, we have an inequality crisis.”21

Other standard measures of economic inequality led to the same conclusion. The distribution of wealth, for example—the net value of all a person’s assets after factoring in debt—has become even more skewed than annual income. A 2014 study by the same two economists found that the wealthiest one tenth of a percent of the population accounted for as much of the country’s wealth as the bottom 90 percent combined, and according to a report from the Institute for Policy Studies, at the end of 2017 the three wealthiest individuals in the United States—Bill Gates, Warren Buffett, and Jeff Bezos—owned more than the 160 million people in the bottom 50 percent of the American population combined; the wealth of just these three people also exceeded the total wealth, adjusted for inflation, of the entire Forbes 400 in 1982, the list’s inaugural year. In 2019 the Gini coefficient for the United States—the most widely used index of a society’s economic inequality—reached its highest level since tracking began and is now larger than that of any of the European nations in the Organization for Economic Cooperation and Development; compared to these nations the United States also has the largest percentage of working-age people who live in poverty.22 Once equal to that of other affluent democracies like Canada and Norway, income inequality in the United States now exceeds that of countries like India, Indonesia, Haiti, and Vietnam. By every conceivable measure, inequality in the United States is approaching levels not seen since before the Roosevelt administration—Teddy Roosevelt.
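
For readers unfamiliar with the index: the Gini coefficient runs from 0 (everyone receives the same income) to 1 (one person receives everything), and it can be computed directly from a list of incomes. A minimal sketch in Python, using the standard rank-weighted formula (the function and the sample figures below are illustrative only, not data from the studies cited here):

    def gini(incomes):
        # Gini coefficient of a list of non-negative incomes:
        # 0 = perfect equality; values approaching 1 = extreme concentration.
        xs = sorted(incomes)
        n, total = len(xs), sum(xs)
        if n == 0 or total == 0:
            return 0.0
        # Standard formula: G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n,
        # with incomes sorted ascending and ranks i = 1..n.
        weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
        return 2 * weighted / (n * total) - (n + 1) / n

    # Hypothetical quintile income shares: near-equal versus highly skewed.
    print(gini([18, 19, 20, 21, 22]))  # ~0.04: an egalitarian distribution
    print(gini([2, 5, 8, 15, 70]))     # ~0.58: most income at the top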

While the country has experienced extreme inequality before—during the Gilded Age in the late nineteenth century, for example—the present context is qualitatively different from the past in two related ways. As the economist Thomas Piketty points out in his landmark study of wealth concentration and distribution, Capital in the Twenty-First Century, earlier instances were characterized “by very high incomes from capital, especially inherited capital.” But in the contemporary United States, Piketty observes, “the peak of the income hierarchy is dominated by very high incomes from labor rather than by inherited wealth”; in contrast to the traditional view that inequality is rooted in the conflict between capital and labor, the major economic divide in the society now stems from differences within the ranks of working people. “It is hardly surprising,” Piketty notes drily, “that the winners in such a society would wish to describe … [it] as ‘hypermeritocratic,’ and sometimes they succeed in convincing some of the losers.”23 In his insightful book, The Meritocracy Trap, the Yale law professor Daniel Markovits dubs these people the “superordinate working class”—graduates of elite universities with “immense skill, won through rigorous training,” whose jobs require “intense, competitive, and enormously productive industry,” and who enjoy annual wages in 7, 8, 9, and occasionally even 10 figures. No longer an idle aristocracy leading a life of extravagant leisure, many of the superrich now work, and work hard, for their colossal incomes (engendering a sense that they are not in any way parasites but have truly “earned” their money, which then fuels the resentment at any increase in tax rate); Markovits calls them “today’s Stakhanovites.”24

In addition, and as a result, the main source of inequality has been relocated along the economic spectrum. In the past and especially during the post-war democratization of the economy, the middle class and the rich tended to converge, leaving the difference between the poor and everyone else as the society’s major economic fault line. But the substantial decline in the kind of desperate poverty that once engaged humanitarian sensibilities together with enormous increases for the rich has changed the nature of inequality, making the difference between the superrich and everyone else the new inflection point. As Markovits points out, a measure of inequality like the Gini coefficient has not changed for the bottom 90 percent of the income distribution over the last half century; indeed, for the bottom 70 percent it has actually fallen. It is the dramatic increase in inequality for the top 5 percent—what Markovits calls “the income gap between the merely rich and the exceptionally rich”—that alone accounts for the rise in inequality for the population overall.25

This extreme elongation of the upper end of the income spectrum is not just a matter of some people being richer than others. Hemingway’s famous (though mythical) retort to Fitzgerald may have been accurate at the time, but today a tiny slice of the American population enjoys an entirely different life from that of their fellow citizens, far beyond the mere fact that “they have more money.” Of course, there have always been rich and poor neighborhoods; in the early 1980s the cultural and literary historian Paul Fussell referred to the very rich as “the class in hiding” for their “estates where you can’t see the house from the road.” But there were also common experiences that the richest people shared with the middle class. Now, however, as Nelson D. Schwartz documents in his aptly named book, The Velvet Rope Economy, those fortunate enough to have the financial resources “rarely come into contact with people from other walks of life.” Whether it’s private skyboxes at athletic events, helicopters to avoid road traffic, charter flights or private suites at the airport within steps of the plane to avoid the departure lounge altogether, concierge medical services that ensure immediate access to the best doctors and most recent advances (including the coronavirus vaccine as soon as it became available), private firefighting services that will arrive at a property and take preventive action when a fire that might eventually pose a threat is still a safe distance away, or even special backstage passes at theme parks like Universal, Disneyland, or Disney World, the exceptionally wealthy can purchase an “E-Z Pass” in life, enabling them “to zip past the everyday obstacles the rest of us have to contend with.”26

Though Herrnstein was writing decades before the degree of inequality became so dramatic, neither he, in his 1971 article, nor he and Murray in The Bell Curve had any doubt about the correct interpretation of this trend. Indeed, Herrnstein not only predicted such extreme economic differences as inevitable in the developing knowledge economy, he viewed them as a rational allocation of the society’s resources, assuming that they constituted an appropriate reflection of genetic differences in intelligence. For Herrnstein, the linkage between IQ score and income served an important societal purpose, channeling the most intelligent people into occupations that were socially more useful because intellectually more demanding. “By directing its approval, admiration, and money towards certain occupations,” he wrote, “society promotes their desirability,” and “thereby expresses its recognition … of the importance and scarcity of intellectual ability.” And by ensuring that these positions enjoyed greater money, power, and social status, society provided a “gradient of gain” corresponding to “inborn ability,” as persons with superior intelligence sought the greater rewards associated with work of greater social value. Such a sensible mechanism, in Herrnstein’s opinion, allowed society to “husband … its intellectual resources,” preventing their waste on efforts of little importance.27 The cognitive elite deserved much more money than everyone else as long as they pursued those professions that most required the application of their intelligence, and providing the former was the best way to ensure the latter.

Although he viewed heredity as the major determinant of intelligence, for Herrnstein whether superior intellects were in fact put to the social use entitling them to greater reward was dependent primarily on Skinnerian principles of reinforcement. Indeed, he argued, if society were so foolish as to invert the gradient of gain so that bakers and lumberjacks—occupations that Herrnstein had singled out as performed by persons with an average IQ below that of the population in general and, as a consequence, appropriately less prestigious and less well remunerated—“got the top salaries and the top social approval,” then “soon thereafter, the scale of I.Q.’s would also invert” so that these newly desirable jobs now attracted people with the highest scores. As an inevitable result, according to Herrnstein, “the top I.Q.’s would once again capture the top of the social ladder.”28 Attaching different rewards to different occupations allowed society to direct the flow of talented labor toward the most socially beneficial positions.

As evidence for the linkage between intelligence and social contribution, Herrnstein presented the average IQ for a number of occupations, specifically citing accountants and public relations specialists as some of the highest scoring groups. Thus, at the time, the “brightest” people were attracted to professions whose members arguably spent much of their time either assisting the affluent to avoid their share of the tax burden or attempting to confuse image with substance and generally deceive the public; Noam Chomsky suggested that service to wealth and power provided the true reason for their greater compensation. At the other end of the intelligence spectrum were the predominantly blue-collar workers—people whose jobs required them to get their hands dirty—and whose just deserts, according to Herrnstein, were “poverty, or at least, relative poverty as compared to our society’s successful people.” Upholsterers and stonemasons—occupations near the low end of the IQ spectrum on Herrnstein’s list—might be skilled artisans, able to restore antiques or create complex shapes out of rough pieces of rock, but as “technological advance changes the marketplace for I.Q.,” he wrote, the new positions would be beyond the “native capacity” of such workers.29

Thus, Herrnstein’s analysis of increasing economic inequality was informed by two assumptions, one leading to an explanation and the other to a justification. First, that the most intelligent people would unquestionably gravitate toward the highest paying professions, whatever they might be. And second, that such a steep gradient of gain would channel the most capable people into those roles most beneficial to the society. Dramatic levels of inequality were, in his analysis, both inevitable and desirable.

There is ample reason to believe that the prospect of monetary reward has indeed exercised an effect on career direction, especially for well-educated persons from highly competitive schools with multiple career choices at their disposal—those high SAT scorers epitomizing the group that The Bell Curve referred to as the “cognitive elite.” As the US economy has experienced a generation-long transition from its traditional emphasis on manufacturing, agriculture, and wholesale and retail trade toward financialization, graduates from the Ivy League and other elite institutions who had once pursued science, medicine, journalism, public service, and education have turned increasingly to the more lucrative opportunities associated with those areas subsumed under what one observer calls the “casino economy”: banking, securities, investments, and trading. In 2006 a New York Times article, appropriately titled “Lure of Great Wealth Affects Career Choices,” reported on the trend of professionals from other fields to “migrate” to Wall Street: newly minted Ph.D.’s, who once would have pursued a career in teaching and research; law school graduates no longer interested in public interest law or government jobs; and graduates from medical school, some of whom “go directly to Wall Street or into healthcare management without ever practicing medicine.” In a particularly dramatic example, the article cited a doctor who, two decades earlier, had graduated from Harvard College and then Harvard Medical School, intending to become a “physician-scientist” with the goal of finding a cure for cancer and “even dreaming of a Nobel Prize.” As a hematology-oncology specialist earning $150,000 in 1996 ($245,000 adjusted for inflation), he turned to a business consulting firm, eventually becoming a managing director of healthcare investment banking for Merrill Lynch with an annual income in seven figures. In the 2011 film Margin Call, a taut drama about the actions of a Wall Street investment bank during a 24-hour period after one of its junior analysts anticipates the imminent financial collapse, a middle-of-the-night meeting to decide how to react provides a realistic reminder of the finance industry’s attraction for the intellectual elite, no matter their original field of study. Asked by his superiors to describe his background, the analyst responds that “I hold a doctorate in engineering, specialist in propulsion, from MIT” and then, prompted to elaborate, explains that his “thesis was a study in the way that friction ratios affect steering outcomes in aeronautical use under reduced gravity loads.” “So you are a rocket scientist?” asks the impressed manager running the meeting. “I was … yes,” is the reply; “it’s all just numbers really, … and … the money is considerably more attractive here.” According to Markovits, “entire groups at major banks” have become “dominated by physicists, applied mathematicians, and engineers, many with Ph.Ds.”30

Indeed, “considerably more attractive” is a gross understatement; in the last couple of decades the financial industry has become the source of previously unimaginable fortunes unrelated to the progress of the economy as a whole. Two years before the Great Recession, James Simons, a hedge fund manager—instructively, the chair of the math department at Stony Brook with an undergraduate degree in mathematics from MIT and a Ph.D. from Berkeley before becoming a “financial engineer”—took home 1.7 billion dollars; it would take more than 26 years for someone making the median personal income in the United States at the time to earn what Simons received every hour. A year later, after multi-billion dollar bailouts of the financial companies and the loss of millions of jobs, Simons’s income grew to 2.5 billion dollars; now his hourly pay of 1.2 million dollars was equal to what someone with the median personal income would take more than 40 years to earn. Even this staggering sum paled in comparison with the amount made that year by John Paulson, who banked 3.7 billion dollars by short-selling the subprime market, while 2.2 million households were faced with foreclosure. By 2018 the top 25 hedge fund managers were making an average of 850 million dollars. Though not as handsomely compensated as hedge fund managers, other finance professionals—bankers and traders—have received, in addition to their substantial salaries, regular annual bonuses of millions of dollars both before and after the recession in which their own companies lost billions. In 2014 a young derivatives trader, no longer comfortable with what he called a “wealth addiction,” estimated that “90% of Wall Street feels like they’re underpaid,” describing how his co-workers were “pissed off” at their 2 million dollar bonuses—he himself had once been furious that his own bonus was only 3.6 million—because their bosses were getting 150 million.31
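
The arithmetic behind these comparisons is straightforward. Assuming a 40-hour week for 52 weeks (2,080 hours per year) and a median personal income of roughly $30,000, a hypothetical round figure consistent with the ratios in the text, the calculation runs:

\[
\frac{\$1.7\ \text{billion}}{2{,}080\ \text{hours}} \approx \$820{,}000\ \text{per hour},
\qquad
\frac{\$820{,}000}{\$30{,}000\ \text{per year}} \approx 27\ \text{years};
\]
\[
\frac{\$2.5\ \text{billion}}{2{,}080\ \text{hours}} \approx \$1.2\ \text{million per hour},
\qquad
\frac{\$1{,}200{,}000}{\$30{,}000\ \text{per year}} \approx 40\ \text{years}.
\]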

Such lucrative possibilities have exerted an understandable effect on the choices made by the cognitive elite, the people expected to become the meritocrats in the meritocracy. In 2014, 31 percent of Harvard graduating seniors went on to positions in finance or consulting; a year later more than a third did so. Every year between 2000 and 2010, at least one third of Princeton graduates who took jobs (as opposed to graduate school) entered the financial services industry, with a high of 46 percent; if one adds consulting, the percentage is never less than 60 and often considerably higher. At the University of Pennsylvania, almost half of 2016 seniors chose careers in finance or consulting, with 29 graduates going just to Goldman Sachs and another 26 to JP Morgan Chase.32 It is difficult to believe that large proportions of the nation’s most intelligent 22-year-olds suddenly found an irresistible interest in balance sheets, assets, and liabilities, and the more likely explanation for these statistics is that paychecks have taken precedence over passion. And even these figures do not include the many graduates who go first to law school or MBA programs before winding up eventually in finance and consulting. Herrnstein’s Skinnerian assumption is undeniably accurate: in making a career choice, the cognitive elite follow the money.

The second assumption, however—that such dramatic levels of inequality are ultimately beneficial—is more questionable. In a well-organized society, of course, it is not only sensible but inevitable for differential rewards to function as incentives, ensuring that talent is utilized both efficiently and effectively; the prospect of greater compensation, whether in money or status, can ensure that people with the right combination of ability and perseverance take on difficult tasks that need doing. But the incredible fortunes now available to the highest earning professions have too often resulted not just in little social benefit but in significant harm to the society and the economy.

There is even an argument that the prospect of such large incomes has been detrimental to some of the meritocrats who enjoy them, dissuading them from pursuit of their authentic interests. Some people regard their work as a “vocation,” from the Latin verb “vocare,” meaning to call or summon; they may feel “called” to preaching, writing, teaching, painting, building, and other activities, all pursued not out of financial interest but for intrinsic reward. As Markovits’s analysis observes, “Work pursued authentically, as a vocation that reflects the worker’s true interests and ambitions can be a site of self-expression and self-actualization,” one that “integrates work with the other parts of a person’s life into an integrated whole.” The great dancer Martha Graham, for example, was once asked why she chose to be a dancer and famously responded, “I did not choose. I was chosen.” And an in-depth study of eminent scientists described the “driving absorption” that led them to work “long hours for many years, frequently with no vacations … because they would rather be doing their work than anything else.” But, Markovits notes, the demands of meritocratic success trap superordinate workers, precluding them from pursuing work as a vocation and tending “inexorably toward alienated self-exploitation”; the same capitalist affliction “that Marx diagnosed in exploited proletarian labor in the nineteenth century” has been shifted “up the class structure.” Although he later bought into Herrnstein’s Skinnerian argument, Charles Murray himself once seemed to appreciate the importance of following a calling: a few years before publication of The Bell Curve he opposed raising teachers’ salaries on the grounds that an increase would only attract applicants “who are ‘in it for the money’” instead of the “able and dedicated career teachers …[who] could be making more … if they chose” but preferred the “intrinsic” rewards of teaching.33

In any event, Herrnstein’s argument for the salutary effect of extreme inequality was based solely on the claim that such generous rewards would attract members of the cognitive elite to the roles of greatest benefit to the society, thus ensuring that the best intellects would serve the collective welfare. The question is whether the actual professions now providing such incredible incomes have in fact played that role.

There is widespread agreement on what those professions are. In Tailspin, his compelling account of “The People and Forces Behind America’s Fifty-Year Fall,” the attorney and journalist-entrepreneur Steven Brill traced the increase in economic inequality specifically to the gigantic incomes in three often overlapping areas: financial engineers and consultants; corporate executives; and corporate lawyers and lobbyists. Jonathan Rothwell, the Principal Economist at Gallup, listed the same groups as those who “have contributed the most people to the 1 percent [of top earners] since 1980.”34

Finance

Of these three professions, finance has produced both the highest individual incomes, as the hedge fund managers cited earlier demonstrate, and the greatest proportion of the superrich; recent studies have found that between a fifth and a quarter of the richest Americans have made their fortunes in finance, “especially hedge funds and private equity,” and the sector accounts for 40 percent of Americans with investable assets of more than 30 million dollars. Finance professionals in the United States once performed a useful service, helping to channel capital in productive directions, increasing the efficiency of markets, and enjoying modest rewards commensurate with their value to the society, but their incomes hardly compared to the wealth of tycoons and owners of natural resources; when the famous financier and banker J.P. Morgan died and his estate was revealed, the steel magnate Andrew Carnegie remarked with surprise that Morgan had not really been a rich man. And half a century ago “Wall Street” referred to a number of small private partnerships that specialized in purchasing equity securities from growing companies and immediately selling them to investors, thus putting their own money at risk and often sitting on the boards of companies they underwrote in order to be fully informed about the investment. In 1970, however, beginning with Merrill Lynch, one after another of these firms decided to go public, not only instantly making their partners fabulously wealthy but now allowing them to take risks with shareholders’ money rather than their own.35 The year before The Bell Curve was published, former Harvard University President Derek Bok summarized the outcome, writing that

… much of what transpires in Wall Street seems to go beyond socially productive activity and resembles some sort of casino to accommodate clever people searching for short-term gains. Moreover, however useful financial services may be, the sheer number of highly educated professionals engaged in selling bonds, analyzing stocks, talking with clients, and looking for market anomalies to exploit seems well in excess of any contribution they make to the long-term prosperity of the nation.36

Indeed, in the years since Bok’s observation, members of the cognitive elite have employed their superior intellectual abilities to devise new and creative strategies for enriching themselves while wreaking havoc on the economy in general. Financial analysts and traders have created one instrument after another designed to produce wealth untethered to hard assets such as buildings, factories, or anything of material value—the economic equivalent of abstract art, largely responsible for producing the Great Recession. Instead of serving to direct capital toward the production of goods and services, credit default swaps, mortgage-backed securities, and other complex financial derivatives became the product—not the means to an end but the end itself. By enabling creditors to take out insurance policies for more than the amount actually loaned to a troubled company, some of these instruments created a perverse incentive for hedge funds to push these companies into bankruptcy; after all, the insurance payout would provide more profit than repayment of the entire principal. Some hedge fund managers didn’t even bother with computer algorithms or other calculations, merely taking money from clients and passing it off to another fund while skimming a substantial advisory fee off the top. Nor did the recession, in which their own firms lost billions of dollars while so many people of lesser means lost their jobs and homes as a result of these exotic instruments, prevent the financial professionals from continuing to enjoy incredible windfalls. In 2009, nine of the largest recipients of federal bailout money—the same banks and Wall Street firms that had enriched their employees while feeding the nation’s economy into a meat grinder—shelled out bonuses of more than a million dollars each to 5000 of their analysts and traders. In 2015 the annual bonus pool for Wall Street employees alone came to more than double the combined yearly income of all workers earning the minimum wage.37

In another financial ploy, so-called private equity firms (as opposed to publicly traded firms, which are subject to greater regulation and reporting requirements) specialized in purchasing companies and burdening them with as much debt as possible—not to grow the business, reinvest in infrastructure or equipment, hire more people, or otherwise improve prospects, but to pay themselves huge managerial fees quickly returning many times their original investment—and then, once the company had been essentially looted of all financial value, declaring bankruptcy, leaving thousands of workers not only jobless but deprived of the earned pension they had once thought secure. Gordon Gekko, the protagonist of Oliver Stone’s Wall Street who plots the predatory takeovers of otherwise functioning companies so that he can profit by wrecking them, may have been a fictional character, but his financial machinations come right out of the private equity playbook. Private equity firms were involved in the bankruptcies of seven major grocery chains since 2015, affecting 125,000 workers, and in nine of the ten largest retail bankruptcies in 2017. In some cases, jobless workers were not the only ones adversely affected. Manor Care, for example, a chain of nursing homes, was purchased by the massive private equity firm the Carlyle Group, which in 2011 extracted 1.3 billion dollars for investors; when the chain was driven into inevitable bankruptcy a few years later, it was so understaffed that some patients wallowed in their own filth, and rooms were overrun with roaches and ants. Even the success stories—the businesses putatively “rescued” by private equity—combine outrageous profit for the investors with massive loss of jobs for the employees. When Hostess Brands, for example—the manufacturer of Twinkies and other iconic snack cakes—was struggling to survive, the company was bought for 186 million dollars by two private equity firms, which quickly arranged for it to borrow more than a billion dollars for distribution to the investors; Leon Black, co-founder of one of the two firms, received 181 million dollars, while one of Hostess’s plants was shuttered and the number of people employed by the company went from 8000 to 1200.38 As with the bankers and traders, the cognitive elite who dreamed up these schemes prospered while so many others were left in financial ruin.

Corporate Management

Corporate executives constitute another group clustered at the very top of the income distribution. According to a labor journalist specializing in inequality, the “annual jackpots” enjoyed by CEOs have been “the single largest contributor to the skyrocketing income of America’s top 0.1 percent” since 1979, representing 44 percent of that group’s income growth; together with the top financial professionals, the two groups account for two-thirds of the increase.39 In the United States such businessmen (and more recently women) have traditionally enjoyed heroic status. Notwithstanding Marxist theory, most Americans believe that business tycoons represent the true force for historical change—and often for the better, producing the consumer goods and services that improve quality of life for everyone while creating the gainful employment that enables masses of people to purchase what they have produced. As “captains of industry,” corporate executives are the closest civilian equivalent to military officers. Though working in the private sector, they lead organizations that constitute what Franklin D. Roosevelt referred to as “a public trust,” serving not just shareholders but employees, customers, and communities.

Unlike finance professionals, who have only recently ascended to the highest income levels, corporate executives have historically been the most highly paid people who worked for a living. Indeed, social scientists typically considered them well deserving of such generous remuneration, given their contributions. At the dawn of the twentieth century, William Graham Sumner, the nation’s first professor of sociology, explained that millionaires—all heads of business at the time—were “a product of natural selection, acting … to pick out those who can meet the requirement of certain work to be done …. They get high wages and live in luxury, but the bargain is a good one for society.” In 1940, E.L. Thorndike, one of the first psychologists to be elected to the National Academy of Sciences, argued that superior intellects should receive whatever they wanted, citing in particular executives of large businesses, who, he claimed, merited much greater salaries than they were receiving, considering the “value of the services” they provided. “Whatever will put great managerial ability at work should be offered,” Thorndike concluded—“money, power, prestige or whatever else is required.” And C. Wright Mills’s classic mid-1950s sociological study of influence in America, The Power Elite, described a common opinion of the richest corporate executives at the time as “responsible trustees, impartial umpires, and expert brokers for a plurality of economic interests, including those of all the millions of small property holders who hold stock in the great American enterprises, but also the wage workers and the consumers who benefit from the great flow of goods and services.”40

Although corporate executives have thus long enjoyed some of the highest earnings, their present compensation has nevertheless grown enormously in comparison with the past. Mills’s study noted that the highest paid executive in 1952—indeed, the highest paid individual at a time when, as Mills observed, “All the men … of great wealth are now identified with large corporations”—was Charles Wilson, the CEO of General Motors, who made $581,000 in salary and bonuses, or about $5.5 million adjusted for inflation; the average CEO salary at the time was $100,000, or about $950,000 in contemporary dollars. By 2017 the highest paid executives took home more than $100 million apiece, and the average pay for CEOs of the largest 350 companies was $18.7 million.41

Nor is it the case that these increases have merely reflected a broader trend of rising salaries for all of a corporation’s employees; according to a study by the Economic Policy Institute, between 1978 and 2017 CEO pay rose by 1000 percent while the average employee salary increased 11 percent. In the 1950s and early 1960s, a CEO typically earned about 20 times the salary of the firm’s average employee, a ratio that had long been considered the optimal maximum. In an earlier era, J. P. Morgan had famously insisted on never paying an executive more than 20 times the earnings of a company’s lowest paid employee. And in 1984 Peter Drucker, widely acknowledged as the founder of modern management, called for a “voluntary limitation” on executive pay of at most 20, and perhaps even 15, times the pay of the rank and file; years later he was quoted in a letter submitted to the SEC as having “advised managers that a 20-to-one salary ratio is the limit beyond which they cannot go if they don’t want resentment and falling morale” in their company. By 2017, however, the ratio of CEO compensation to the average worker’s salary had grown to 361 to one—every single day the former’s income amounted to approximately what the latter received for the entire year—and the ratio was typically much higher at larger corporations; the 102 million dollar pay package for the CEO of First Data, for example, was more than 2000 times the median compensation of all other employees.42 But even these dramatic statistics do not indicate the full extent to which executive compensation has increased, since the marginal tax rate for the highest incomes in 1952 was more than 90%—for every dollar of salary over $100,000, a taxpayer got to keep less than 10 cents—whereas the highest marginal tax rate is now only 37%. Presuming no extraordinary deductions, Charles Wilson in 1952 would have been able to keep only around $115,000 of his $581,000 income after taxes. Nor did the earlier rate produce the howls of outrage later elicited by proposals for anything greater than the current rate; in the post-war world an almost confiscatory marginal tax rate on the highest incomes was accepted as a social responsibility, without protest, by the people fortunate enough to make that much money.
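The day-for-a-year equivalence follows directly from the near-coincidence of the 361-to-one ratio with the length of the calendar year. A minimal arithmetic sketch, assuming a 365-day year and writing w for the average worker’s annual salary (a symbol introduced here for illustration, not taken from the sources cited):

\[
\text{CEO pay per day} \;=\; \frac{\text{CEO annual pay}}{365} \;=\; \frac{361\,w}{365} \;\approx\; 0.99\,w \;\approx\; w .
\]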

In addition to making considerably more money, corporate executives now tend to arrive at their position through a much different career path than in the past. It was once not uncommon for managers from many different backgrounds to work their way up the corporate ladder, spending much of their working life with the same organization; Wilson, for example, joined a General Motors subsidiary as an engineer and sales manager in 1919, not rising to president of the company until 22 years later. Having come up through the ranks, such executives tended to rely for advice on the many people in the middle management positions they had once occupied. In contrast, contemporary chief executives typically come from an elite institution; according to Markovits, half of America’s corporate leaders attended one of twelve universities and typically hold an MBA or similar postgraduate degree. When one of these corporate heads needs assistance, they look to professional management consultants from identical backgrounds, the ranks of middle management having been hollowed out and their salaries essentially transferred upward.43

While Herrnstein’s second assumption is obviously risible with respect to financial professionals—providing obscene incomes for these members of the cognitive elite has hardly ensured the use of their abilities for the greater social good—the case of business executives is more complex. Certainly, there are well-run corporations that compensate their executives in a manner reflecting the company’s contributions to consumers, employees, shareholders, and the larger economy. However, the evidence strongly suggests that such ideal results are as much the exception as the rule.

A number of studies over the past three decades have found little rational justification in most cases for the outlandish salaries enjoyed by corporate executives. In 1991, when the ratio of CEO pay to that of the median employee was less than half of what it would later become, Graef S. Crystal, a prominent “executive compensation consultant”—i.e., a professional hired by large companies to assist in the design of appropriate compensation packages for top-level management—authored a book whose subtitle crisply summarized its conclusion: In Search of Excess: The Overcompensation of American Executives. In blunt prose Crystal outlined the problem of “bloated pay packages,” resulting in “huge and surging pay for good performance, and huge and surging pay for bad performance, too.” Derek Bok’s 1994 study of highly paid professions acknowledged that “in earlier generations” executives had earned much more than others, when “there was at least an observable link between reward and talent, entrepreneurial success and social welfare.” But, he wrote, “scholarly analysis” now indicated that “performance pay for top executives has turned out to be a sham and an embarrassment,” so that “the compensation actually paid to CEOs bears very little relation to the record of their companies” in terms of corporate value. Moreover, he found no relation to larger civic goals—no link between “the pay executives receive” and “the contribution they make to social welfare or economic growth.”44

A decade later Lucian Bebchuk and Jesse Fried, Harvard law professors specializing in business and the economy, published another instructively titled book: Pay Without Performance: The Unfulfilled Promise of Executive Compensation. While both this book and the earlier volume by Crystal detailed the numerous devices employed to further enrich executives beyond their huge base salaries—guaranteed bonuses, stock options rigged to pay off even when the stock price plummets, golden parachutes ensuring windfall payments for departure even when caused by abysmal performance—Bebchuk and Fried went on to explain how compensation packages were not only unrelated to performance but also deliberately structured to “camouflage” the true amount paid, both during and especially after the executive’s employment. Defined benefit retirement pensions, guaranteeing 7-figure annual incomes—unlike the defined contribution plans typically offered to other employees, which are exposed to the risks of investment—can wind up providing greater income over time than the executive had earned during his or her actual employment; lucrative consulting contracts—not for work actually performed but for an “availability” to consult that is rarely invoked, since new CEOs are typically not inclined to seek advice from their predecessors—add more substantial sums; and perks, such as access to corporate aircraft, chauffeured cars, paid assistants, apartments, and more, are also worth huge amounts of money. According to Bebchuk and Fried, unlike the more traditional methods, none of these forms of “stealth compensation” needed to be disclosed through the usual channels, thus keeping them hidden from the public in general and shareholders in particular.45

All three of these books culminated in recommendations for reforming the process of determining executive pay. Bok, for example, proposed a number of changes designed “to tie compensation more closely to performance.” Though cognizant that no remedy could “cure all the ailments,” he was certain that the “worst abuses” could be curbed—“fewer CEOs taking home millions of dollars as their companies wallow in substandard performance.”46

Such expectations have proved naïve. While there is little systematic data on the relationship between CEO compensation and corporate performance, there has been no shortage of the instances Bebchuk and Fried described as “generous treatment even in cases of spectacular failure”—companies that went down the financial tubes, costing shareholders their investment, and sometimes employees their jobs, while the CEOs who presided over these fiascos received princely sums quite apart from the “camouflaged” forms of compensation, not in spite of their failure but because of it—the necessary price for getting rid of them. Shortly after their book appeared, for example, Alan Mulally, Ford Motor Company’s newly hired CEO, received 28 million dollars for his four months in the position, after the company announced a 12.7 billion dollar loss and plans to close plants and cut more than 30,000 hourly workers. Even that fortune paled in comparison with two other hastily arranged exit packages around the same time: Robert Nardelli, the CEO of Home Depot, received 210 million dollars after the company’s stock dropped and it lost market share to Lowe’s, and Henry A. McKinnell left Pfizer 200 million dollars richer after its share price fell 40% and thousands of employees were terminated. Though not quite as magnanimous, Hewlett-Packard could claim the unique distinction of paying a succession of CEOs to retire: after a dismal record Carly Fiorina was paid 21 million dollars to leave and was replaced by Mark Hurd, who received 12.2 million dollars to resign in the wake of accusations of sexual harassment and financial improprieties, and was replaced in turn by Leo Apotheker, who was given a 13.2 million dollar exit package less than ten months later, after the company lost more than 30 billion dollars in market capitalization under his leadership. And in the most recent example, after Boeing’s 737 MAX disaster—two crashes that cost 346 lives and erased 11 billion dollars of market value—Dennis A. Muilenburg was ousted as CEO, leaving with more than 62 million dollars in stock and pension awards.47

Nor do any of these examples, and numerous similar ones, include the many cases in which CEOs created their lucrative retirement packages through ethically questionable methods. Bebchuk and Fried noted that the executives of the 25 largest firms to go bankrupt in the first few years of the twenty-first century sold almost 3 billion dollars of stock shortly before their companies tanked. Hundreds of other executives at firms whose share price plummeted by 75 percent or more conveniently unloaded 23 billion dollars’ worth of stock altogether just before the descent began. In some of these cases, while cashing out their own holdings, the firm’s management prohibited employees from selling the stock out of their 401(k) accounts.48 And of course the CEOs of financial firms earned obscene amounts of money by selling products they knew to be defective, shattering the global banking system, and requiring the government to save them from closing their doors—a taxpayer expenditure that did not deter them from awarding themselves 7-figure bonuses at the same time.

Although the coronavirus forced many large businesses, especially in retail, to declare bankruptcy, it did not inhibit executives from arranging substantial bonuses for themselves shortly before closing their doors for good. In the first six months of the pandemic, 18 large companies collectively distributed 135 million dollars to their executives just before requesting court protection from their creditors. Five days before filing for Chapter 11, for example, J.C. Penney awarded 7.5 million dollars to its four top executives while closing approximately 150 stores and eliminating thousands of jobs.49

While individual exceptions doubtless exist, for high-ranking executives, just as with the Wall Street bankers and traders, it is again difficult to find support for Herrnstein’s contention that such a steep gradient of gain, producing previously undreamed-of incomes for select members of the cognitive elite, has resulted in considerable benefit for the society overall.

Law

A final profession disproportionately represented among the highest incomes is corporate lawyers, both those who work as lobbyists and those who serve as in-house counsel. The presence of attorneys in the ranks of the superrich represents a fairly recent development. Not that long ago, before the legal profession became the butt of jokes based on its ethical shortcomings, lawyers, like financiers, were comfortably but not excessively compensated in accord with their service both to individuals and to the society. But quite apart from how much they earned, there was a sense that lawyers played a crucial role in ensuring stability and fair play—perhaps an avenue to material prosperity but also the bedrock of the society’s moral order. As C. Wright Mills observed at mid-century, that role began to change with the “incorporation of the economy,” as more lawyers, especially in urban areas, “made business counseling the focus of their work, at the expense of traditional advocacy.” In particular, corporations turned to lawyers for assistance in “minimizing … [their] tax burden, … controlling government regulatory bodies, [and] influencing state and national legislatures.”50

Nevertheless, until the 1970s, according to Steven Brill—the attorney who founded both The American Lawyer, a magazine covering the business of law firms and lawyers, and the cable channel Court TV—law remained “a relatively sleepy profession”; even at white-shoe firms new hires began with a salary only 20–25 percent higher than the average income. Starting in the 1970s, however, the “demand for lawyers exploded” in response to corporate interest in mergers, tax shelters, and a host of new government regulations involving consumer products, discrimination, worker safety, and protection of the environment. As a result, Brill noted, the new lawyers were concentrated in “firms that served large corporations and were prepared to pay skyrocketing salaries to attract the best talent”—the strongest students from the most competitive law schools—and by 2016 the partners at the top corporate law firm earned an average of 6.6 million dollars. As Herrnstein had predicted, the cognitive elite followed the money: Markovits reports that more than half the partners at the five most profitable firms graduated from one of the top ten law schools; for the single most profitable firm that figure rises to 96 percent. By 2015, almost 60 percent of the 2010 graduates of Yale Law, a school previously known for sending its graduates into public service, were employed by business firms; of Columbia Law’s 2015 graduates, more than 70 percent had taken such positions within a year.51

The purpose of this cadre of lawyers was to prevent government from functioning in any way that might be detrimental to corporate interests. Brill described, for example, how, after first being proposed in the Federal Register, Occupational Safety and Health regulations concerning hazardous chemicals in the workplace were delayed for years by lawyers who offered thousands of pages of comment—each of which needed to be read and considered—scheduled countless meetings, appealed rules in court, quibbled over the meaning of common words, and generally did everything possible to throw a monkey wrench into the process. Thus, shortly after OSHA began operation in the early 1970s, it took about a year to complete the review process resulting in a 10-page rule for a particular chemical, while in 2016 a rule about a different chemical took 19 years to write and consumed 604 pages; as part of the process OSHA estimated that the latter rule would prevent more than 579 deaths a year. The corporate lawyers “swarming the process,” as Brill put it, have had the desired effect: even though many hundreds of new chemicals are introduced into the workplace each year, since its inception OSHA has been able to issue regulations in only thirty cases, just three of them since 1997.52

In theory, of course, the interests of consumers, employees, and the public are supposed to be represented by lawyers for either the government or, less often, a non-profit organization; the assumption underlying the legal system is that the “correct” result is most likely to occur when advocates for each side enjoy an equal opportunity to make the best case possible. In practice, however, lawyers in the public sector are almost always outgunned in both quantity and quality by those representing corporate interests. In his 1994 study of highly paid professions, Derek Bok provided a concise description of the problem:

The ablest lawyers usually go to established firms where they frequently litigate and negotiate with a much less experienced government attorney or with a solo practitioner representing a private claimant. … In all these circumstances, so long as the most promising young lawyers choose overwhelmingly to serve large corporations, continuing to add more and more exceptional talent to the profession may help to make legal encounters more unequal and to increase the odds of prevailing for reasons other than the true merits of one’s case. If so, the influx of exceptional talent may succeed not in furthering justice but in magnifying the human imperfections of our legal system so as to diminish, rather than enhance, the welfare of society.53

Since Bok’s warning, the malign influence exerted on public policy by corporate lawyers from elite schools has only increased, especially with the growth of their deployment as Washington lobbyists from a cottage industry into an immense enterprise, turning “K Street” into a metonym for the entire operation. In 1971, 175 firms employed registered lobbyists in the capital; by 2016 more than 7700 corporations and trade associations did so, at a cost of more than three billion dollars. The overwhelming majority of these lobbyists work for business rather than for public interest groups; none work for unions. The two most powerful lobbies represent health care and finance, the former industry employing five lobbyists for every member of Congress, the latter spending a million dollars per member. From the corporations’ perspective, this was money well spent; whether the issue was financial reform, the tax code, consumer rights, or labor relations, it was much cheaper to pay huge amounts of money to an army of top-notch lawyers and lobbyists to chip away at the substance of proposed legislation—inserting new provisions, adding qualifiers, injecting exceptions, tweaking definitions, postponing starting dates, introducing vague language (“reasonable” interest rates) that can be subsequently litigated—than to obey it. The “Banking Act of 1933,” for example, also known as “Glass-Steagall”—the financial bill passed just after the onset of the Great Depression, establishing the Federal Deposit Insurance Corporation among other reforms—was all of 33 pages long; thanks to the efforts of 2000 lobbyists—four for every member of both houses of Congress—“Dodd-Frank,” the analogous attempt at modest reform of banking in the wake of the 2008 Great Recession (shorthand for the “Dodd-Frank Wall Street Reform and Consumer Protection Act”), consumed 2319 pages with unnecessary complications and impenetrable prose, not to mention hundreds of provisions dependent on the lengthy and complicated rule-writing process that would take years to complete. “We have three lawyers total” working on the bill, commented the legislative director for a large consortium of non-profit groups; the banks have three lobbyists “working on a paragraph.”54 In another example of complexity introduced by lawyers and lobbyists mainly for corporate benefit, the original income tax code, passed in 1913, was 27 pages long; by 2017 it had grown to 6550 pages.

Well-compensated lobbyists miss few opportunities to pursue corporate interests even at a cost to public welfare. For most of the nation the first coronavirus stimulus bill was an emergency measure designed to alleviate some of the worst effects of the pandemic on the economy. Instead of a response to tragedy, however, lobbyists saw the bill as an opportunity—what the New York Times called an “irresistible target”—leading to “a frenzied effort to insert into the must-pass legislation provisions their clients wanted,” many of which “were largely unconnected to the coronavirus crisis.” As one congressional reporter put it, “lobbying firms of all stripes lined up at the trough”: undeterred by social distancing measures, they pressed their cause by phone and email, securing tweaks to the tax code for wealthy investors in real estate and energy, for banks, and for large hotel chains, which these interests had sought long before anyone had ever heard the word “covid.” According to a law professor at the University of California, Irvine, specializing in taxation, many of the tax benefits in the stimulus bill were “just shoveling money to rich people.”55

As the stimulus bill indicates, together with bankers and accountants, many elite lawyers are engaged in what Markovits calls the “income defense industry,” enriching themselves by protecting “still richer people’s fortunes against government encroachment,” thereby thwarting any attempt by the state to regulate great wealth. “The trust and estates bar alone,” he points out, comprises over fifteen thousand lawyers, “whose work no doubt justifies the observation by Gary Cohn, chief economic advisor to President Trump during the first year of his administration, that ‘only morons pay the estate tax.’” And specialists in tax havens have allowed those with more than thirty million dollars of investable assets to move collectively some eighteen trillion dollars’ worth offshore. As a result of these and other maneuvers, at the same time that the share of national income enjoyed by the richest individuals doubled, their tax rate fell dramatically. According to ProPublica, a non-profit organization of investigative journalists, in some years a number of the wealthiest people in the United States paid no income tax at all, and between 2014 and 2018 the 25 richest averaged a tax rate of 3.7 percent.56

There is even reason to believe that the exorbitant earnings enjoyed by corporate lawyers played a significant role in the Justice Department’s failure to pursue criminal charges against the finance executives responsible for the 2008 economic debacle despite ample evidence of their fraudulent behavior. Such reluctance to prosecute corporate misbehavior is relatively recent. Three decades ago hundreds of people associated with first the savings and loan crisis and then the “junk-bond” market were prosecuted and received stiff fines and jail sentences. Then, when the tech bubble burst at the beginning of the present century, government attorneys did not hesitate to seek harsh penalties against superrich executives who had looted their companies while shareholders lost their investment and employees their jobs. Charged with felonies such as bank fraud, securities fraud, insider trading, and perjury, a number of CEOs received not only substantial fines, which they had to pay personally, but lengthy prison terms: John Rigas (Adelphia Communications) was sentenced to 15 years; Joseph Nacchio (Qwest Communications) to 6 years; Dennis Kozlowski (Tyco International) to 8–25 years; Samuel Waksal (ImClone Systems) to 7 years; and two executives at Enron—Kenneth Lay, who died three months before his scheduled sentencing, and Jeffrey Skilling, whose original sentence of 24 years was reduced to 14 on appeal. In contrast, after the Great Recession 49 financial institutions paid a total of 190 billion dollars in fines for various misdeeds, but no individuals were charged; crimes had been committed, but there were no criminals. And since the money was paid by the corporations rather than the executives, with a substantial portion tax deductible, it actually came from the pockets of shareholders and taxpayers.57

What accounted for this dramatic change in response to corporate misbehavior? The Pulitzer Prize-winning business journalist Jesse Eisinger has argued that young Justice Department lawyers have become reluctant to pursue corporate officials too vigorously, even when there seemed to be a prima facie case of criminally fraudulent behavior, lest such prosecution impede their own prospects once their stint in the public sector has ended; knowing that they would eventually leave public service, they wished to remain hirable at high-paying firms. According to Eisinger, a newly appointed assistant prosecutor is typically the product of an “elite” institution, an ambitious student who has spent “endless hours slaving to achieve the highest grades.” But having landed a prestigious position at the Department of Justice,

at that point, then, their formula for success has altered. They are not trying to please the powerful. If they do their job, they will displease them. To prosecute those sitting in corporate boardrooms, the young government litigators must become class traitors, investigating and indicting people very much like their mentors, peers, and friends. … A corrupt politician excites in upstanding prosecutors a sense of outrage. By contrast, a well-mannered and highly educated executive seems like someone who wouldn’t knowingly do something wrong.

Until fairly recently, top-level corporate law firms did not represent their clients in criminal matters, but now at settlement negotiations prosecutors, whose salaries were capped at $160,000, found themselves sitting across the table from professional counterparts making 20 times that amount. As Eisinger explained, in these circumstances prosecutors want to “appear tough” to the defense lawyers, to “dazzle them with their knowledge of legal precedent, mastery of details and bargaining skills. But young prosecutors also want their adversaries to imagine them as future partners. They want to be seen as formidable but not unreasonable. They want to demonstrate that they are people of proportion,” and by doing so, they “set themselves up for lucrative careers in the private sector.”58 Indeed, Eisinger named one prosecutor after another who negotiated a deal for high-profile white-collar crime and then accepted a lavishly paid partnership at a prominent white-shoe firm, while the few government lawyers who had taken a more aggressive approach found themselves blackballed. Out of probable self-interest, it seemed, one sector of the cognitive elite decided to give another sector a pass on criminal behavior, choosing to protect the interests of the wealthy rather than uphold the society’s moral order.

Markovits’s conclusion about the contributions of all these extraordinarily well-paid “superordinate” workers did not mince words. “The elite’s true product,” he wrote,

may be near zero. For all its innovations, modern finance seems not to have reduced the total transaction costs of financial intermediation or to have reduced the share of fundamental economic risk borne by the median household, for example. And modern management seems not to have improved the overall performance of American firms (although it may have increased returns to investors). More generally, rising meritocratic inequality has not been accompanied by accelerating economic growth or increasing productivity.59

Thus, the evidence strongly suggests that Herrnstein’s second assumption is not merely unfounded but diametrically opposed to what has actually occurred. While there have undoubtedly been some exceptions, the availability of massive financial rewards has not ensured that superior intellect is systematically directed toward socially beneficial activity. Instead, it has often encouraged members of the cognitive elite to engage in what is essentially insider looting, allowing them to construct what Dennis Kelleher, the President of Better Markets—an organization founded in the wake of the Great Recession to promote financial reform—has aptly described as a “wealth extraction mechanism for the few rather than a wealth creation system for the many.”60 Highly intelligent people have raked in obscene amounts of money for behavior that is at best socially unproductive and at worst highly detrimental, producing the largest upward redistribution of wealth in human history. Not only is there no moral justification for such extreme inequality, but it is also impossible to rationalize the second Gilded Age on the grounds that the cognitive elite have received remuneration appropriate to their contributions.

Naturally, to point out that such incredible private rewards bear no, or even a negative, relation to public welfare is not to argue that differences in income serve no useful purpose. Scarce natural talent certainly merits greater compensation when directed to those activities that best serve the common good. However, at present the relationship between monetary reward and social contribution has fractured; not only do the careers most responsible for extreme inequality often fail to serve the common good, but to the extent that the prospect of riches does in fact attract the most intellectually capable, it only dissuades such talented persons from pursuing the professions with the greatest potential to achieve that laudable goal.

Innate Inequality: Intelligence and Human Value

Even if “it is granted that differentials in economic rewards are morally justified and socially useful,” wrote the ethicist and theologian Reinhold Niebuhr in his influential 1932 book, Moral Man and Immoral Society, “it is impossible to justify the degree of inequality which complex societies inevitably create…. The literature of all ages is filled with rational and moral justifications of these inequalities, but most of them are specious … clearly afterthoughts.” In a meritocracy the rich must appear worthy of their good fortune in order to be perceived as in some sense legitimate; if the financial rewards enjoyed by so many of the cognitive elite are far from commensurate with the social value of their accomplishments, then a different justification must be found. The notion persists that high IQs are a marker of superiority, entitling the cognitive elite to much more of the society’s material resources than the lesser endowed, not so much because of what they have done but as a confirmation of who they are: the smartest of the smart, educated at the most prestigious universities, whose position in the top sliver of the intelligence distribution entitles them to a corresponding position in the income distribution. Indeed, in Herrnstein’s original article on the subject, he instructed those with a high IQ who wanted to become rich to “not waste your time with formal education beyond high school”; a degree from an elite institution, in this view, might have been merely the suggested attire for a party that the highly intelligent were destined to attend no matter their wardrobe.61

Alex Rubalcava, a Harvard student who would eventually go on to a successful career in venture capital, made the case even more bluntly, arguing that the actual substantive tasks performed by management consulting firms and Wall Street investment banks—facilitating mergers and acquisitions, selling shares to the public, and generally helping “wealthy individuals stay wealthy”—could and should be assigned to junior officers or even outsourced, so that these companies could focus all their efforts on what was their true “core competency”: recruiting Harvard students. “Remember,” he explained, “that companies that do nothing of value must obscure that fact by hiring the best people to appear dynamic and innovative while doing such meaningless work.”62 The quality of financial advice was much less important than the presumed intelligence of the people who offered it.

A number of ethnographic studies have provided ample evidence that Rubalcava’s view is widespread. In Liquidated: An Ethnography of Wall Street the anthropologist Karen Ho found—based on both interviews and her own experience as a Princeton student recruited to be a consultant at Bankers Trust—that the major firms directed “Herculean recruiting efforts” toward students from highly selective universities, with a particular focus on Princeton and Harvard. At such elite institutions financial firms tended to dominate campus life through their constant presence at career forums, panel discussions, and “meet and greets” offering free drinks and hors d’oeuvres, and through “goodie bags” filled with so much paraphernalia that “thousands of students become walking advertisements as their logos disperse into campus life.” After joining an investment firm, graduates from these universities receive special training programs that fast-track them for prestigious front office positions, while their co-trainees from “second tier” schools such as Rutgers or NYU are placed in parallel classes slating them for “less prestigious and (much) less well-paid divisions.” That the objects of all this attention may have neither technical skill nor business savvy is insignificant; what counts is their intellectual pedigree and the superior intelligence it putatively represents. Top banks and investment firms claim that they have created “the most elite work-society ever to be assembled on the globe,” staffed by “the greatest minds of the century,” “the smartest people in the world”; they brag that “we hire only superstars,” graduates “only … from five different schools,” who constitute “the cream of the crop.”63 Again, it was their intelligence, as indicated by the elite institutions these employees had attended, that proved the worth of their advice and justified their entitlement to huge incomes.

Research by the sociologist Lauren Rivera found a similar dynamic. After graduating from Yale, Rivera spent two years in management consulting before deciding to become a sociologist studying the environment in which she had worked. Using the business connections she had established, Rivera secured a position as an unpaid “‘recruiting intern’ to help plan and execute recruitment events” for three types of “Elite Professional Service firms”—investment banks, “top-tier law firms,” and management consulting firms—in exchange for which she received permission to observe the process and interview the recruiters. The most important single factor for these recruiters was the “prestige” of an applicant’s institution, with four universities—Harvard, Yale, Princeton, and Stanford—enjoying what she called the “super-elite” status that placed their graduates at the head of the line; other schools nationally ranked among the top 25 might be recognized as highly selective but nevertheless lacked the door-opening prefix. Again, students at the top schools were courted lavishly; one firm allotted a budget of close to a million dollars per year for recruiting events at a single super-elite campus. Neither what applicants studied nor how well they did mattered as much as what school they attended, as employers essentially “outsourced” their screening to the admissions committees at elite universities. As Rivera put it, the credential most highly valued by these firms “was not the education received at a top school but rather a letter of acceptance from one.” Those selected at the end of the process not only enjoyed “unparalleled economic rewards for young employees” with no experience but also received “signing bonuses … as well as relocation expenses.” Rivera’s investigation, just like Ho’s study, found that elite professional firms justified the huge expenses devoted to hiring graduates from super-elite institutions by marketing their employees as the “best and brightest,” “likely to become superstars.”64

Even the outrageous bonuses paid to Wall Street executives after they had presided over their companies’ financial meltdowns were justified by their supposed brilliance. In Bailout: An Inside Account of How Washington Abandoned Main Street While Rescuing Wall Street, Neil Barofsky, the former federal prosecutor named special Treasury Department inspector general to oversee the Troubled Asset Relief Program, described how the insurance giant AIG received 170 billion dollars from taxpayers to avoid collapse and then distributed 168 million dollars in “retention bonuses” to members of its Financial Products Division, “the very unit whose reckless bets had brought down the company.” Barofsky was surprised to find that Treasury officials “didn’t seem to begrudge the AIG executives the bonuses at all,” viewing the payouts as “necessary to keep the ‘uniquely’ qualified” individuals in position to undo the mess they had created: “The Wall Street fiction that certain financial executives were preternaturally gifted supermen who deserved every penny of their staggering paychecks and bonuses was firmly ingrained in Treasury’s psyche.” Even after the financial crisis revealed their incompetence, the belief endured that a Wall Street executive receiving a “$6.4 million ‘retention’ bonus … must be worth it.”65

But bloated incomes are merely the extrinsic manifestation of the cognitive elite’s intrinsic value, which, for many scientists, enamored of the importance of intelligence, has always extended beyond the monetary sense. From its inception the IQ test has been regarded by many of its most ardent advocates not merely as a measure of a highly specific sort of cognitive ability—usually defined as involving conceptualization and abstract reasoning—but as an indication of inherent worth. Not only do the cognitive elite deserve more, but their greater intelligence makes them innately more important people, whose lives and wishes matter more than those of the less cognitively gifted. As early as 1920, H.H. Goddard, who had translated the original IQ test—the Binet—from French into English, recommended that “men should be paid first according to their intelligence; and second according to their labor,” even for persons performing the same job. While it might seem odd for intelligence to take precedence over productivity, especially in a book claiming to explore the relationship between intellectual ability and “human efficiency,” Goddard pointed to the more refined sensibilities of those with greater intelligence as the justification for their material entitlement. Addressing those he considered his intellectual equals, Goddard ridiculed the possibility that someone with less intelligence “could live in your house with its artistic decorations and its fine pictures and appreciate and enjoy those things”; it was, he insisted, “a serious fallacy” to “argue that because we enjoy such things, everybody else could enjoy them and ought to have them.” In a slightly less condescending justification, two decades later E.L. Thorndike, the country’s most prominent educational psychologist at the time, proposed a precise mathematical system to determine how much weight should be accorded to the desires of each individual; the desires of an average person would count for 100, those of someone of superior intelligence for 2000. Although Thorndike acknowledged that some “men of genius” had sometimes sought “eccentric, ignoble or ruthless satisfaction,” he nevertheless thought it imperative to identify such persons as early as possible and “give them whatever they need.” And “what they need,” he concluded, “is what they themselves desire.” For the intellectually superior, no desire was to go unfulfilled.66

According to many prominent social scientists early in the twentieth century, being of greater value also entitled the cognitive elite to greater political influence; soon after creation of the intelligence test Charles Spearman even suggested it be used to select only the “better endowed persons for admission into citizenship.” More common, however, were proposals to change the rules for eligibility to vote, typically by disenfranchising the less intelligent. Goddard, for example, found it “a self-evident fact that the feeble-minded should not be allowed to take part in civic affairs; should not be allowed to vote”—this at a time when the mass testing of draftees had led him to conclude “beyond dispute” that half the nation was “little above the moron.” After warning of “distinctly inferior” immigrants as well as many Hispanics and Blacks who could never be “intelligent voters or capable citizens,” Terman called for “a less naïve definition of the term democracy,” one that would “square with the demonstrable facts of biological and psychological science.”67 Another psychologist of the time, George Barton Cutten, who went on to become president of Colgate University, happily anticipated that IQ tests would produce “a caste system as rigid as that of India,” depriving at least 25 percent of citizens of the ballot. And William McDougall, occupant of the William James chair of psychology at Harvard and arguably the best-known academic psychologist in the English-speaking world at the time, declared that the franchise “must be denied to those who are obviously unfit to exercise it,” a policy he believed should apply to all democracies but especially to the United States, “made up as it is of so many heterogeneous elements,” where he anticipated that between a quarter and a third of the adult population would not be allowed to vote.68

While such sentiments are now largely rejected in an era more sensitive to individual rights, exceptions remain. Raymond Cattell, author of an enormous body of research in personality, human intelligence, and multivariate methodology and the seventh most highly cited psychologist of the twentieth century, supported restriction of the franchise throughout his lengthy career. In the 1930s he thought it “goes without saying” that the less intelligent should be prevented from voting and expected no opposition to such a proposal, since those affected “seem to realize that their greatest happiness lies in a benevolent dictatorship.” Half a century later Cattell was outraged to realize that the latter expectation was clearly no longer tenable and railed at what he called “robbery … by the ballot box,” the use of the franchise by the “less gifted” to usurp the prerogatives of their intellectual superiors. To rectify this injustice he recommended various possibilities: a minimum IQ score, which would reduce the electorate to 60–75 percent of its present size, or an “explicit weighting of the votes of individuals according to their intelligence, sanity, and education,” a proposal he justified by comparing two “personal acquaintances”: a famous “classics professor … with a deep grasp of the political and social wisdom of the ages” and “an ordinary person who did some gardening.” The present practice of democracy, Cattell complained, allowed the latter’s opinion “to completely cancel” the former’s “long sighted contribution to the community”; the society could not survive, he concluded, “if it gives equal voting powers to individuals so disparate.” (In all likelihood, the classics professor was Revilo P. Oliver, a friend and colleague at the University of Illinois acknowledged by Cattell in print as an influence on his thinking, and arguably the leading Nazi intellectual in the United States at the time, who looked forward to future recognition of “Adolf Hitler as a semi-divine figure.”)69

If the more intelligent are of greater value to the society and if, as so many IQ scientists have concluded, intelligence has a substantial genetic component, then it follows naturally that the children of the more intelligent are of greater value than other children. Just as there have been calls to deprive the less intelligent of the franchise, ever since Francis Galton first proposed the concept of eugenics a century and a half ago there have also been attempts to restrict their reproduction. Galton himself believed that the less intellectually capable would voluntarily accept appropriate limits on their behavior, but those who refused and continued to burden the society with their inferior offspring, he wrote ominously, would be “considered as enemies to the state.” And Spearman maintained that test scores should be used to determine “the right of having offspring.” One of the major successes of the eugenics movement, supported by many scientists, was the passage of laws authorizing involuntary sterilization of the “feeble-minded,” a practice that began early in the twentieth century and continued well into the post-war period.70

Similar to restriction of the franchise, involuntary sterilization is no longer acceptable, though some scientists have continued to stress the importance of non-coercive measures to stop the supposed dysgenic trend caused by the fecundity of the less intelligent. In 1963 the eminent University of Chicago physiological psychologist and pioneer in endocrinology Dwight J. Ingle, a member of the National Academy of Sciences, recommended quarantining those “poorly endowed with intelligence” in specific complexes—low IQ housing—where they would be provided with “an intensive program of birth control.” Throughout the next decade he continued to offer various plans for “selective population control,” typically by encouraging “barrenness … among the mentally dull” through subsidized sterilization or unspecified “material rewards.” By 1973 he was recommending that a group of professionals—scientists and physicians—determine the “genetic, … social, economic and behavioral fitness of the individual for parenthood,” a procedure that could be enforced by implanting “pellets of antifertility agents under the skin” of a woman, who would then “have to apply for a license to have the pellet removed in order to become pregnant.”71 Around the same time the Nobel Laureate physicist-turned-behavior-geneticist William Shockley warned that medical advances were now assuring “to all the privilege of reproducing their kind,” leading to proliferation of the less intelligent. To halt this trend he proposed a “Voluntary Sterilization Bonus Plan”: in exchange for agreeing to be sterilized, a person would receive $1000 for each IQ point below the population average of 100. And because he thought it most important to reach “those who are not bright enough to hear of the bonus on their own,” Shockley suggested that “bounty hunters” be paid a portion of the reward for persuading “low IQ high-bonus types to volunteer.” Arthur Jensen, too, viewed people with low IQs as “a burden on everyone, a disservice to themselves” and urged that “we should prevent their reproducing.” While he offered no specific proposal, he warned of the genetic deterioration from “current welfare policies, unaided by eugenic foresight,” clearly implying the necessity for some policy that would limit reproduction of the less intelligent.72 All three of these scientists emphasized that this dysgenic trend was much more severe within the black community, each claiming that the genetically least capable Blacks were producing the largest number of offspring. Indeed, each of them applied the same Orwellian phrase—“genetic enslavement”—specifically to Blacks, suggesting that their true shackles were now internal and that only control of reproduction could remove them.73
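The arithmetic of Shockley’s proposed bonus reduces to a simple formula; a minimal sketch (the symbolic form and the sample score are illustrative additions, not Shockley’s own):

\[
\text{bonus} \;=\; \$1000 \times \bigl(100 - \mathrm{IQ}\bigr), \qquad \text{so that a score of } 85 \text{ would yield } \$15{,}000 .
\]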

Again, it was Raymond Cattell who called for the most extreme measures. For Whites in his scheme, each person was to be assigned “a precise factor of fertility,” determining the “desirable number of offspring,” and the consequences for those who defied expert advice and brought “mentally backward children into the world in the face of recommendations to the contrary” would include sterilization, payment of a fine, and even incarceration. For less capable minorities, however, such individual distinctions were unnecessary. “Where it is obvious that the race concerned cannot hope to catch up in innate capacity,” he wrote early in his career, during the interwar period, it was appropriate to facilitate their extinction through “birth-control regulation, segregation, or human sterilization”; soon thereafter he cited “the Negro” as an example of the “lower mental capacity” that warranted such treatment. While such observations may have reflected the eugenics era’s zeitgeist, well into the 1970s he was still promoting the concept of “genthanasia” for “phasing out” a less capable group through “educational and birth control measures.” Though he now refrained from naming names, his reference to a group with low intelligence but resistance to malaria provided an unmistakable hint for those aware that many people of African descent carry the sickle cell gene, which confers a degree of resistance to the disease.74

In addition to the cognitive elite’s greater financial and civic value, and their greater value as progenitors of future generations, IQ scientists have also fostered a view of their lives as worth more in some deeper, fundamental sense, one that violates the traditional belief that all lives have equal value simply by virtue of their humanity. And because the lives of the cognitive elite matter more in this basic sense, the loss of their lives is considered a greater source of concern than the loss of the lives of their intellectual inferiors. When the philosopher Michael Scriven noted that “the worth of people and their rights do not depend on IQ,” for example, Shockley disagreed because, he claimed, test scores were correlated with other important traits, suggesting that human worth was indeed predicated on intelligence to some degree. And although the proposal in Stanley Kubrick’s classic film Dr. Strangelove—that in case of nuclear war persons with high IQs be selected for survival in underground shelters—was intended as black comedy, Shockley was not joking when he suggested that nuclear war might serve as a “grim possibility” for solving “the problem of the quality of the human race” by forcing society to select the most intelligent from among the survivors to perpetuate life on the planet. As always, Cattell did not bother with subtleties. To clinch the case that persons of different intelligence differed correspondingly in their innate value, he offered what he regarded as such an obvious and compelling example that it required no comment: “as if a motorist in an unavoidable situation would hesitate to run over … a feebleminded in preference to a healthy, bright child.”75 An IQ score is thus regarded more as a verdict than a measurement, a judgment of the value of one’s life.

Although Herrnstein and Murray ended The Bell Curve on a high note, emphasizing the importance of governmental policies that enable “people to live lives of dignity” no matter their intelligence, earlier in the concluding chapter they too hinted at the greater importance of the cognitive elite to the polity and of their children to the future welfare of the country. They didn’t suggest that anyone be deprived of the franchise, but they did find it essential that government remain the province of the “natural aristocracy.” They didn’t call for anyone to be involuntarily sterilized, but they did offer a eugenic rationale for their policy recommendation straight out of nineteenth-century Social Darwinism. A “society with a higher mean IQ is also likely to be a society with fewer social ills and brighter economic prospects,” wrote Herrnstein and Murray, and “the most efficient way to raise the IQ of a society is for smarter women to have higher birth rates than duller women.” But too many poor women, “disproportionately at the low end of the intelligence distribution,” were having children, portending “a future America with more social ills and gloomier economic prospects.” Thus, they argued, in words that could have been written by Herbert Spencer, providing financial assistance of any kind to the poor “subsidizes births” among those “who are also disproportionately at the low end of the intelligence distribution,” only “encouraging the wrong women” to reproduce, perpetuating their genetic deficiencies, and thereby undermining the intellectual level of the nation as a whole.76 And after half a thousand pages of data supposedly demonstrating that IQ scores were the major factor associated with more favorable outcomes for just about every meaningful social variable—income, employment, education, criminal behavior, health, infant mortality, and more—it was hard to escape the implication that the cognitive elite were just more valuable as human beings.

This notion that the lives of the more intelligent have greater innate worth has become a widespread meme, as exemplified in a New York Times article that appeared in June 1995. Headlined “Sudden End for 2 Who Had Everything to Live For,” the 900-word article described in some detail the lives of two persons gunned down at lunchtime in midtown Manhattan, both of them likely members of the cognitive elite: a computer graphics designer with a degree in communications from the University of Wisconsin, “bursting with creative energy,” and a Phi Beta Kappa graduate of Saint Olaf with a master’s degree in architecture from Yale and “a long list of clients.” An accompanying box with two-sentence sketches of each victim indicated, however, that seven people altogether had been killed in the rampage; the other five occupied mundane positions that apparently did not qualify them for the distinction conferred by the headline: a cab driver, a parking lot attendant, a blackjack dealer, a market company owner suspected of drug dealing, and the mother of the killer’s ex-girlfriend.77 Murray himself may have found the headline objectionable, but there is no doubt that it reflected The Bell Curve’s subtext as well as the thinking of numerous IQ researchers who preceded Herrnstein and Murray.

Although there may be a few specific contexts that justifiably require a monetary calculation of an individual life’s value—the settlement of an insurance claim for wrongful death, for example—the belief that some lives are intrinsically more valuable than others should be morally offensive. Notwithstanding the forced choice posed by Cattell, the death of someone with a low IQ is not ipso facto less grievous than the death of someone more cognitively capable. As the philosopher K. Anthony Appiah reminds us, every person, no matter their abilities, faces the “challenge of making a meaningful life. The lives of the less successful are not less worthy than those of others—but not because they are as worthy or more worthy. There is simply no sensible way of comparing the worth of human lives.”78 Even the “very dull,” those whom The Bell Curve characterized as soon to become “a net drag,” incapable of “putting more into the world than they take out,” have “everything to live for.”

Of course, this is not to deny that there are crucial positions in society requiring talent, education, and effort, and it is important to identify the people whose attributes make them most capable of fulfilling these roles and to provide material incentives encouraging them to do so. But quite apart from the question of whether the resulting income inequality is justifiable, differences in the inherent valuation of lives—in the esteem, dignity, and respect accorded to people—can be no less important as a source of human motivation. Indeed, it was these latter differences that played the more significant role in the 2016 presidential election.

Notes

1. R.J. Herrnstein, “In Defense of Bird Brains,” The Atlantic Monthly (September 1965): 101–104. C. Ingraham, “Forget Robots—The Goats are Coming for Our Jobs,” Washington Post, July 7, 2017.

2. R.J. Herrnstein, I.Q. in the Meritocracy (Boston: Little, Brown, 1973), 6–7; see also R.J. Herrnstein, “On Challenging an Orthodoxy,” Commentary (April 1973): 53. A.R. Jensen, “How Much Can We Boost IQ and Scholastic Achievement?” Harvard Educational Review 39 (1969): 1–123.

3. G. Piel, “…Ye May Be Mistaken,” in Genetic Destiny, ed. E. Tobach and H.M. Proshansky (New York: AMS Press, 1976), 132. B. Rice, “The High Cost of Thinking the Unthinkable,” Psychology Today (December 1973): 91.

4. R.J. Herrnstein, “I.Q.,” The Atlantic Monthly 228 (September 1971): 43–64. Herrnstein, I.Q. in the Meritocracy, 53. The controversy over the Atlantic article eventually brought new attention to the twin study cited by Herrnstein as the largest of its kind and the only one with data affirming the crucial assumption of no relation between the occupational statuses of the homes in which the separated twin pairs were raised; as a result the study was demonstrated to be scientifically worthless and probably fraudulent. See W.H. Tucker, “Fact and Fiction in the Discovery of Sir Cyril Burt’s Flaws,” Journal of the History of the Behavioral Sciences 30 (1994): 335–347, and W.H. Tucker, “Re-reconsidering Burt: Beyond a Reasonable Doubt,” Journal of the History of the Behavioral Sciences 33 (1997): 145–162.

5. Herrnstein, “I.Q.,” 62–64. F. Galton, English Men of Science: Their Nature and Nurture (London: Macmillan, 1874), 23.

6. Herrnstein, “I.Q.,” 63–64. Caploe, “Herrnstein in ‘The Atlantic’ Predicts American Meritocracy.”

7. Herrnstein, “I.Q.,” 63. Herrnstein, I.Q. in the Meritocracy, 215. M. Young, The Rise of the Meritocracy (New Brunswick, New Jersey: Transaction Publishers, 1958). M. Young, “Down with Meritocracy,” Guardian, June 28, 2001. University of North Carolina sociologist Bruce K. Eckland, too, took the book seriously, referring to “Young’s meritocracy [in which] talented adults rise to the top of the social hierarchy and the dull fall or remain at the bottom”; see B.K. Eckland, “Genetics and Sociology: A Reconsideration,” American Sociological Review 32 (1967): 181.

8. E. Barker, The Politics of Aristotle (London: Oxford University Press, 1946), 17. A. Bloom, The Republic of Plato (New York: Basic Books, 1968), 94.

9. D.G. Ritchie, Natural Rights: A Criticism of Some Political and Ethical Conceptions (New York: Macmillan, 1903), 258. F.H. Hankins, “Individual Differences and Democratic Theory,” Political Science Quarterly 38 (1923): 409.

10. E.G. Boring, “Lewis Madison Terman: 1877–1956,” Biographical Memoirs of the National Academy of Sciences 33 (1959): 414. L.M. Terman, The Measurement of Intelligence (Cambridge, Massachusetts: Riverside Press, 1916), 91–92. L.M. Terman, “The Significance of Intelligence Tests for Mental Hygiene,” Journal of Psycho-Asthenics 18 (1914): 124. L.M. Terman, The Intelligence of School Children (Boston: Houghton Mifflin, 1919), 270, 288.

11. C. Burt, “Psychological Tests for Scholarship and Promotion,” School 13 (1925): 741. C. Burt, “Individual Psychology and Social Work,” Charity Organization Review 43 (1918): 18. On the posthumous exposure of Burt’s research see supra, note 4.

12. C. Spearman, The Abilities of Man (New York: Macmillan, 1927), 8.

13. Young, The Rise of the Meritocracy, 85–87.

14. D.R. Caploe, “Herrnstein in ‘The Atlantic’ Predicts American Meritocracy,” Harvard Crimson, September 22, 1971. Herrnstein, I.Q. in the Meritocracy, 200–201.

15. Herrnstein and Murray, The Bell Curve, 520.

16. Herrnstein and Murray, The Bell Curve, 443, 442. R. Spencer, “The Charlottesville Statement” appears on the alt-right website, https://altright.com/2017/08/11/what-it-means-to-be-alt-right/.

17. Herrnstein and Murray, The Bell Curve, 518, 523, 526.

18. Ibid., 528–532.

19. Ibid., 541–545.

20. Murray is one of five writers contributing to I.M. Stelzer, “The Shape of Things to Come,” National Review (July 8, 1991): 30. C. Goldin and R. Margo, “The Great Compression: The Wage Structure in the United States at Mid-Century,” Quarterly Journal of Economics 107 (1992): 1–34. Analysis of U.S. Census Bureau data in L. Mishel and J. Bernstein, The State of Working America 1994–95 (M.E. Sharpe, 1994), 37. The decline from 48 to 20 percent is cited in R. Frank, Richistan: A Journey Through the American Wealth Boom and the Lives of the New Rich (New York: Three Rivers, 2007), 39. D. Bell, “On Meritocracy and Equality,” Public Interest 29 (1972): 64.

21. See “Trends in the Distribution of Income,” CBO Blog, October 25, 2011, https://www.cbo.gov/publication/42537. E. Saez and G. Zucman, “Alexandria Ocasio-Cortez’s Tax Hike Idea Is Not About Soaking the Rich,” New York Times, January 22, 2019. D.C. Johnston, “Richest Are Leaving Even the Rich Far Behind,” New York Times, June 5, 2005, 1.

22. E. Saez and G. Zucman, “Wealth Inequality in the United States Since 1913: Evidence from Capitalized Income Tax Data,” NBER Working Paper Series, http://gabriel-zucman.eu/files/SaezZucman2014.pdf. C. Collins and J. Hoxie, Billionaire Bonanza: The Forbes 400 and the Rest of Us (Washington, DC: Institute for Policy Studies, November 2017), 2, 4. E. Sherman, “America is the Richest, and Most Unequal, Country,” Fortune, September 30, 2015, http://fortune.com/2015/09/30/america-wealth-inequality/. R. Menon, “The United States Has a National-Security Problem—And It’s Not What You Think,” Nation, July 16, 2018, https://www.thenation.com/article/united-states-national-security-problem-not-think/.

23. T. Piketty, Capital in the Twenty-first Century (Cambridge, Mass.: Belknap/Harvard University Press, 2014), 264, 265.

24. D. Markovits, The Meritocracy Trap: How America’s Foundational Myth Feeds Inequality, Dismantles the Middle Class, and Devours the Elite (New York: Penguin, 2019), 5, 98.

25. Ibid., 105–106.

26. On the accurate story behind Hemingway’s misquoted comment, see E. Dow, “The Rich Are Different,” letter to the editor, New York Times, November 13, 1988, 70. P. Fussell, Class: A Guide Through the American Status System (New York: Summit Books, 1983), 29–30. N.D. Schwartz, The Velvet Rope Economy: How Inequality Became Big Business (New York: Doubleday, 2020), 16, 4. On the coronavirus vaccine, see S. Kahn, “How Rich People Will Cut the Line for the Coronavirus Vaccine,” Washington Post, December 18, 2020.

27. Herrnstein, I.Q. in the Meritocracy, 124; also Herrnstein, “I.Q.,” 51.

28. Herrnstein, “I.Q.,” 64.

29. Ibid., 51, 63. N. Chomsky, “Psychology and Ideology,” Cognition: International Journal of Cognitive Psychology 1 (1972): 39. For the complete list of occupations by IQ, see Herrnstein, I.Q. in the Meritocracy, 120–121. R.J. Herrnstein, “Whatever Happened to Vaudeville? A Reply to Professor Chomsky,” Cognition: International Journal of Cognitive Psychology 1 (1972): 303.

30. S. Brill, Tailspin: The People and Forces Behind America’s Fifty-Year Fall—And Those Fighting to Reverse It (New York: Alfred A. Knopf, 2018), 26. L. Uchitelle, “Lure of Great Wealth Affects Career Choices,” New York Times, November 27, 2006, https://www.nytimes.com/2006/11/27/business/27richer.html. Markovits, The Meritocracy Trap, 239.

31. P. Krugman, “Gilded Once More,” New York Times, April 27, 2007, A27. “Top 25 Highest Paid Hedge Fund Managers of 2008,” March 26, 2009, https://www.marketfolly.com/2009/03/top-25-highest-paid-hedge-fund-managers.html. P. Krugman, “Bernie Sanders and the Myth of the 1 Percent,” New York Times, April 18, 2019, A25. G. Thompson, “Meet the Wealth Gap,” Nation, June 30, 2008, 20. S. Polk, “For the Love of Money,” New York Times, January 19, 2014, SR1.

32. A. Griswold, “Harvard Grads Are Still Flocking to Finance,” Moneybox: A Blog About Business and Economics, May 27, 2014, www.slate.com/blogs/moneybox/2014/05/27/harvard_class_of_2014_elite_grads_are_still_flocking_to_finance_and_consulting.html; W.F. Morris IV, “Harvard’s Wall Street Problem,” Crimson, February 17, 2016, https://www.thecrimson.com/article/2016/2/17/-Wall-Street-Problem-Morris/. C. Rampell, “Out of Harvard and Into Finance,” New York Times, December 21, 2011, https://economix.blogs.nytimes.com/2011/12/21/out-of-harvard-and-into-finance/. C. Simon, “Prestige Draws Penn Students to Finance and Consulting after Graduation—But at What Cost?” Daily Pennsylvanian, April 26, 2017, https://www.thedp.com/article/2017/04/consulting-finance-popularity.

33. A. Roe, “A Psychologist Examines 64 Eminent Scientists,” Scientific American 187 (1952): 22, 25. C. Murray, In Pursuit: Of Happiness and Good Government (New York: Simon and Schuster, 1988), 235, 238. Markovits, The Meritocracy Trap, 40, 192–194.

34. J. Rothwell, “Myths of the 1 Percent: What Puts Some People at the Top,” New York Times, November 24, 2017, B2.

35. See H.R. Gold, “Never Mind the 1 Percent. Let’s Talk About the 0.01 Percent,” Chicago Booth Review, Winter 2017/18, http://review.chicagobooth.edu/economics/2017/article/never-mind-1-percent-lets-talk-about-001-percent. Markovits, The Meritocracy Trap, 164. The comment by Carnegie was cited in J. Madrick, “How to Succeed in Business,” New York Review of Books (April 18, 1996): 22. W.D. Cohan, “Lehman’s Demise, Dissected,” New York Times, March 18, 2010, https://opinionator.blogs.nytimes.com/2010/03/18/lehmans-demise-dissected/.

36. D. Bok, The Cost of Talent: How Executives and Professionals Are Paid and How It Affects America (New York: Free Press, 1993), 238.

37. W.D. Cohan, “When Lenders Push Borrowers Over the Edge,” New York Times, May 13, 2019, A19. L. Story and E. Dash, “Bankers Reaped Lavish Bonuses During Bailouts,” New York Times, July 30, 2009, A1. See also B. Whiteman, “What Red Ink? Wall Street Paid Hefty Bonuses,” New York Times, January 28, 2009, A1. S. Anderson, Off the Deep End: The Wall Street Bonus Pool and Low-Wage Workers (Washington: Institute for Policy Studies, March 8, 2016), 2.

38. B. Covert, “Everyone Must Go,” Nation 308 (May 6, 2019): 23. E. Appelbaum and R. Batt, “Private Equity Pillage: Grocery Stores and Workers at Risk,” American Prospect (Fall 2018). P. Whoriskey and D. Keating, “Overdoses, Bedsores, Broken Bones: What Happened When a Private-Equity Firm Sought to Care for Society’s Most Vulnerable,” Washington Post, November 25, 2018. M. Corkery and B. Protess, “How the Twinkie Made the Superrich Even Richer,” New York Times, December 11, 2016, A1.

39. S. Pizzigati, The Case for a Maximum Wage (Cambridge, UK: Polity, 2018).

40. W.G. Sumner, The Challenge of Facts and Other Essays, ed. A.G. Keller (New Haven: Yale University Press, 1914), 90. Thorndike, Human Nature and the Social Order, 95. C. Wright Mills, The Power Elite (New York: Oxford University Press, 2000), 118.

41. Mills, The Power Elite, 116, 129. T. Cowen, “CEOs Are Not Overpaid,” Time, April 22, 2019, 22.

42. B. Saporito, “C.E.O. Pay, America’s Economic ‘Miracle,’” New York Times, May 17, 2019, https://www.nytimes.com/2019/05/17/opinion/ceo-pay-raises.html. Morgan is cited in G.S. Crystal, In Search of Excess: The Overcompensation of American Executives (New York: W. W. Norton, 1991), 24. P.F. Drucker, “Reform Executive Pay or Congress Will,” Wall Street Journal, April 24, 1984, 34. Drucker’s letter is quoted in J. McGregor, “What’s the Right Ratio for CEO-to-Worker Pay?” Washington Post, September 19, 2013. I. Salisbury, “The Average CEO Makes as Much Money in One Day as the Typical Worker Earns in a Full Year,” Time, May 22, 2018, http://money.com/money/5287123/ceo-pay-afl-cio/. D. Gelles, “Where the Buck Doesn’t Stop,” New York Times, May 27, 2018, BU6.

43. See the Wikipedia entry for Charles Erwin Wilson. Markovits, The Meritocracy Trap, 183, 176.

44. Crystal, In Search of Excess, 11, 31. Bok, The Cost of Talent, 16, 111, 100.

45. L. Bebchuk and J. Fried, Pay Without Performance: The Unfulfilled Promise of Executive Compensation (Cambridge: Harvard University Press, 2004), chapter 8.

46. Bok, The Cost of Talent, 117–118.

47. Bebchuk and Fried, Pay Without Performance, 133. “Ford CEO: $28M for 4 Months Work,” April 5, 2007, https://money.cnn.com/2007/04/05/news/companies/ford_execpay/. J. Creswell and M. Barbaro, “Home Depot Board Ousts Chief, Saying Goodbye With Big Check,” New York Times, January 4, 2007, A1. M. Huckman, “Pfizer’s McKinnell: The $200 Million Dollar Man,” December 22, 2006, https://www.cnbc.com/id/16326224. See the Wikipedia entry on Hewlett-Packard: https://en.wikipedia.org/wiki/Hewlett-Packard. D. Gelles, “Muilenburg, Fired C.E.O., Will Receive More Than $60 Million,” New York Times, January 10, 2020. C. Reinicke, “Boeing Sees $11 Billion of Market Value Erased in Just 2 Days as Its 737 MAX Disaster Worsens,” Markets Insider, December 17, 2019, https://markets.businessinsider.com/news/stocks/boeing-stock-price-falls-erases-billions-2-days-737-max-halt-2019-12-1028769301.

48. Bebchuk and Fried, Pay Without Performance, 181.

49. A. Bhattarai and D. Santamariña, “Bonuses Before Bankruptcy: Companies Doled Out Millions to Executives Before Filing for Chapter 11,” Washington Post, October 26, 2020.

50. Mills, The Power Elite, 56, 131.

51. Brill, Tailspin, 29–31, 56. Markovits, The Meritocracy Trap, 11, 184.

52. Ibid., 111–114.

53. Bok, The Cost of Talent, 240.

54. Brill, Tailspin, 106, 110.

55. E. Lipton and K.P. Vogel, “Fine Print of Stimulus Package, Special Deals for Certain Industries,” New York Times, March 6, 2020, A8. A. Abramson, “Federal Stimulus Spending Is Giving the Lobbying Industry a Giant Windfall,” Time, May 2, 2020. Quoted in J. Drucker, “Bonanza Hides in a Rescue Package,” New York Times, April 25, 2020, B1.

56. Markovits, The Meritocracy Trap, 54, 58. J. Eisinger, J. Ernsthausen, and P. Kiel, “The Secret IRS Files: Trove of Never-Before-Seen Records Reveal How the Wealthiest Avoid Income Tax,” ProPublica, June 8, 2021, https://www.propublica.org/article/the-secret-irs-files-trove-of-never-before-seen-records-reveal-how-the-wealthiest-avoid-income-tax.

57. For details of their offenses and sentences, see the Wikipedia pages for each of the executives. J. Eisinger, The Chickenshit Club: Why the Justice Department Fails to Prosecute Executives (New York: Simon & Schuster, 2017), 318.

58. Ibid., 199–200, xix.

59. Markovits, The Meritocracy Trap, 267.

60. See the transcript of Kelleher’s appearance on “The Beat with Ari Melber,” March 9, 2020, http://www.msnbc.com/transcripts/msnbc-live-with-ari-melber/2020-03-09.

61. R. Niebuhr, Moral Man and Immoral Society: A Study in Ethics and Politics (New York: Charles Scribner’s Sons, 1932), 8. Herrnstein, “I.Q.,” 53.

62. A.F. Rubalcava, “Recruit This, McKinsey,” Harvard Crimson, November 26, 2001, https://www.thecrimson.com/article/2001/11/26/recruit-this-mckinsey-times-are-tough/.

63. K. Ho, Liquidated: An Ethnography of Wall Street (Durham, N.C.: Duke University Press, 2009), 39–40, 76. Markovits, The Meritocracy Trap, 168.

64. L.A. Rivera, “Ivies, Extracurriculars, and Exclusion: Elite Employers’ Use of Educational Credentials,” Research in Social Stratification and Mobility 29 (2011): 72–73, 78–80. See also L.A. Rivera, Pedigree: How Elite Students Get Elite Jobs (Princeton, N.J.: Princeton University Press, 2015).

65. N. Barofsky, Bailout: An Inside Account of How Washington Abandoned Main Street While Rescuing Wall Street (New York: Free Press, 2012), 138, 139.

66. H.H. Goddard, Human Efficiency and Levels of Intelligence (Princeton, N.J.: Princeton University Press, 1920), vi, 100–101. Thorndike, Human Nature and the Social Order, 370–372.

67. Spearman, The Abilities of Man, 8. Goddard, Human Efficiency and Levels of Intelligence, 99. H.H. Goddard, Psychology of the Normal and Subnormal (New York: Dodd, Mead, 1919), 234. L.M. Terman, The Measurement of Intelligence (Cambridge, Mass.: Riverside, 1916), 91–92. L.M. Terman, “The Psychological Determinist; or Democracy and the IQ,” Journal of Educational Research 6 (1922): 62.

68. G.B. Cutten, “The Reconstruction of Democracy,” School and Society 16 (1922): 478–481. W. McDougall, Ethics and Some Modern World Problems (London: Methuen, 1925), 156–163.

69. R.B. Cattell, The Fight for Our National Intelligence (London: P.S. King, 1937), 59, 109. R.B. Cattell, Beyondism: Religion from Science (New York: Praeger, 1987), 113, 114, 223, 224. R.P. Oliver, Christianity and the Survival of the West (Cape Canaveral: Howard Allen, 1973), 75.

70. F. Galton, “Hereditary Improvement,” Fraser’s Magazine 7 (1873): 129. Spearman, The Abilities of Man, 8. On the history, see P. Reilly, The Surgical Solution: A History of Involuntary Sterilization in the United States (Baltimore: Johns Hopkins University Press, 1991).

71. D.J. Ingle, I Went to See the Elephant (New York: Vantage, 1963), 213. D.J. Ingle, “Genetic Bases of Individuality and of Social Problems,” Zygon: Journal of Religion and Science 6 (1971): 182, 189. D.J. Ingle, Who Should Have Children? (New York: Bobbs-Merrill, 1973), 102, 115.

72. See Shockley’s address as a Nobel Laureate in Genetics and the Future of Man, ed. J.D. Roslansky (New York: Appleton-Century-Crofts, 1965), 67. W. Shockley, “Dysgenics, Geneticity, Raceology: A Challenge to the Intellectual Responsibility of Educators,” Phi Delta Kappan 53, special supplement (January 1972): 306. On the suggestion of “bounty hunters,” see A.R.S. Goodell, The Visible Scientists (Stanford: ProQuest Dissertations Publishing, 1975), 339–340. J. Fincher, “Arthur Jensen: In the Eye of the Storm,” Human Behavior (March/April 1972): 22. Jensen, “How Much Can We Boost IQ and Scholastic Achievement?” 95.

73. D.J. Ingle, letter from the editor, Perspectives in Biology and Medicine 11 (1968): 713. W. Shockley, letter to the editor, Scientific American 224 (January 1971): 6. Jensen, “How Much Can We Boost IQ and Scholastic Achievement?” 95.

74. R.B. Cattell, Psychology and Social Progress: Mankind and Destiny from the Standpoint of a Scientist (London: C.W. Daniel, 1933), 317, 322, 323, 360. R.B. Cattell, The Fight for Our National Intelligence (London: P.S. King, 1937), 56. R.B. Cattell, A New Morality from Science: Beyondism (New York: Pergamon, 1972), 153–154, 221.

75. M. Scriven, “The Values of the Academy,” Review of Educational Research 40 (1971): 546. W. Shockley, “Negro IQ Deficit: Failure of a ‘Malicious Coincidence’ Model Warrants New Research Proposals,” Review of Educational Research 41 (1971): 243. “Is Quality of U.S. Population Declining? Interview With a Nobel Prize-winning Scientist,” U.S. News & World Report, November 22, 1965, 71. Cattell, The Fight for Our National Intelligence, 67–68.

76. Herrnstein and Murray, The Bell Curve, 530, 548, 551.

77. C. Goldberg, “Sudden End for 2 Who Had Everything to Live For,” New York Times, June 23, 1995, B2.

78. On efforts to assign monetary value to a life, see H.S. Friedman, Ultimate Price: The Value We Place on Life (Berkeley: University of California Press, 2020). K.A. Appiah, “The Red Baron,” New York Review of Books (October 11, 2018): 23.