The Biological Inferiority of the Undeserving Poor
by Michael B. Katz
. . . if the misery of our poor be caused not by the laws of nature, but by our institutions, great is our sin. . . .
— Charles Darwin (1839)
For most of recorded history, poverty reflected God’s will. The poor were always with us. They were not inherently immoral, dangerous, or different. They were not to be shunned, feared, or avoided. In the late eighteenth and early nineteenth centuries, a harsh new idea of poverty and of poor people as different and inferior began to replace this ancient biblical view. In what ways, exactly, are poor people different from the rest of us? The question became – and remains – a burning one, answered with moral philosophy, political economy, social science, and, eventually, biology. Why did biological conceptions of poverty wax and wane over the last century and a half? What forms have they taken? What have been their consequences?
The biological definition of poverty reinforces the idea of the undeserving poor, the oldest theme in post-Enlightenment poverty discourse. Its history stretches from the late eighteenth century to the present. Poverty, in this view, results from personal failure and inferiority. Moral weaknesses – drunkenness, laziness, sexual promiscuity – constitute the most consistent markers of the undeserving poor. The idea that a culture of poverty works its insidious influence on individuals, endowing them with traits that trap them in lives of destitution, entered both scholarly and popular discourse somewhat later and endures to this day. Faulty heredity forms the third strand in the identification of the undeserving poor; backed by scientific advances in molecular biology and neuroscience, it is enjoying a revival. The historical record shows this idea to have been scientifically dubious, ethically suspect, politically harmful, and, at its worst, lethal. That is why we should pay close attention to its current resurgence.
This article excavates the definition of poor people as biologically inferior. It not only documents the idea’s persistence over time but emphasizes three themes. First, the concept rises and falls in prominence in response to institutional and programmatic failure. It offers a convenient explanation for why the optimism of reformers proved illusory or why social problems remained refractory despite efforts to eliminate them. Second, its initial formulation and later reformulations rely on bridging concepts that try to parse the distance between heredity and environment through a kind of neo-Lamarckianism. These bridges invariably crumble. Third, hereditarian ideas have always been supported by the best science of the day. This was the case with the ideas that ranked “races,” underpinned immigration restriction, and encouraged compulsory sterilization – as well as those that have written off the intellectual potential of poor children.
In its review of the biological strand in American ideas about poverty, this article begins in the 1860s, with the earliest application of hereditarian thought I have discovered, and moves forward through social Darwinism, eugenics, immigration restriction, and early IQ testing. It then picks up the story with Arthur Jensen’s famous 1969 article in the Harvard Educational Review, follows it to The Bell Curve, and ends with the astonishing rise of neuroscience and the field of epigenetics. It concludes by arguing that, despite the intelligence, skill, and good intentions of contemporary scientists, the history of biological definitions of poor persons calls for approaching the findings of neuroscience with great caution.
***
In 1866 the Massachusetts Board of State Charities, which had oversight of the state’s public institutions, wrote, “The causes of the evil [“the existence of such a large proportion of dependent and destructive members of our community”] are manifold, but among the immediate ones, the chief cause is inherited organic imperfection – vitiated constitution or poor stock.” This early proclamation of the biological inferiority of the undeserving poor arose as a response to institutional failure. Recurrent institutional and programmatic failure has kept it alive in writing about poverty ever since, supported always by scientific authority.
Beginning in the early nineteenth century, reformers sponsored an array of new institutions designed to reform delinquents, rehabilitate criminals, cure the mentally ill, and educate children. Crime, poverty, and ignorance, in their view, were not distinct problems. The “criminal,” “pauper,” and “depraved” represented potentialities inherent in all people and triggered by faulty environments. Poverty and crime, for instance, appeared to cause each other and to occur primarily in cities, most often among immigrants. This stress on the environmental causes of deviance and dependence, prominent in the 1840s, underpinned the first reform schools, penitentiaries, mental hospitals, and even public schools.
By the mid-1860s it had become clear that none of the new institutions built with such optimism had reached their goals. They manifestly failed to rehabilitate criminals, cure the mentally ill, reeducate delinquents, or reduce poverty and other forms of dependence. The question was, why? Answers did not look hard at failures in institutional design and implementation or at the contexts of inmates’, prisoners’, and patients’ lives. Rather, they settled on individual-based explanations: inherited deficiencies. The Massachusetts Board of State Charities believed that the inheritance of acquired characteristics (later known as Lamarckianism) reproduced the undeserving poor as well as criminals, the mentally ill, and other depraved and dependent individuals. It supported this belief with scientific evidence from physiologists, who emphasized the toxic impact of large amounts of alcohol in stimulating the “animal passions” and repressing the “will.”
The State Board’s gloomy emphasis on heredity did not lead it to pessimistic conclusions, however. It believed, rather, in the body’s recuperative power over time. Vice had a standard deviation that, if not exceeded, could be eradicated by the body’s natural capacity for healing. In fact, the Board still believed that the persistence of crime and poverty was “phenomenal – not essential in society . . . their numbers depend on social conditions within human control.” The Board had revealed the source of social pathologies through the scientific study of heredity; through the scientific study of society it would excavate the laws governing their prevention.
The Board started out with an ideology prefiguring eugenics and ended with one anticipating Progressivism. Its early bridge between heredity and environmentalism, or biology and reform, was crossed by reformers for only a relatively short time before social Darwinism broke it. Rebuilt in the early twentieth century, the bridge was demolished once more by eugenicists and their successors, then reconstructed yet again in the early twenty-first century by the proponents of epigenetics.
By the 1920s, two initially separate streams – social Darwinism and eugenics – converged in the hard-core eugenic theory that justified racism and social conservatism. Social Darwinism attempted to apply the theory of Darwinian evolution to human behavior and society. Social Darwinists – whose leading spokesperson, Herbert Spencer, enjoyed a triumphant tour of the U.S. in 1882 – insisted on the heritability of socially harmful traits, including pauperism, mental illness, and criminality, and on the harmful effects of public and private charities that interfered with the survival of the fittest. They viewed the “unfit” not only as unworthy losers but as savage throwbacks to a primitive life. Hereditarian beliefs thus fed widespread fears of “race suicide,” giving urgency to the problem of population control. The “ignorant, the improvident, the feeble-minded, are contributing far more than their quota to the next generation,” warned Frank Fetter of Cornell University.
The English scientist Francis Galton coined the term eugenics in 1883 to denote the improvement of human stock by giving “the more suitable races or strains of blood a better chance of prevailing speedily over the less suitable.” In the United States, eugenic “science” owed more to the genetic discoveries of Gregor Mendel, first published in 1866 but unrecognized until the end of the century, than to mathematical genetics as practiced by Galton and his leading successor, Karl Pearson. In 1904 Charles Davenport, the leading US eugenics promoter, used funds from the newly established Carnegie Institution of Washington to set up a laboratory at Cold Spring Harbor on Long Island. Davenport looked forward to the “new era” of cooperation among the sociologist, legislator, and biologist, who together would “purify our body politic of the feeble-minded, and the criminalistic and the wayward by using the knowledge of heredity.” Eugenics entered public policy through its influence on immigration restriction and social reform as well as through state sterilization laws. Indiana passed the first of these in 1907. By the end of the 1920s, twenty-four states had passed laws permitting the involuntary sterilization of the mentally unfit, a practice upheld by the U.S. Supreme Court in 1927 in Buck v. Bell.
In the United States, the application of evolutionary and genetic ideas to social issues gained traction in the late nineteenth century as a tool for explaining and dealing with the vast changes accompanying industrialization, urbanization, and immigration. Eugenics drew support from both conservatives and progressives and underlay the emerging consensus on the need for immigration restriction that resulted in the nationality-based immigration quotas legislated by Congress in 1924. “In the early twentieth century,” point out Hilary Rose and Steven Rose in Genes, Cells, and Brains, “barring Catholics, eugenics commanded the support of most EuroAmerican intellectuals – not just racists and reactionaries but feminists, reformers, and Marxists.” Conservatives found in eugenics and social Darwinism justification for opposing public and private charities that would contribute to the reproduction of the unfit. But eugenics found enthusiasts as well in the birth control advocate Margaret Sanger and in settlement house workers preoccupied with the alleged degeneracy of an immigrant working class. Like their predecessors on the Massachusetts Board of State Charities decades earlier, they turned to the heritability of acquired characteristics and the plasticity of human nature to reconcile their belief in the biological foundation of physical and moral degeneration with their commitment to the power of social reform to build character and instill habits.
Nonetheless, by the 1920s, cracks appeared in the bridge that linked the environmentalists and hereditarians. Hereditarians took an increasingly hard line, manifest in the new science of intelligence testing as well as in their continued advocacy of sterilization. Developed by the French psychologist Alfred Binet, intelligence tests were brought to the United States in 1908 by the American psychologist Henry H. Goddard, who first applied them at the Vineland, New Jersey, Training School for Feeble-Minded Boys and Girls, where he directed the new laboratory for the study of mental deficiency. Other psychologists picked up Goddard’s work on intelligence testing, extended it to other populations, and experimented with different methods. Lewis Terman at Stanford, one of the most prominent of these psychologists and a proponent of the hereditarian view of intelligence, introduced the term “IQ,” which stood for “intelligence quotient,” a concept developed in 1912 by the German psychologist William Stern. Intelligence testing, which at first aroused skepticism and hostility, received a tremendous boost during World War I, when a trial of the tests on more than 1.7 million draftees dramatically brought them to public attention. The tests purported to show that nearly one-fourth of the draft army could not read a newspaper or write a letter home and, by implication, that the mental ages of the average white and black Americans were, respectively, thirteen and ten.
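The quotient itself, in the classic ratio formulation generally credited to Stern (and scaled by 100 in Terman’s Stanford-Binet), was simple arithmetic:

$$ \mathrm{IQ} = \frac{\text{mental age}}{\text{chronological age}} \times 100 $$

A ten-year-old testing at a mental age of eight would thus score 80; one testing at a mental age of twelve, 120.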
Davenport, Goddard, and others blamed the results for whites on the immigration of inferior races and used them as ammunition in their advocacy of immigration restriction. The tests, they argued, demonstrated the genetic heritability of mental deficiency. These ideas worked their way into public education in the 1920s, underpinning the educational psychology taught in teacher preparation courses and the massive upsurge in testing used to classify students, predict their futures, and justify unequal educational outcomes. “Terman and other psychologists,” points out historian Paula Fass, “were quick to point out that opening up avenues of opportunity to the children of the lower socioeconomic groups probably made no sense; they did not have the I.Q. points to compete.” In the minds of its prominent advocates, intelligence testing was linked with beliefs that science had demonstrated the primacy of heredity over environment and that the immigration of inferior races was driving America toward a dysgenic future.
Even before the 1920s, strains between eugenicists and reformers had opened fissures in the consensus around the heritability of mental and character defects. Eugenicists’ commitment to “germ plasm” pulled them away from the environmental and neo-Lamarckian theories underpinning Progressive reform. Then, after the 1920s, biochemistry and the rise of the Nazis combined to drive eugenics into eclipse and disrepute. The more research revealed about the complexity of human genetics, the less defensible even reform eugenics appeared. The American Eugenics Society praised Hitler’s 1933 sterilization law; German eugenicists flattered their American counterparts by pointing out the debt they owed them; and the Nazi regime welcomed and honored prominent American eugenicists.
The fall of eugenics left the field open to environmental explanations. Nurture rather than nature became the preferred explanation for crime, poverty, delinquency, and low educational achievement. The emphasis on environment fit with the emergent civil rights movement, which rejected racial, or biological, explanations for differences between blacks and whites – explanations that had been used to justify slavery, lynching, segregation, and every other form of violent and discriminatory activity. Hereditarian explanations fit badly, too, with the optimism underlying the War on Poverty and the Great Society, which assumed the capacity of intelligent government action to ameliorate poverty, ill health, unemployment, and crime.
Nonetheless, by the late 1960s a new eugenics began to challenge the environmental consensus. Its appearance coincided with the white backlash against government-sponsored programs favoring African Americans and with the disenchantment that followed the apparent failure of compensatory education programs designed to make up for the culturally deficient home life of poor, especially poor black, children. Psychologist Arthur R. Jensen’s 1969 article in the Harvard Educational Review, “How Much Can We Boost IQ and Scholastic Achievement?” led the revival of hereditarianism. “Compensatory education,” Jensen argued, “has been tried and it apparently has failed.” The reason was that compensatory education programs ran up against a genetic wall: poor, minority children lacked the intelligence to profit from them.
Jensen’s article provoked a furious counter-attack. Nonetheless, the controversy breathed new life into research and writing on the influence of heredity on intelligence and seeped into the rationales for failure offered by educators. (I recall sitting in a meeting in the early 1970s with a high-level Toronto school administrator who, in a discussion of the low achievement of poor students, said, in effect, “well, Jensen has told us why.”)
The new field of sociobiology, founded by Harvard zoologist E. O. Wilson, a leading authority on insect societies, reinforced the renewed emphasis on heritability. Sociobiology, Wilson wrote, focused on “the study of the biological basis of social behavior in every kind of organism, including man.” This new emphasis on heritability, however, met strong scientific as well as political criticism and failed to clear away the taint that still clung to eugenics and genetically based theories of race, intelligence, and behavior. The idea that the undeserving poor were genetically inferior had not been wiped from the map by any means, but it remained muted, unacceptable in most academic circles.
In 1994, in their widely publicized and discussed The Bell Curve, Richard Herrnstein and Charles Murray – whose notorious Losing Ground had served as a bible for anti-welfare state politicians – challenged the reigning environmentalist view of intelligence. Success in American society, they argued, was increasingly a matter of the genes people inherit. Intelligence, in fact, had a lot to do with the nation’s “most pressing social problems,” such as poverty, crime, out-of-wedlock births, and low educational achievement. They wrote that “low intelligence is a stronger precursor of poverty than low socioeconomic background.” Poverty, they argued, “is concentrated among those with low cognitive ability,” which, itself, was largely inherited. It also was racially tinged because blacks, they found, revealed lower cognitive ability at every socioeconomic level. Evidence points “toward a genetic factor in cognitive ethnic differences” because “blacks and whites differ most on tests” measuring “g, or general intelligence,” which is a fixed, inherited index of mental capacity.
In Inequality by Design, a powerful demolition of The Bell Curve, Claude Fischer and his colleagues show how Murray and Herrnstein misused their principal sources, leaving their empirical conclusions utterly unreliable and their larger argument in shambles. Nonetheless, despite assaults in the public media and by scholars, hundreds of thousands of copies of the 800-plus-page hardcover edition of the book were sold. The Bell Curve is best understood not as a popularization of science but as an episode in the sociology of knowledge. Clearly, even if it often did not dare speak its name, the suspicion remained alive that heredity underlay the growth and persistence of the “underclass” and the black-white gap in educational achievement, which seemed to many impervious to increased public spending or reform. This suspicion was nurtured by a small set of academics and some foundations, like the Pioneer Fund, which claims that it “has changed the face of the social and behavioral sciences by restoring the Darwinian-Galtonian perspective to the mainstream of traditional fields such as anthropology, psychology, and sociology, as well as fostering the newer disciplines of behavioral genetics, neuroscience, evolutionary psychology, and sociobiology.”
From the 1990s onward, a profusion of new scientific technologies has provided the tools with which to explore mechanisms underlying the linkages between biology and society and fostered the astounding growth of the bioscience industry in genetics (the Human Genome Project), stem cell research, and, most recently, neuroscience. Teachers, point out Hilary and Steven Rose, “report receiving up to seventy mailshots a year promoting a variety of neurononsense. . . . The snake-oil entrepreneurs are in there selling hard to teachers who are without the protection provided by clinical trials” and other tools available to physicians.
With astonishing acceleration, neuroscience, evolutionary psychology, genomics, and epigenetics emerged as important scientific fields – in practice, often combined in the same programs. Neuroscience and other biological advances promised new ways of explaining social phenomena, like crime, and medical issues, such as the black-white gap in cardiovascular disease, the increase in diabetes, the rise of obesity, and the origins and treatment of cancer. They promised, as well, the possibility of understanding how the brain ages and how Alzheimer’s disease and dementia might be mitigated or delayed. Research focuses, too, on how the environmental stresses associated with poverty in childhood can damage aspects of mental functioning and learning capacity with lasting impact throughout individuals’ lives – and, some scientists believe, beyond, through the inheritance of acquired deficiencies.
In its January 18, 2010, cover story, Time announced, “The new field of epigenetics is showing how your environment and your choices can influence your genetic code – and that of your kids.” Epigenetics, the article explained, “is the study of changes in gene activity that do not involve alterations to the genetic code but still get passed down to at least one generation. These patterns of gene expression are governed by the cellular material – the epigenome – that sits on top of the genome, just outside it. . . . It is these ‘epigenetic’ marks that tell your genes to switch on or off, to speak loudly or whisper.” It is through epigenetic marks that environmental factors like diet, stress, and prenatal nutrition, which “can make an imprint on genes,” are transmitted across generations. More soberly, the eminent child psychiatrist Sir Michael Rutter offered this definition: “The term ‘epigenetics’ is applied to mechanisms that change genetic effects (through influences on gene expression) without altering gene sequence.” “Epigenetic studies,” Hilary and Steven Rose report,
are uncovering a dazzling array of regulatory processes by which signaling molecules – sometimes themselves proteins, sometimes small molecules, some generated internally by each cell, some diffusing from other regions of the developing foetus – act as switches, turning particular stretches of DNA on or off so as to ensure that particular proteins are synthesized at the appropriate moment in the development sequences. Alterations in the timing of these switches may result in huge changes in the adult phenotype, producing new variations on which evolution can act. Genes are no longer thought of as acting independently but rather in constant interaction with each other and with the multiple levels of the environment in which they are embedded.
The flood of scholarly research and popular writing on epigenetics justified science writer Nessa Carey in titling her book The Epigenetics Revolution.
Epigenetics found such a receptive audience, in part, because once again scientific advance coincided with a major conundrum – the persistent “achievement gap” between blacks and whites that bedeviled educators. A large literature suggested a variety of sources, most of which focused in one way or another on the handicaps associated with growing up in poverty, while the proponents of hereditary explanations lurked in the background. What the environmentalists lacked was a mechanism that explained exactly how the environment of poverty was translated into low school achievement. This is what epigenetics offered. It promised as well to parse the acrimonious differences between environmentalists and hereditarians in explaining the sources of criminality and virtually all other behavior.
The breathless embrace of epigenetics ran ahead of the evidence about the heritability of acquired characteristics and the limits of existing epigenetic knowledge. Even Carey, an epigenetics enthusiast, warned that “this whole area, sometimes called neuro-epigenetics, is probably the most scientifically contentious field in the whole of epigenetic research.” In fact, the links between children, poverty, and biology are exceedingly complicated and only partly understood, as serious scientists working in the area readily admit.
The significance of epigenetic research on how environment alters gene expression, according to Nobel laureate economist James Heckman, is that it “teaches us that the sharp distinction between acquired skills and ability featured in the early human capital literature is not tenable. . . . Behaviors and abilities have both a genetic and an acquired character. Measured abilities are the outcome of environmental influence, including in utero experiences, and also have genetic components.” For Heckman, most of the gaps at age eighteen that explain adult outcomes are present by age five. The clear implication is that by the time disadvantaged children reach school, it is too late to remedy their cognitive deficiencies or to put them on a road out of poverty.
Other neuroscientists are not so sure. They view brain development as more plastic, with changes possible through adolescence and, possibly, even in old age, although they do find direct evidence of the impact of early childhood disadvantage on the size of key areas of the brain, especially those that control memory and executive function. Rutter points out, “it is now clear that the brain is intrinsically plastic right into adult life, although plasticity reduces with increasing age. The sensitive periods are not as fixed and immutable as was once thought, and they can be extended pharmacologically. . . . In addition, plasticity can be increased by vigorous extended exercise.”
Epigenetics has revived the reconciliation of hereditarianism and reform that flourished in the late 1860s, before social Darwinism, and again in the Progressive Era, before splitting apart in the 1920s. Epigenetics promises to move beyond the long-standing war between explanations for the achievement gap, persistent poverty, crime, and other social problems based on inheritance and those that stress environment. It gives scientific sanction to early childhood education and other interventions in the lives of poor children. But as with earlier invocations of science, popular understanding fed by media accounts threatens to run ahead of the qualifications offered by scientists and the limits of the evidence.
Herein lies the danger. In the past, the link between hereditarianism and reform proved unstable, and when it broke apart the consequences were ugly. Even while it held, the link supported racially tinged immigration restriction and compulsory sterilization – all in the name of the best “science.” Indeed, every regime of racial, gender, and nationality-based discrimination and violence has been based on the best “science” of the day. “It is when scientists and doctors insist that their use of race is purely biological,” cautions legal scholar and sociologist Dorothy Roberts, “that we should be most wary.” Philosopher Jesse J. Prinz warns that “When we assume that human nature is biologically fixed, we tend to regard people with different attitudes and capacities as inalterably different. We also tend to treat differences as pathologies.”
It is not a stretch to imagine epigenetics and other biologically based theories of human behavior used by conservative popularizers to underwrite a harsh new view of the undeserving poor and the futility of policies intended to help them. This is not the aim, or underlying agenda, of scientists in the field, or a reason to try to limit research. It is, rather, a cautionary note from history about the uses of science and a warning to be vigilant and prepared.
Piece adapted from The Undeserving Poor: America’s Enduring Confrontation with Poverty, by Michael B. Katz, 2013. A version of this article appeared in Social Work & Society, Vol 11, No 1, 2013. The author would like to give special thanks to Mike Rose.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
About the Author:
Michael B. Katz is Walter H. Annenberg Professor of History and Research Associate in the Population Studies Center at the University of Pennsylvania. Educated at Harvard, he has been a Guggenheim Fellow and a resident fellow at the Institute for Advanced Study, the Shelby Cullom Davis Center for Historical Studies (Princeton), the Russell Sage Foundation, and the Woodrow Wilson International Center for Scholars; he also has held a fellowship from the Open Society Institute. He is a fellow of the National Academy of Education, the National Academy of Social Insurance, the Society of American Historians, and the American Philosophical Society. In 1999, he received a Senior Scholar Award – a lifetime achievement award – from the Spencer Foundation. From 1989 to 1995, he served as archivist to the Social Science Research Council’s Committee for Research on the Urban Underclass, and in 1992 he was a member of the Task Force to Reduce Welfare Dependency appointed by the Governor of Pennsylvania. From 1991 to 1995 and 2011 to 2012, he was Chair of the History Department at the University of Pennsylvania; from 1983 to 1996 he directed or co-directed the University’s undergraduate Urban Studies Program; in 1994, he founded the graduate certificate program in Urban Studies, which he co-directs. He is a past president of the History of Education Society and of the Urban History Association. In 2007, he was given the Provost’s Award for Distinguished Graduate Student Teaching and Mentoring.