Edward Countryman, The People’s American Revolution
BAAS Pamphlet No. 13 (First Published 1983)
ISBN: 0 946488 16 9
- The Problem of the Revolution
- Americans of Many Kinds
- Resistance to Imperial Reform, 1765-1776
- The Political Revolution, 1774-1789
- Revolution and Transformation
- Appendix: The Artisans in the Revolution
- Guide to Further Reading
- Notes
British Association for American Studies. All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: The Problem of the Revolution
What was the American Revolution? Was it simply the decision of the Thirteen Colonies to declare their independence? Or did it arise from the strains of war, as Britain struggled unsuccessfully to retain them? Did the Revolution lie in the replacement of monarchy by republican government? Was it brought about among a united people? Or did it pit different kinds of Americans against one another? Did it take place in the real world of social and political relationships, or in the realm of consciousness, mentality and ideology? Or was the Revolution no single one of these, but rather a grand transformation, binding many separate changes together?
The bare story seems simple enough: after 1763 Britain challenged the traditional autonomy of its colonies by introducing new policies of imperial reform and taxation; the colonies responded, nullifying first those policies and then their tie to Britain itself; finally, they created a republic where an empire had been. Yet the reality surrounding that story is complex indeed. Though analysts have tried for two full centuries to make sense of it, no single interpretation has ever won general acceptance. Among the earliest interpreters were the Revolution’s participants and victims, for the aftermath of independence saw half-literate farmers, angry politicians, sophisticated intellectuals, and loyalist exiles all writing down their versions of what they had lived through. Yet for all that they had shared in its events, these men and women could not agree on what the Revolution had been. Writers of the time raised practically every question about the Revolution that academic historians have been arguing about since the writing of American history began. In particular, some contemporaries saw the Revolution as not merely a conflict between America and Britain, but a conflict—and transformation—within America itself.[1]
The earliest scholarly account of the Revolution—George Bancroft’s History of the United States, written in the middle decades of the nineteenth century—established the academic tradition of seeing the Revolution as essentially a colonial struggle for independence. Bancroft was no professor; rather, he was a gentleman-scholar, a patriot, and a politician. He did his research carefully, but he wrote for, and reached, a general readership, using a colorful, impassioned prose that no modern academic would dare employ. For him the central story of the Revolution, and of all American history, was the rise of American liberty, a rise that was completed when the Founding Fathers wrote and implemented the Federal Constitution.[2] But despite his towering achievement, Bancroft’s great work of Romantic “Whig” historiography inspired the first generation of professional historians to criticism rather than to emulation. Bancroft insisted that all American history pointed in one direction, but his successors began to consider the motives, the situations and the purposes of people for whom that direction had not been so clear.
The foremost such people during the Revolution, of course, were the British and the loyalists. Did Britain really intend to impose on the colonies a fearsome tyranny? Were Americans who opposed independence really that tyranny’s crawling minions? The doubts that lie behind such questions began to seem plausible towards the end of the nineteenth century, and by World War I the intellectual climate had become ripe for a different view. That view first appeared in the work of Herbert Levi Osgood and George Louis Beer, and it culminated with that of Charles McLean Andrews and his many doctoral students, most notably Lawrence Henry Gipson.[3] These writers explored the problems that were faced by British statesmen, and they wrote with sympathy of what the loyalists endured. Unimpressed by the claim that Britain planned tyranny, many of them conceived the Revolution as simply a rash upsurge by colonists too small-minded to perceive the problems as men with an imperial viewpoint saw them. In their work the colonial movement became little more than a matter of selfish unwillingness to accept responsibilities. The policy changes that ignited the Revolution grew, they argued, not from a malevolent conspiracy but rather from the fact that in 1763, when those changes began, Britain stood exhausted from its long struggle with France. Since no one had gained more from British victory in that struggle than the colonists, who at last were free of the fear that the French would bring war to them from the west and north, it seemed only fitting that they should pay some of the cost. It was for that reason, wrote these “Imperial” scholars, that Britain attempted to tax them by the Sugar Act (1764), the Stamp Act (1765), and the Townshend Acts (1767). When the colonists resisted these measures, such writers maintained, it was simply for the sake of avoiding their just obligations.
While these scholars were denigrating the notion that the Revolution was a high-minded struggle for liberty against tyranny, others were challenging the idea that it was the movement of a united people. Such “Progressive” writers as Carl Becker, Charles A. Beard and the elder Arthur Schlesinger worked from a position of sympathy for ordinary men and women. In their dissections of the dynamics of the independence movement they found not broad agreement but rather dispute, conflict and manipulation. Becker summed up their approach when he argued in his 1909 study of the province of New York that the Revolution was not just a struggle over home rule; it also turned on the question of who should rule at home.[4] The culmination of this tendency came in 1913 when Charles Beard took the Founding Fathers, the men who met in 1787 to write the United States Constitution, down from the pinnacle where they had stood for so long. Far from being the disinterested patriots Bancroft had made them, Beard maintained that they wrote the Constitution primarily to restore value to depreciated paper securities that they had bought up at cheap rates. These Revolutionary leaders sought little more than to make a killing for themselves.[5]
Underlying both the Imperial and the Progressive interpretations lay a deep skepticism about the pieties of nineteenth-century American life. That such skepticism should have developed is not surprising. The last decades of the nineteenth century and the first years of the twentieth, when these scholars wrote, saw America and Britain moving ever closer in terms of world politics. They saw the United States itself becoming an imperialist power. They saw its citizens moving ever farther apart in terms of their dealings with one another. Yet the new historians based their accounts on massive research as well as on questions shaped in their own times. Their goal was to deepen understanding of the past, not simply to score polemical points.
But just as the broad generalizations of a Bancroft provoked pointed challenges, so did those of Beard and Becker themselves. By the middle of this century, in the atmosphere of national unity generated by World War II and the Cold War, historians were mounting a merciless attack on the Progressive framework. One of the central Progressive tenets had been that in early America a privileged elite had been pitted against a disenfranchised mass. Beginning in 1955, however, Robert E. Brown and B. Katherine Brown devoted study after study to a refutation of that argument.[6] Both the quantitative methods they used and the questions they asked have generated fierce debate, but it is now generally agreed that, in spite of property requirements, in most colonies probably a majority of adult, white males possessed the right to vote. At the same time, Forrest McDonald was arraying a massive amount of data to demonstrate that how those involved in approving the Constitution stood on the issue had nothing to do with whether they would gain from a rise in the value of depreciated securities.[7]
Brown and McDonald made possible a return to the nineteenth-century idea that Revolutionary Americans had been an undifferentiated, united people. The way to a fresh assertion of that proposition lay in the close attention that other historians had already begun to pay to the sheer wordiness of the Revolution’s makers. They had poured out endless letters, pamphlets, essays, songs and broadsides, but neither historians concerned to explore internal conflict nor historians writing from the point of view of British officials had taken that outpouring very seriously. Becker, for instance, maintained in his study of The Declaration of Independence (1922) that the Revolutionaries had simply taken up one intellectual position after another as it suited their needs.[8]
But in 1948 Edmund S. Morgan advanced the argument that during the debates over the Stamp and Townshend Acts “official” colonial spokesmen maintained a coherent and unchanging position on the relationship of the colonies to Parliament. Morgan’s thesis was that they took their ideas seriously and acted on the basis of them. It provided a starting point from which a new interpretive framework could take shape.[9] For a host of writers who have followed him, the prime problem to be considered has been Anglo-American “Whiggery,” the political culture and political language of the era. Whiggery insisted that an eternal conflict existed between the principle of power and the principle of liberty, and that constant vigilance was needed if liberty were not to be lost. It equated liberty with government under laws made by parliamentary institutions, and it maintained that the British constitution, with its balance of Crown, Lords and Commons, provided the best means of making liberty secure. Many historians—including, among others, Clinton Rossiter, Douglass Adair, Cecelia Kenyon, and J.G.A. Pocock—have described Whiggery as an intellectual structure powerful enough to explain the Revolution by itself. With roots deep in traditions that reached back to medieval England and Renaissance Italy, that culture preconditioned its adherents to perceive tyranny behind any move to increase any government’s strength. Whatever the British intended mattered less than what the Americans understood by it.[10]
Admittedly, some of these writers, notably Gordon Wood, have demonstrated how the Revolution’s intellectuals used the language of Whiggery to argue out their differences, and he and Pauline Maier have shown much interest in street radicals, crowd action and popular upheaval.[11] But, for the most part, all of these historians have been more concerned with a universe of discourse that the whole Revolutionary generation shared than with any social element—class, region, race, gender, religion—that set its members apart. Bernard Bailyn, in particular, has given classic expression to this approach, which has so much in common with that of Bancroft a full century earlier: once more the Revolution has become a struggle to preserve liberty, a struggle which in the end transformed traditional institutions and established American republicanism and, in time, democracy. Perhaps what most clearly distinguishes these modern historians from Bancroft is their awareness of how immensely ironic history can be, as people who set out to do one thing find themselves accomplishing something very different.[12]
But throughout the 1950s and the early 1960s writers such as Merrill Jensen and Jackson Turner Main kept alive the questions that the Progressive historians had elaborated. In Jensen’s analyses of the Articles of Confederation, the postwar years, and the coming of the Revolution and in Main’s many books on early American society and politics, divisive social experience, not shared values, remained the key to understanding.[13] A vigorous reinvestigation of that approach began in the late 1960s and is still underway, and it owes a great deal to the example of Jensen and Main. But this most recent writing has also been influenced by two other stimuli, one political, the other academic. The generation of historians who have produced it came to maturity as the United States experienced refusal by blacks to tolerate any longer the conditions white America had imposed on them. These historians lived through the war in Southeast Asia and the movement against it, and they saw the birth of new consciousness among such groups as women, homosexuals and Native Americans. Is it surprising that many of them are sensitive to questions of conflict and upheaval in a way that another generation might not have been? The other influence, of tremendous importance for the formation of historical concepts and the development of research methods, has been the work of such non-Americanists as George Rudé, Albert Soboul, E.J. Hobsbawm and E.P. Thompson. Their studies of English and European crowd action, class formation and popular culture have provoked many investigations of American material framed in similar terms. American historians like Gary B. Nash and Rhys Isaac, among others,[14] have argued that the Revolution was driven by a popular radicalism which was only partially connected with the intellectual world of Whiggery. They have maintained that as that radicalism worked itself out it transformed aspect after aspect of the lives of Revolutionary Americans. They have held that that transformation affected different kinds of Americans—farmers, artisans, merchants, women, blacks—in different ways and that it gave people who were otherwise powerless a chance to intervene in historic events on their own. These studies have insisted, although in a way quite different from that of the Progressive writers, that the Revolution was a time when Americans disputed among themselves over fundamental questions.
The aim of this pamphlet is not to present a synthesis of all these contrasting approaches to the Revolution, but to outline and explain the general understanding of that phenomenon that emerges from the work of the younger, more “radical,” perhaps “neo-Progressive,” historians. By no means would all of them agree with the views expressed here, and I do not interpret their work as the authors themselves would choose; inevitably this is the personal view of one historian of a “radical” persuasion, who might well be accused of trying to extend to the whole country the conclusions of his own work on New York, the province which in an earlier generation had sustained Becker’s views.[15] Yet the fact remains that the most recent writing has brought forward much new material and many important perspectives which tend to modify the “ideological” interpretation of Bailyn, Wood and other “neo-Whigs.” In particular, while students of the Whig beliefs and attitudes of the Revolutionary generation recognise that the Revolution transformed the American polity in subtle and unsuspected ways, more recent, “radical” writing has insisted that this was no mere accident. Rather, it was the necessary consequence of the upsurge of popular involvement in the Revolution—on the part of people who had not normally exercised influence or participated in decision-making in the various societies and polities of colonial America.
2: Americans of Many Kinds
There were no “colonists” or “Americans” in 1763. Instead, there were people of one empire and of many colonies. People thought of themselves in terms of their provinces, or perhaps as Britons overseas, but no United States existed. The thirteen provinces that rebelled—and the many such as Nova Scotia, Quebec and Jamaica that did not—were more than just administrative units. Each was a separate political society. Colonies and mother country alike shared the heritage of the English Whig settlement, which had established the principle that power was jointly held by the Crown, the House of Lords and the House of Commons. In the colonies a governor stood in place of the King, a council in place of the Lords and, as far as the colonists were concerned, a local assembly in place of the Commons. But political practices varied from province to province. In Massachusetts a town meeting might well resolve a set of direct instructions for the town’s delegate to the provincial assembly. In Virginia there were no towns to meet, and if people wanted something from the government they expressed their desires deferentially, informally and as individuals. New Yorkers and Pennsylvanians were used to their leaders openly competing for support; South Carolinians were famous for their harmony. In Virginia a well-defined ruling class wielded power in a manner that most other whites accepted. In neighboring North Carolina, no real planter class had appeared and those who tried to rule found that others gave them little respect.[16]
The colonies were diverse socially as well. South Carolina had so many blacks that it was “more like a negro country”; two of every five Virginians were likewise black and enslaved. In those two provinces productive work meant black labor to raise staple crops like tobacco, rice and indigo for export. In Georgia and North Carolina there were far fewer blacks; the plantation system had not yet triumphed, and most whites lived as small farmers, remote from the Atlantic market. In northern Virginia and southern Maryland tobacco culture shaded off into wheat raising, and from there north to New York the main export was grain. In New Jersey and Pennsylvania the grain was raised largely by small freehold farmers, but a sizable proportion of New York’s crop was grown by white tenants on great estates. Some of those estates faintly reflected the heritage of European feudalism. Except for New Hampshire timber and Connecticut horses, the small freehold farms of New England exported little in the way of agricultural goods; indeed, the region had to import a good proportion of its food.[17]
The colonies existed within a larger world. In some ways their societies looked very much like those to be found elsewhere, and as they grew in population and wealth, that became more and more the case. There were colonials who had experienced that larger world directly, before migration or on journeys to England and Europe. There were others who lived with memories of Europe—or Africa—passed down from their parents and grandparents. For some, the world elsewhere provided a model of civilization to be emulated; for others, it was a model of corruption to be shunned. There were colonists who knew that the Spanish-American provinces to the south did not enjoy the British political tradition, and some may have known that in Mexico City, as in Paris or London, there were extremes of wealth and poverty not to be found in British America. But what counted most in the colonists’ day-to-day lives were the relationships within which they lived with one another.
Those relationships were shaped by an absence, save in the case of slave and free, of formally defined, legally established social positions. But black slavery, which was unique to the Americas, had been shaped in the first place by the demands of a market system that centered on western Europe and that stretched across half the world. That same market system affected the relationships of all North Americans, although in different ways.
The major cities, Charleston, Philadelphia, New York, Newport and Boston, all had thriving communities of merchants. Some were agents of British and European trading houses, but others bought and sold up and down the Atlantic coast and into the Caribbean. Not a few prospered on the slave trade. Some of these merchants had prominent British connections and vast fortunes; £50,000 sterling was not uncommon in New York. Many others operated on a much smaller scale, not even bearing the titles—Esq., Gent., Mr.—that signified gentility in colonial society. Such men lived close to the artisan communities in each of the cities. An artisan, or “mechanic,” might be a journeyman baker or a wealthy master printer with many employees, but he worked with his hands. John Singleton Copley’s well-known portrait of Paul Revere illustrates the pride artisans could take in themselves. Revere posed for it not in Sunday best but in his work clothes, with his tools and a silver teapot of his making in front of him. But at best such men were only half-inside the political community, voters and minor office-holders but never wielders of power. Below the artisans, and their journeymen and apprentices, were laborers and servants, some of them white, some black, some free, some bound servants, some slaves. At every social level there were some women who acted independently, running trading houses and shops or simply working on their own. But most women knew little of the world of affairs, and most thought their sex almost automatically disqualified them from any social role beyond that of “good wives.”[18] The social relationships of the largest cities provided a model for lesser places, such as Salem, Massachusetts, or Baltimore, Maryland. In Philadelphia, New York and Boston, at least, the eighteenth century saw a marked trend towards the concentration of wealth and towards the growth of a group of marginal and really poor people.[19]
In the countryside, too, people’s lives varied widely. A South Carolina or a Virginia planter or the holder of a New York estate might have a fortune in excess of £100,000. A slave had nothing. A tenant or a small farmer might have a leasehold or a freehold worth several hundred pounds. In some places, such as tidewater Virginia or the New York estates, great wealth and modest means might exist side by side. But the holders of South Carolina’s great fortunes were to be found in the lowlands, separated by many miles from small farmers in the backcountry. In New York a tenant on the east bank of the Hudson and a freeholder on the west bank might pay roughly equal taxes. But the tenant lived on land that belonged to someone of a wholly different order and the freeholder in a community in which no great fortune existed.
Similar communities, roughly egalitarian, were to be found in New England. They were densely organized around networks of church, town meeting and family. They did include people who were well off and people who were really poor, but even the “river gods” of the Connecticut Valley held nothing like the fortunes of New York landlords or Virginia planters. Some towns in eastern New England were becoming seriously overcrowded by mid-century, with disease on the increase, life expectancy falling, and young adults finding themselves compelled to migrate. Many of those migrants went either north, towards the Green Mountain region that New York, Massachusetts and New Hampshire all claimed, or west, towards the ill-defined borderland that separated New York from Massachusetts.[20] Throughout the interior of New England and the middle colonies and the backcountry of the South there were people whose lives were touched but little by the long-distance and impersonal relationships of the Atlantic market. Instead, they lived within networks of exchange and obligation that were local and often non-monetary. They experienced the world very differently from people whose crops and goods were destined for sale far away.
Relationships among these different people were complex and tangled. Everywhere the realm of high power was the preserve of the elite. But what marked an elite off, how its members conceived of themselves, and how they dealt with “lesser” people varied. Throughout the colonies it was the votes of ordinary men that gave great men their political power. But in rural New England they enjoyed that power within organic communities. New York landlords enjoyed it because they could command the suffrage of their tenants. Virginia planters enjoyed it within an elaborate pattern of ritual, bonhomie and deference that simultaneously permitted politesse and self-assertion, market relations and the image of a bucolic idyll, planter power and cross-class camaraderie.
The overall process of the Revolution would be worked out within these sometimes tense, sometimes relaxed, sometimes commercial, sometimes organic, sometimes hierarchical, sometimes consensual relationships. It would see the making and unmaking of a series of political coalitions, each structured around both the great issues of the day and the local concerns of particular groups and communities. That process would fundamentally alter the ways in which people dealt with one another. In no place would the Americans of 1790 live in quite the same way as had the colonists of 1760.
3: Resistance to Imperial Reform, 1765-1776
The policy changes that Britain began to make in 1763 dealt with the administration and the finance of the empire. The prime purpose of the empire had always been to orient the wealth of the colonies for Britain’s benefit, and since the time of Oliver Cromwell a piecemeal series of laws had been enacted to achieve that goal. These laws established a “Navigation System” that imposed obligations on the colonies but provided significant benefits in return. Colonial commerce was restricted to British vessels, but ships of American construction counted as British and a flourishing ship-building industry resulted. Some crops, such as tobacco, rice and indigo, were restricted to the British market, but there they generally found ready sale, in some cases with direct support from the British government. There were legal restrictions on what the colonists might produce, and there were severe duties on goods imported from outside the empire. All of these elements of the Navigation System simply reflected the fact that the colonies lay on the periphery of an economic structure over which they had no real control. They existed, in theory and in many ways in reality, not for their own sakes but to serve the needs of metropolitan Britain. There were elements in their developing societies, such as staple-crop production and unfree labor, that they shared with other parts of the eighteenth-century periphery, including eastern Europe, South America and the Caribbean. It may be that, had independence not happened in the way it did, North America’s future would have been marked by dependency and underdevelopment instead of by the dynamic autonomy that the nineteenth-century United States was to enjoy. But there is no denying that prior to 1763 the system did confer important benefits at little perceived cost. The “enumerated goods” had a good market, the hat and iron industries flourished despite the restrictions of the Iron Act and the Hat Act, and the duties on foreign imports were easily evaded. Years of “salutary neglect” had enabled the colonists to become accustomed to being effectively autonomous. The practices of merchants who traded where and as they thought best and of politicians who acted as if a provincial assembly was the equivalent of the House of Commons both reflected colonials’ belief that they ran their own lives.[21]
After 1763, however, the British government challenged that belief, again and again. The end in that year of the “Great War for Empire” (or Seven Years War) saw British politicians determined to assert the mother country’s superiority, both as a matter of principle and as a matter of direct interest. They embarked on a program of policy changes, aimed at establishing that superiority. They enforced laws that had been left lax; they created an American customs service; they demanded that colonial assemblies vote funds to support regular troops; most of all, they taxed. The Stamp Act of 1765, the Townshend Taxes of 1767 and the Tea Affair of 1773 all grew out of British conviction that Parliament had the power to tax the colonists directly, that it represented and ruled them just as much as it did Britons at home.
The movement that opposed those taxes marked the first stage in the development of the Revolution’s popular radicalism. Ordinary people became involved for many different reasons, as colonists of different sorts came to the conclusion that great events were making their lives intolerable. This movement deserves to be called radical for three different reasons. First, it generated not simply discussion about British policies that Americans disliked but also direct action strong enough to frustrate those policies and make them unworkable. Second, it brought to political consciousness large groups of people who previously had stayed out of public affairs. Third, it generated political formations that were without precedent in colonial life. The eventual result was a situation in which the old institutions of power could no longer endure.
Radical opposition developed not as a single united movement but rather as a series of coalitions. The coalition that prevented the implementation of the Stamp Act in 1765 was not the same as the coalition that achieved independence in 1776. Nor was either the same as the coalition behind the Federal Constitution in 1787. To simply speak of “Americans” or of “colonists” will not do. Rather, we must understand each stage of the Revolution in terms of the precise combination of groups, interests and individuals that took a stance in the political arena. We cannot understand how each coalition developed unless we understand why and how its separate elements involved themselves.
The initial strength came from two sources. One was the political elites, gathered in institutions like Virginia’s House of Burgesses. That the elites had reason to be disturbed is clear: Parliament’s assertion of its might threatened their own power. Moreover, within their Whig worldview, the imposition of British policy without going through the ritual of consent by a representative assembly was the first step to tyranny. British spokesmen argued that such consent had been given, in Parliament, but the colonial elite argued forcefully that they, not the House of Commons, represented the people of the colonies. They produced incisive pamphlets; their assemblies passed resounding resolutions; they used their connections to bring pressure in Britain itself. This leadership by the most prominent men and the leading institutions of colonial society was important, for it helped to legitimate colonial anger in the language of high principle. But equally important was the second source of opposition, which expressed itself not so much in words as in the direct, militant action of ordinary people. Without that direct action the opposition to British policy would have come to nothing more than a debate. Popular involvement was what changed that debate to a movement.
The cardinal fact in the popular politics of the decade from the Stamp Act to Independence was crowd action. Between 1765 and 1775 crowds nullified the Stamp Act, frustrated the American Customs Commissioners, brawled with redcoats, and dumped tea into more than one harbor. They closed courts and tore down elegant houses. They broke jails open and stopped surveying parties and disrupted concerts. Some acted as extensions of the governments of their own communities, some in direct opposition to those governments, and some brought governments down. Some crowds were led by prominent men from outside their own ranks, but some brought forth their own leaders, out of the depths of obscurity.
Such crowd action was not new, nor was it mindless chaos. Mobs had long been a feature of eighteenth-century life, and they erupted within a framework of political economy and popular culture that governed how crowd members behaved and gave some forms of crowd action a quasi-institutional legitimacy. Traditional, pre-Revolutionary crowds often acted because the authorities could not: what, after all, was a sheriff’s posse or a militia company or a volunteer fire company other than a crowd, drawn into ranks and given official standing? Sometimes pre-Revolutionary crowds acted to prevent an indiscriminate evil, such as smallpox contagion entering a community. But sometimes they defended the direct interests of their own members. Sometimes they did both. As Jesse Lemisch has argued, merchant seamen who rioted in Boston in 1747 against impressment into the British navy were saving Boston as a whole, for while a press was on boatmen would not bring in food and fuel and merchants could not put their ships to sea. But they were also saving themselves from the horrors of naval life, and in their own minds that was what counted most.[22]
Other crowds turned out because men outside their own ranks called them forth. Elections, especially in the middle colonies, frequently saw each side stationing its gangs of toughs at the polls. Sometimes crowds took direct action on matters of private property. Crowds never attacked the principle of private ownership, but they often acted on the belief that people besides the owner had a rightful say in how property might be used. The economic life of early American towns did not revolve around an unrestrained free market; rather, the towns were heirs to a long European tradition that a community might interfere in property’s use for the sake of the general welfare. That tradition took one form in New York and Philadelphia, where the authorities established controlled markets in which important commodities had to be sold. It took another in Boston, where in 1737 a crowd tore down the building erected for just such a market. But the basic point was clear: the right to a supply of necessities like bread, salt and firewood was more important than the right of a merchant dealing in them to find his profit where he could. Both in Europe and in America ordinary people enforced that tradition if the authorities did not. This traditional “corporatism” underlay a great deal of the crowd action that America experienced between 1760 and 1780.[23]
The crowds of the countryside present still more complexities. Pennsylvania settlers rose in 1763, doing fearful injury to peaceful Indians and then marching on Philadelphia. This “march of the Paxton Boys” subsided quickly, but elsewhere rural movements proved durable, lasting in some cases for years. In the Green Mountains, on the east bank of the Hudson River and in central New Jersey, small farmers rose over who would hold the land and how it would be developed. In both Carolinas movements of upland “Regulators” challenged the policies and the power of provincial governments that were based in the lowlands. When country people rose, the authorities took them seriously, condemning their leaders to death without trial and calling out the militia and regular British troops to put them down.[24]
Late colonial America, in other words, was a turbulent, unstable place, where direct action could both extend and frustrate the power of government. Crowd action could spring from the solidarity of people whose communities were under threat, but it could also express the hostility of people whose interests clashed. But that was true of virtually the whole Atlantic world at the time. What turned the crowds of North America from recognized elements in that world into a revolutionary force, able to turn a part of it upside down?
The answer is that day-to-day grievances meshed with imperial issues to create a general crisis, a process fostered by the group of self-conscious radicals that emerged under the name Sons of Liberty. Their achievement was to create a militant, disciplined American movement capable of directly opposing British policy and power. Who were they? What is the relationship between their militancy, domestic grievances, and the imperial issue?
Pauline Maier has shown how the Sons emerged at the time of the Stamp Act as an inter-colonial group.[25] Bent on militant opposition to British policy, they provided both geographical links between widely separated places and social links between the Whig elite and the plebs. Maier has argued convincingly that in their own eyes they were not tribunes of an oppressed people, bent on forcing the imperial issue for the sake of internal change. But she and others have also demonstrated that they were men of very specific sorts. Many were artisans, like Paul Revere in Boston or the instrument maker John Lamb in New York. Others were small-scale intercolonial merchants, such as the New York street leaders Isaac Sears and Alexander McDougall. Frequently they were men on the make: Sears was the son of an oyster catcher, McDougall of a milkman, Lamb of a convict servant. The Sons also attracted unhappy intellectuals, like the Harvard graduate Samuel Adams and the self-taught physician Thomas Young. Adams glowed with a vision of a Christian Sparta; Young preached an unabashed Deism: they knew that if they let themselves they could disagree on a great deal. But they shared an outrage towards the world they lived in, a closeness to ordinary people, and a delight in public action. Whatever they were, the Sons stood several rungs down the social ladder from the elite. They were literate enough and sophisticated enough to understand the arguments of a John Dickinson or the resolutions of a House of Burgesses. They were well enough off to have the time to sit in the gallery and watch the provincial assembly debate. But even their costume—long trousers and leather aprons rather than the knee-breeches and silk and velvet of the elite—marked them as of the people rather than of the “better sort.”
How and on what terms the Sons and the members of the crowds dealt with one another varied. The “Loyal Nine,” from whom the Sons in Boston emerged, used several means in 1765 to generate their town’s resistance to the Stamp Act. As an early preparation, they contacted a cobbler named Ebenezer Mackintosh. Mackintosh was leader of one of the crowds that traditionally gathered on “Pope’s Day,” November 5, to build elaborate anti-Catholic effigies and sometimes to brawl. Now the Loyal Nine asked him to turn out his followers on a different, more immediate issue. On the day of the first rising the radicals acted out a dramatic open-air tableau to show all Bostonians how the Stamp Act would disrupt their lives. But these emergent Sons of Liberty insisted that their townsmen confine direct action to the imperial issue. Their tableau vivant of August 14 provoked an uprising in which a crowd sacked a building which Stamp Distributor Andrew Oliver was supposedly preparing to use as an office, and then Oliver’s house. The Sons applauded. But when crowds gathered again on August 26 and destroyed the mansion of Lieutenant-Governor Thomas Hutchinson, the Sons joined the town’s elite in a chorus of condemnation. Hutchinson, they believed, had nothing to do with the Stamp crisis. Their orientation came across clearly in their newspaper, The Boston Gazette. Throughout the late 1760s it was filled with angry, militant prose, but invariably it focused on the British issue, not on local concerns.
But in New York City the Sons acted differently. They showed no opposition at all in November 1765, when crowds sacked the mansion of a British officer and burned carriages and sleighs belonging to the lieutenant-governor. Early in 1766 Sons actually led a crowd that disrupted the first performance in a newly-opened theatre, driving out the patrons and the actors and then tearing the building down. Their newspaper, The New York Journal, carried essay after essay attacking the evils of high rents, rising prices and short employment. At the end of 1769 two New York Sons, Alexander McDougall and John Lamb, launched a campaign against the presence of British troops in the city. McDougall wrote a broadside accusing the provincial assembly of betraying New York by voting money for the troops’ support; when the assembly imprisoned him for contempt, his associates dramatized his plight and made him a popular hero. But Lamb understood what made people angry. His broadside accused the off-duty troops of taking scarce jobs that New Yorkers needed, and it sparked off days of street violence.[26]
Never was there a single pattern: each group of Sons operated in its own way in its own community. But they did maintain contact, pledge mutual cooperation, and keep political consciousness high from the Stamp Act Crisis to the Coercive Acts of 1774. Maier’s demonstration of how they did it forms one of the central statements in the present understanding of the Revolution.
But the Sons were a phenomenon of the cities and the large towns, and, at most, townspeople totalled less than five percent of the American population in 1775. Without massive rural involvement the American movement would have come to nothing. When and why did country people join in? As in the cities, local grievances, popular culture and the hard work of political organization all combined to make a movement.
Consider three places about which we know a fair amount, Massachusetts, New York and Virginia. In economic, cultural and even political terms they had little in common, and their Revolutionary experiences varied enormously. New England’s synthesis was decaying rapidly by the third quarter of the century, with family, village and church no longer providing a cohesive social framework. Moreover, through the 1760s and early 1770s Governors Francis Bernard and Thomas Hutchinson tried to co-opt the village leaders who were elected to the provincial assembly by offering them posts as justices of the peace and ranking militia officers. Loyalism would not be an enormous problem in Massachusetts or the other New England provinces, but it would attract a disproportionate number of those leaders. Boston radicals set out to rouse rural Massachusetts opinion as early as 1768, when they called delegates from the towns to a provincial convention. Between 1772 and 1774 their committee of correspondence waged a ceaseless propaganda campaign in the hinterland, and by 1774 the entire province was a tinderbox. In the summer of that year angry farmers forcibly closed county courts and demanded the resignations of high officers. By the autumn those same farmers had begun gathering military stores and preparing for war, and on two occasions they spontaneously turned out by the tens of thousands in response to rumors of fighting near Boston. The first was a false alarm; the second, in April 1775, was the real thing.[27] Rural Massachusetts had entered the Revolution for many reasons. By 1774 its people were genuinely angry about what the British were doing in response to Boston’s Tea Party. They said so repeatedly in the resolutions of their town meetings. But at the same time these farmers were fearful that they were on the point of losing their land and their way of life, for reasons only partly connected with the British issue. They were angry at local leaders who had let themselves be seduced by colonelcies and judgeships. Thus, when they closed the courts and humiliated the judges, sheriffs and royal councillors, they were protesting against more than the Massachusetts Government Act, by which Parliament had reorganized the province in the aftermath of the Tea Party. They were also striking at their own local elite, at the social trends for which it stood, and at the institutions through which it ruled.[28]
Rural New Yorkers were of at least three sorts. Some, near New York City, lived in stable, prosperous, market-oriented communities. Others, more remote, lived in communities that were almost as cohesive, but that were much less tied to the market. Still others lived on the great estates. These were anything but cohesive, and in 1766 they erupted in a massive rising that stretched from New York City to Albany. But all New Yorkers lived under a provincial government that was operating more and more on the principle of providing for the interests of men who actually held office. New York City radicals took much less interest in the countryside than did their Boston counterparts. The province’s elite of merchants and landlords itself split open, and ordinary people went in four different directions. Almost unanimously the counties around New York City became loyalist. Overwhelmingly the yeoman counties on the west bank of the Hudson became Revolutionary. But after 1775 the counties of great estates on the east bank and in the Mohawk Valley broke into civil war, with some landlords and some tenants choosing each side. In the far north the Green Mountains, which Britain had formally awarded to New York in 1764, broke away to establish the separate state of Vermont.[29]
Virginia was different again. Though it was the most populous province it had no cities. It knew wealth and it knew poverty, but all its whites knew they were not black and understood the codes of behavior that made them in some ways an organic group. Nonetheless, problems were developing, and as in Massachusetts and New York, they involved relations between people who were privileged and people who were not. Evangelical religion was attracting humble Virginians away from planter Anglicanism, and some planters were disturbed enough by it to break up Methodist and Baptist meetings forcibly. The planters themselves were caught between a fascination with the glittering culture of metropolitan England and an awareness of their own provincialism. They were increasingly conscious, as well, of corruption and decay in their own ranks. The scandal in 1766 over illegal loans made by the provincial treasurer John Robinson was only one piece of evidence. When the Revolution came, Virginians did not launch an assault on the established political order, as Massachusetts farmers did. They did not divide in any serious way into rebels and loyalists, as rural New Yorkers did. Most of the elite chose independence, and most lesser whites followed them. As J.R. Pole has commented, Virginia’s open, seemingly democratic state constitution of 1776 is a monument to the planters’ confidence that they could continue to rule their world.[30]
Instead, as Rhys Isaac has forcefully argued, Virginia’s tensions expressed themselves largely in religious and symbolic terms. The revival that lured poorer whites was not unique to Virginia, for in mid-century all of the colonies had been swept by a wave of evangelical fervor. So intense was it that it split Congregationalists and Presbyterians into antagonistic wings, gave new impetus to Baptists, and laid the basis for the eventual rise of Methodism. Everywhere it erupted this “Great Awakening” drew on social tensions, and frequently it spilled over into politics. Isaac has shown that in Virginia the demeanor and belief of the evangelicals challenged the whole synthesis of public display, hierarchy, bonhomie, individualism and racism that Virginia’s planter class had painfully constructed. A writer using E.J. Hobsbawm’s categories might describe these Baptists and Methodists as primitive rebels, using their austere and egalitarian religion to protest against the ostentation and the stratification of planter life. The planters met the challenge: the issue did not break Virginia apart. They did it not least by recruiting their humbler fellows into a Revolution that they, the planters, would lead. But ordinary white Virginians got involved on terms that they chose themselves, and in consequence they changed the tone of Virginia life. Revolutionary Virginia, like colonial Virginia, built its social life around ritual and drama, but its rituals were different. They stressed commitment to the cause and equality among the men who joined. When the planter elite gathered in a Revolutionary congress wearing not their traditional costumes but rather the hunting shirts of plain men, it signified their acceptance of change.[31]
Each of these patterns was unique, yet each represented the particular expression of larger developments. In most provinces backcountry social grievance burst in some way into political dispute. It took one form among the South Carolina Regulators, who demanded courts where there were none. It took another in North Carolina, where other Regulators challenged the rule of would-be grandees. In Maryland the planter class nearly lost its hold at independence in the face of widespread “disaffection” and outright popular loyalism. Tenant rebellion and loyalism in the Hudson Valley, land rioting in New Jersey, and the movement that freed Vermont from New York had a great deal in common. Independence in New England saw the flowering of evangelical sects as radical as any that had appeared during old England’s “revolution of the saints” a century before.[32] Each of these developments took place in a particular society, but all fed into the way that Americans experienced their political Revolution.
4: The Political Revolution, 1774-1789
In political terms revolution means rapid, fundamental change, as one set of power relationships and institutions collapses and another takes its place. That is why it is always wrong to use the word to describe a mere shift of ruling groups while institutions endure. On this count, the American Revolution seems problematical. The states look so similar to the provinces. The period seems one of simple, easy continuity, with change happening only when the break with Britain made it necessary.[33] But this was in fact far from the case.
Independence was achieved by gatherings that were wholly illegal from the standpoint of the old order. Popular committees at the level of towns and counties, conventions in the provinces, and the Continental Congress were revolutionary manifestations of the most fundamental sort. Between 1774 and 1776 they destroyed the old political order at every level, and brought direct involvement in politics to men who had never before experienced it. The Declaration of Independence, proclaimed by the Continental Congress, was the culmination of a process that began with the closing of the Massachusetts courts, organized by town committees. It was through these committees, conventions and congresses that the Revolution came to ordinary people and that ordinary people made their Revolution.
The committee movement had two sources. One was spontaneous and local, the other orchestrated and proto-national. In large communities and small, the aftermath of the Boston Tea Party, with the town’s harbor closed and the government of Massachusetts reorganized, moved men to organize themselves so they could help. Sometimes the organizers were simply the old elite. In the Carolinas, as in Virginia, Whig gentlemen strove mightily to convince their humbler neighbors to overcome their doubts and fears and join in.[34] But in general where popular committees appeared “new men” did as well. Sometimes they were brought in to give the movement the broadest possible base; sometimes they pushed their own way forward in open conflict; sometimes their involvement sprang from a mixture of the two.
Richard Ryerson’s study of the committee movement in Philadelphia shows how it brought disruption, mobilization, and transformation. Radical politics there resulted directly from the emergence of an organized, articulate mechanic class, and merchants hostile to Britain found themselves forced to enter into a series of coalitions with these men. The mechanics found their political forum in the committee, which, as time went on, came to be made up of men less and less characterized by property and prestige. The organization of committees created a situation of dual power, as they vied with the provincial government for authority and finally brought that government down in the summer of 1776. Ryerson links the internal disputes, the emergence of new men, and the politicization of day-to-day issues with the larger struggle for American freedom. Building a coalition whose members knew and expressed their own interests while they acted on the larger question was a “basic revolutionary process.”[35]
The committee movement began with tens, perhaps hundreds, of separate local initiatives. But it gained continental scope and political force at two clearly visible points. In November 1774, the first Continental Congress called for “committees in County, City and Town” to enforce its economic boycott of Britain. Neither boycotts of British commerce nor committees to enforce them were new by this time, but previous committees had been picked by a town’s merchants; now they would be picked at elections and mass meetings by the whole body of citizens. These committees had a mandate to interfere directly in economic life to enforce the boycott. They could, and did, order the seizure of goods imported in violation of it. They could, and did, hold up violators for public contempt and ostracism. The Continental Association, as the boycott was called, did more than resist Britain; it marked a long step towards the transformation of America.
With the outbreak of war in April 1775, community after community gathered to elect committees of safety able to meet the crisis. Committees that had consisted of four or five self-chosen men operating half in secret grew to memberships of thirty or more and began operating openly. In Albany County, New York, mass meetings early in May elected a committee of 153 members to replace one of perhaps fifteen. In New York City a committee of one hundred men was elected after days of tumult in which Sons of Liberty broke open the town arsenal and took control of the streets.[36]
The first task of the committees of safety was to organize for war. The need was for a popular army that could defend the local community and that could go, if need be, to Massachusetts, where half-organized farmers confronted General Thomas Gage and the only powerful British force in America. As soon as it formed, the Albany committee of safety asked the mayor to organize a “burgher’s watch,” and when he refused, established one itself. In Philadelphia, as in Virginia, the costume of the Revolutionary militia became an emblem of much larger tensions, for ordinary militiamen insisted that a plain hunting shirt would do for officers and men alike. They raised the issue through their committee of privates, whose very name betokened what was underway. By organizing those militia units and by assembling supplies and raising finance for them, the committees were turning themselves into a counter-government. They began to meet not in taverns or private homes but in their own “committee chambers.” They established jails for the Revolution’s enemies, appointed officials to carry out their orders, and summoned the citizens to public meetings which they expected everyone to attend. As early as the summer of 1775 local government could do little unless the committee agreed, and by 1776 the committees were wielding all power.[37]
The committees were radical both because they developed and operated outside the old framework of legality and because of the way they brought new men into public life. Farmers, small merchants and artisans suddenly found themselves at the center of affairs. What their involvement could lead to was shown most clearly in Philadelphia in 1776. When Tom Paine wrote and published Common Sense he made himself the voice of the small people whom the Revolution had brought into politics. Eschewing ornate references and classical quotations, he slashed through the rationale of monarchy. He likewise abandoned traditional Whig politics, with its careful balance of the principles of monarchy, aristocracy, and democracy. Paine called for the establishment of a simple republic, with direct, responsive institutions, and throughout the continent people agreed that he had spoken for them. In Pennsylvania such men, Philadelphia artisans and backcountry farmers alike, joined together to write a radical democratic state constitution that provided no governorship and no upper house. By annual elections, by simple requirements for office, and by requiring that laws be submitted to the people before they were finally enacted, the Pennsylvania constitution came close to institutionalizing the massive involvement that the committee movement had generated.[38]
Nor were Pennsylvanians alone. The Green Mountain rebels, struggling simultaneously against Britain and against New York, took the Pennsylvania constitution as a model for their own. Anonymous writers in Massachusetts began to produce pamphlets bearing titles such as The People the Best Governors. Mechanics in New York City, hearing in May 1776 that a new state constitution was to be written, sent a bold message to their “elected delegates,” demanding that before going into effect it be submitted to the people. But, perhaps more indicative of where they stood, they demanded that any constitution leave the citizens free to recreate their revolutionary committees at any time and for any reason they might choose.[39]
Nor did the movement stop at the point of independence. Rather, it persisted in places as late as 1778, and in 1779 popular pressure revived it all over the Northern states. Committees persisted and were revived because ordinary people saw in them the best means of dealing with the economic and social distress that the War of Independence brought. The war’s demands were severe: three armies, American, British and French, competed for scarce supplies; continental and state currency depreciated to the point of worthlessness; refugees from areas of battle and zones of British control crowded into Revolutionary districts. As early as the end of 1776, committee members were dealing with such problems in the way that traditional corporatist political economy prescribed. They set up manufactories to provide work for people who needed it. They regulated prices and took control of the distribution of scarce goods. They jailed people who violated their orders. After 1776 they found themselves jostling for power not with the institutions of the British empire but rather with American state governments that were proving unequal to the task of controlling the economy.
The committees and what they stood for generated intense opposition among loyalists, of course. But they also provoked a fundamental debate within the Revolutionary coalition, a debate which developed on several levels. It pitted men who had become democrats, enamoured of the people’s right to rule, against men who remained Whigs, believing in balance and stability. It pitted plebeians just discovering their political identity against members of the old elite, habituated to power. Increasingly, after 1776, it pitted men committed to traditional corporatist views of political economy against believers in the free market. The one position continued to hold that the community had a right to intervene in the use and the disposition of private property; by the late 1770s intervention meant the action of committees and it amounted to a means of lower-class self-defence. The other position held that only an unrestrained market could resolve the country’s problems of supply and demand. English thinkers had begun articulating free-market ideology as early as the seventeenth century, and Adam Smith put its case forcefully in The Wealth of Nations, published in Britain in 1776. In America the war years saw free-market thought become more and more identified with the mercantile and landed upper class, and with a belief in a complex balance of power rather than a simple direct democracy.[40]
How the elite responded to the popular upheaval varied from state to state.[41] In Pennsylvania they simply lost control in the summer of 1776 and saw the radicals impose a democratic constitution that they loathed. But men of cooler temper rapidly regrouped to oppose the radicals: led by Robert Morris, the Philadelphia merchant who organized Congress’s finances, and by James Wilson, judge and legal theorist, they campaigned ceaselessly against the state constitution and for a free market. In 1779 they opposed the revived committee so strongly that a street battle resulted. In Maryland great planters rather than plebeians seized control in 1776. They wrote a state constitution whose complicated arrangements were intended to keep the plebeians as far as possible from the levers of power. The Boston lawyers and merchants who took the lead in Massachusetts offered two constitutions to the people of their state, one in 1778 and the other in 1780. Both were complex documents, written to establish stability, not to continue participation. The voters rejected the first but accepted the second, and once it was in effect those same leaders imposed a rigid hard-money policy on the state. That policy met the needs of seaport merchants, but ran directly against the interests of western farmers. New York’s constitution of 1777 expressly stated that the popular committees had been only a temporary convenience whose necessity was at an end. Between its proclamation—it never was ratified despite the mechanics’ demand—and the end of the committee revival, a coalition of landlords, merchants and lawyers did all they could to establish two main principles. One was that despite the committees the constitutional government must be supreme. The other was that the free market must determine the state’s economic life.
It was from such disputes that competitive state-level politics developed. Again there was no simple or single way. In New York and Pennsylvania competition brought open partisanship. In Pennsylvania “Constitutionalists” (the radicals) and “Republicans” (the Whigs) confronted each other over the question of the state constitution of 1776. In New York partisanship grew more slowly. The issue at stake was not the shape of the state’s government but rather a host of problems of public policy, including the treatment of loyalists, the mode and social effects of taxation, the control of prices, and the distribution of vacant land. In Massachusetts farmers affected by Boston’s hard-money policies continued to gather in conventions and to keep the courts closed. They still feared that if the courts opened they would issue writs for the seizure of their farms to pay off debts and taxes. The farmers simply did not have the hard coin in which those debts and taxes were due. Their resistance finally culminated in Shays’ Rebellion of 1786, when they briefly took up arms against Boston’s policies, and in the next election, when their votes replaced Governor James Bowdoin with John Hancock and brought in a legislature more willing to listen to them.
Meanwhile the planters who wrote Maryland’s restrictive state constitution had learned that without plebeian consent it simply would not work. Led by Charles Carroll of Carrollton to see “the wisdom of sacrifice,” they made massive concessions to popular demands on such questions as taxation and money policy. South Carolina saw intense disputes after independence, among low-country planters, backcountry farmers, and Charleston mechanics and merchants. Only when the introduction of tobacco culture and of massive slavery transformed the backcountry in the image of the lowlands did the state settle down. Even in Virginia, the state least transformed of all, political life was much more open and competitive after 1776 than it had ever been before. All of these changes help explain the fact, noted years ago by Jackson Turner Main, that one consequence of the Revolution was to democratize the state legislatures, bringing into them on a massive scale men who would never have held seats in the provincial assemblies.[42]
In New England, in the mid-Atlantic states, in the Chesapeake, and in the far South alike, men of all sorts disputed over how and in whose interests republican America would run its affairs. Particular disputes might be framed in terms of simple institutions versus complex ones, as in Pennsylvania or Massachusetts. They might be framed in terms of committee power versus regular government, as in New York. They might be framed around the question of soft money versus hard or even, as in Virginia, around the costume of the militia. But they all reflected a tension that ran through the whole continent and the whole era.
All of these disputes culminated in 1787 and 1788 with the struggle over the Federal Constitution. The Constitution replaced the weak continental government that had been established under the Articles of Confederation during the war with a much stronger one, still in operation today. Like New York’s constitution of 1777 and the Massachusetts constitution of 1780, it provided for a two-house legislature, a strong executive, and a powerful court system. Like every other stage in the Revolution, its coming was marked by complexity. People entered the Federalist coalition of 1787 and 1788 for just as many reasons as they entered the resistance movement of 1765 and the coalition for independence in 1776. Some were concerned about commerce and trade among the states and with the world; some were worried about America’s weakness in a world of power politics; some had evil memories of the difficulty of running the war of independence without an adequate government; some wanted to reorganize American finances. But the men who were central to Federalism—such as Washington, Alexander Hamilton, and James Madison—believed not only that the central government was too weak but also that something had gone profoundly wrong in the states. As they looked around them they saw too much soft money, too many men with mud on their boots in power, too much rebellion, too much democracy. They worked to strengthen the central authority for the sake of creating a counter-balance to what they saw at the state level.[43]
It would be wrong, however, to see the Constitution as simply a reaction against everything that had happened since independence. The Federalists mobilized many of the men who had stood in the late 1770s against committee power, simple government and economic corporatism. But they also attracted genuine popular support. In Boston, New York, Philadelphia, Baltimore and Charleston, mechanics paraded behind the banners and symbols of their crafts to celebrate the Constitution. Their enthusiasm was unforced and unfeigned. Twenty years earlier these men had rioted and demonstrated in their anger; now (as the Appendix illustrates) they celebrated both what they saw as a solution to America’s problems, and the social and political space that they had carved out for themselves. They had left behind their belief that either crowd action or popular committees could solve massive social and economic problems. They were learning to live with free-market economics, and had come to appreciate their interdependence with the people of other states. But most of all, they had gained self-confidence and self-organization, and though they were with the Federalists in 1788 it was because they had thought the questions through for themselves. They would not, in fact, be with them for long. By 1794 they would be lining up politically with the emergent Democratic-Republican party in opposition to the foreign and domestic policies for which Washington and Hamilton had come to stand.[44]
The self-consciousness and political independence of the mechanics encapsulate the difference that the Revolution made to all sorts of people who lived through it. It did not end slavery where slavery was important. It did not make poor Americans into rich ones. America may well have been a less equal society in 1790 than it was in 1765.[45] But the Revolution did strikingly transform the terms on which people dealt with one another, replacing not only monarchy with republicanism but also an elitist political system and political culture with one much more open and democratic. That change happened because people who had found a new political identity in the Revolution forced it to happen.
5: Revolution and Transformation
So complex an event as the Revolution cannot be reduced to any single formula. The configuration of elements and of people and the course of events were simply not the same in Georgia as they were in New Hampshire, or in North Carolina as in New Jersey. But beneath the variety, problems common to all thirteen states were working themselves out. One set of problems, the traditional matter of studies of the Revolution, turned on the relationship of America to Britain. The second, which has been this pamphlet’s concern, turned on the relationship of ordinary people to the arrangements of power, ideology and privilege that structured their societies. A third, lurking behind the first two, led to the establishment of the political framework within which a liberal, capitalist and industrial society, rapidly expanding to imperial dimensions, would develop in the course of the nineteenth century. The connection between the first and the third is not difficult to see: autonomous development required the end of colonial relations and the establishment of a continental government strong enough to protect it in its infancy.[46] But what is the larger significance of the second set? How were the changes that took place in political society related to the creation of the United States as we know it?
The answer may lie in a basic congruence between those changes and the central text of American liberalism, James Madison’s Federalist, no. 10. The Federalist was a series of essays jointly written by Madison, Alexander Hamilton and John Jay in 1787 and 1788. Addressed to the people of New York, the essays argued reason after reason why they should accept the new Constitution. In the tenth of those essays Madison addressed the relationship between a republic’s size and its prospects for stability. Political thinkers had always maintained that a republic could work only if its citizens had a great deal in common socially and economically and if they acted for the general good rather than to advance their own interests. Madison argued instead that it was possible to have a republic whose citizens would have little in common and who would act for their own selfish reasons.[47] Madison was an ideologue, not a dispassionate analyst. He had a policy to achieve, not simply an essay to publish. But he realized earlier than most Americans that their country was made up of groups of people conscious of their own interests, anxious to control their own world, and willing to struggle in order to do it. Madison saw farmers pouring westward, merchants scrambling for profit, tenants determined to become freeholders, entrepreneurs developing visions of industry, lawyers forming bar associations, and mechanics organizing trade associations. His hope was that in a large republic the political process would yield men capable of the large view. But he realized that the underpinning of the republic would have to lie in self-interest and endless competition, not in self-restraint and highminded pursuit of the general good. He could not have written the tenth Federalist twenty years earlier. The Revolution had been an explosion of involvement, consciousness and self-organization on the part of all sorts of people. They were the people whom Madison observed, and on whose pursuit of self-interest his analysis conferred legitimacy.
The congruence between a social framework comprising many jostling, competing groups and an economic framework based on competitive capitalism is not difficult to perceive. But the Revolution established the pattern of American liberty as well as laying the basis for American capitalism. Part of that pattern is the tradition of individual rights, enshrined in the first ten amendments to the Constitution and enforceable in the courts. But another part of it is the continuing emergence into American society of new self-defined groups, claiming a right to organize and to advance their interests equal to that of any group already established. That is why nineteenth-century factory women struggling against their working conditions, middle-class feminists protesting in 1848 against women’s legal subordination, Populist farmers enraged in the 1890s by a marketing system that controlled their lives, and the militants of the Black Panther Party of the 1960s all invoked the heritage of the Revolution to legitimate their own struggles.
In fact, recent writing has shown that even those uses of the Revolution have their roots in the period itself. The Revolution did not bring equality for women, but it did make a significant difference in their lives. Linda Kerber has shown that the notion of “republican motherhood” which emerged during the era was a novel construct in the history of American women. Growing out of the contradiction between the Revolution’s rhetoric of liberty and America’s reality of sexual subordination, it laid one of the intellectual bases on which an organized self-conscious feminism could begin to take shape. Mary Beth Norton has gone further, finding that the Revolution brought a dramatic shift as women moved from submission to a world over which they had no control to assertion of their own power in that world. At the Revolution’s end the women who had lived through it were no longer content to be “good wives,” ignorant of the larger world and accepting their exclusion from it. Instead, they were reading newspapers, discussing politics, disputing with the men in their lives, and seeing that their daughters had the best educations possible.[48] Blacks, too, found their lives changed. They were no more a single group than whites, and their experiences varied immensely.
Most were to suffer new forms of slavery, as the cotton South opened up after 1793, but a minority were more fortunate. During the war some escaped to British lines and eventually to Canada; some were manumitted by masters briefly conscious of the claims of liberty; others benefited from emancipation in the North, whether rapid or gradual, in the generation following the Revolution. These new freedmen took the first steps towards making a collective entity of black America—not least, the founding of the Black Protestant churches from which a Martin Luther King would eventually emerge.[49]
The main story of the Revolution, of course, was played out among white males, and what happened in the short run to women and to blacks is peripheral. It is not, however, peripheral to understanding the Revolution’s long-range effects. The Revolution began with an effort to stave off unwanted changes in how the colonists lived, but it ended by establishing political and social arrangements that led to ceaseless change.[50] In some ways the Federalists of 1787 were trying to put an end to such change when they wrote the Constitution. They were convinced that the surge of ordinary men into public affairs had simply made a mess of American life. That is why they established a government in which, they hoped, men of broad vision, men like themselves, would be in power. They fully expected that a government more distant from the people in spatial and social terms would be less responsive to them in political terms. They wanted to end direct political involvement for the sake of bringing about the legal and economic stability that they believed was necessary to realize their vision of America’s future.[51]
But though they created a strong government, they still found that they needed popular involvement to make it work. That is Madison’s whole point in the tenth Federalist. One might well argue that it was precisely ordinary people’s involvement and confidence that they could control their world that gave the federal government its enduring strength as the United States conquered a continent, waged a civil war and industrialized. The Revolution established conditions that would lead within a century to the triumph of corporate capitalism, but it also established conditions in which groups that organized around their own interests could, sometimes, use the power of the state itself to realize those interests. It may well be that America’s continuing political stability rests on the interplay between a political order committed at its core to the protection of social arrangements centered on the acquisition and the development of private property, and the fact that large numbers of the republic’s citizens have good reason to believe that that political order is their own. Both elements in that interplay were achieved during and as a result of the Revolution.
6: Appendix
The Artisans in the Revolution
Throughout the Revolutionary era the artisans (or “mechanics”) of America’s towns were on the political stage. The changing issues in which they involved themselves, the changing language that they and their spokesmen used, and the changing means they employed to express their concerns show graphically how one group changed the terms of its involvement in American society during the Revolution.
The artisans were not a working class in the modern sense of the term. Though they worked with their hands, producing goods for others to buy and use, they were not employees in capitalist enterprises. Rather, they did their work in small shops, each presided over by a master of the trade, each including independent journeymen hired for a term as well as apprentices learning how it was done. Journeymen and apprentices alike looked forward to becoming masters themselves. Artisans formed their identities in terms of their crafts, and during the Revolution they formed those identities in terms of a claim that they should enjoy political rights equal to those of anyone else. We can watch that claim taking shape, as first they asserted their right to involvement in the movement, then realized their power to speak out in their own interest, and finally took a central part in the making of the Republic.
Artisan involvement first became an issue during the boycott of British goods with which the colonies resisted the Townshend taxes. The non-importation movement was good for American mechanics, because it assured that there would be a market for their own goods. But when non-importation began to crumble in 1770, artisans found that they and their merchant allies had different views. For the merchants, it meant, at last, a chance to resume their businesses, and they claimed the right to decide for themselves when they would do it. For the tradesmen it meant competition once again from British goods, and they claimed a say in whether the boycott should end. Here is how one Philadelphia broadside of 1770, signed “A Tradesman,” made the case:
And will you suffer the Credit and Liberties of the Province of Pennsylvania to be sacrificed to the Interests of a few Merchants in Philadelphia? Shall the GRAND QUESTION, whether America shall be free or not, be determined by a few Men, whose Support and Importance must always be in Proportion to the Distresses of our Country? . . . In determining Questions of such great Consequence, the Consent of the Majority of the Tradesmen, Farmers and other Freemen . . . should have been obtained . . . . The Tradesmen who have suffered by the Non-Importation Agreement are but few, when compared to the Number of those who have received great Benefit from it . . . . I conjure you by the Love you bear to yourselves - to your country - to your posterity and above all by the Homage you owe to HUMAN LIBERTY, not to surrender . . . but to assert your Freedom at the Expence of your Fortunes and your Blood.[52]
During the Independence crisis artisans openly assumed a separate station in political life. In Worcester, Massachusetts, blacksmiths resolved not to “do or perform, any blacksmith’s work or business . . . for any person or persons whom we esteem enemies to this country” and recommended “to all denominations of artificers that they call meetings of their respective craftsmen . . . and enter into associations and agreements” to do the same.[53] The smiths were acting for the common cause, but by the end of the 1770s, as inflation ravaged the American economy, other mechanics were looking to their own interests. In Philadelphia the leather trades decided in 1779 that they wanted an end to public efforts to control the market, and they told other Pennsylvanians why:
The committee . . . hint that their fixing the prices of our commodities first, was in a great measure to give us “the preference of setting the first example as a rule for other trades, for though only one was mentioned, all were intentionally and inclusively regulated.” And we would gladly have made that honour our own, by a compliance, did not we see . . . that any partial regulation of any number of articles would answer no end but that of destroying the tradesman whose prices are limited, and . . . leaving the country in absolute want of those articles.[54]
Many artisans would not have agreed with the leatherworkers’ insistence that the free market should determine the price of all goods. But they would have endorsed emphatically the fact that the tanners and curriers and shoemakers were thinking for themselves, calculating their own interests. The many artisans, all over America, who supported the Federal Constitution in 1787 and 1788 were doing the same thing. Here is a meeting of Boston tradesmen, announcing its position:
It is our opinion, if said Constitution should be adopted . . . Trade and Navigation will revive and increase, employ and subsistence will be afforded to many of our Townsmen, who are now suffering from want of the necessaries of life; that it will promote industry and morality, render us respectable as a nation; and procure us all the blessings to which we are entitled from the natural wealth of our country, our capacity for improvement, from our industry, our freedom and independence.[55]
Mechanics throughout the country showed that they agreed when they joined in massive parades to celebrate as their states declared ratification. In Charleston, in New York, in Philadelphia, in Baltimore, in Boston and in lesser towns as well, men marched behind the banners of their crafts. Carrying their tools, and even pulling floats on which some of their number were putting them to use, they displayed slogans that proclaimed their republican American identity and their pride in themselves. “Both Buildings and Rulers Are the Work of Our Hands,” announced the Philadelphia bricklayers. “Time Rules All Things,” proclaimed the clockmakers. “By Hammer and Hand, All Arts Do Stand,” read the banner of the smiths. “May Our Country Never Want Bread,” said the bakers.[56]
Twenty years earlier, such parades would not have been possible. They were patriotic celebrations, proclaiming the delight of people who had once been colonial subjects and had now become republican citizens. But they were also celebrations of the self, proclaiming the pride in being what they were that the artisans had built as they waged their Revolution.
7: Guide to Further Reading
Readers requiring a narrative introduction to the Revolution should turn to one of the following one-volume studies written specifically to introduce the subject: Esmond Wright, Fabric of Freedom, 1763-1800 (London: Macmillan, 1965; rev. edn., New York: Hill & Wang, 1978); Edmund S. Morgan, The Birth of the Republic, 1763-1789 (Chicago: Univ. of Chicago Press, 1956); and J.R. Pole, Foundations of American Independence, 1763-1815 (Indianapolis: Bobbs-Merrill, 1972; London: Fontana-Collins, 1973).
Those who wish to explore the Revolution more widely and deeply are presented with two possible reading strategies. One is to begin with the classics and work through the historiography to the present day. Two anthologies can serve as starting places for the reader following this route. One is Jack P. Greene, ed., The Reinterpretation of the American Revolution, 1763-1789 (New York: Harper & Row, 1968), which contains a superb bibliographical essay and an excellent collection of mid-twentieth-century essays and excerpts. The other is Edmund S. Morgan, ed., The American Revolution: Two Centuries of Interpretation (Englewood Cliffs, N.J.: Prentice-Hall, 1965), which has selections from eighteenth-, nineteenth- and twentieth-century writers. Selections from the writings of the Imperial and the Progressive historians, not included in either Greene or Morgan, can be found in Esmond Wright, ed., Causes and Consequences of the American Revolution (Chicago: Quadrangle, 1966). All these anthologies provide samples from the original major works which the reader can move on to as he or she chooses.
The other reading strategy is to begin with the contemporary debate. In that case, a starting place can be found in Pauline Maier, From Resistance to Revolution: Colonial Radicals and the Development of American Opposition to Britain, 1765-1776 (1972),[11]* and in the anthology edited by Alfred F. Young entitled The American Revolution: Explorations in the History of American Radicalism (1976).[32]* Maier and the writers in the Young book consider many of the same problems and much of the same evidence, but with different points of view, emphases and conclusions. The contrasts between them in many ways pose the framework for contemporary discussion.
The approach to the Revolution from the point of view of political culture receives its central statement in Bernard Bailyn, The Ideological Origins of the American Revolution (1967).[12] The origins of that culture are considered at length in J.G.A. Pocock, The Machiavellian Moment: Florentine Political Thought and the Atlantic Republican Tradition (1975).[10] What became of it after independence is discussed at equal length, and most revealingly, in Gordon S. Wood, The Creation of the American Republic, 1776-1787 (1969).[11] J.R. Pole writes within much the same framework in Political Representation in England and the Origins of the American Republic (1966),[30] which incorporates detailed studies of the key states of Pennsylvania, Virginia and Massachusetts. Somewhat older, but still useful, is Clinton Rossiter, Seedtime of the Republic (1953).[10] The link between ideas and action is considered in Maier, From Resistance to Revolution, in Edmund S. and Helen M. Morgan, The Stamp Act Crisis: Prologue to Revolution (Chapel Hill: Univ. of North Carolina Press, 1953), and, from a different viewpoint, in Eric Foner, Tom Paine and Revolutionary America (1976).[38]
A number of scholars have considered the social, as opposed to intellectual, background to the Revolution. Jack P. Greene has written on it in a number of places, but most succinctly in his Harmsworth inaugural lecture at Oxford, All Men Are Created Equal: Some Reflections on the Character of the American Revolution (Oxford: Clarendon Press, 1976). Richard Hofstadter deals with it in America at 1750: A Social Portrait (New York: Knopf, 1971), as do Jackson Turner Main in The Social Structure of Revolutionary America (1965)[13] and Kenneth Lockridge in Settlement and Unsettlement in Early America (Cambridge, England: Cambridge UP, 1981). Perhaps the most sophisticated single statement is James A. Henretta, The Evolution of American Society, 1700-1815 (1973),[17] which should be supplemented by his essay “Families and Farms: Mentalité in Pre-Industrial America,” William and Mary Quarterly, 35 (1978), 3-32, and by Michael Merrill, “Cash is Good to Eat: Self-Sufficiency and Exchange in the Rural Economy of the United States,” Radical History Review, no. 4 (1977), 42-71.
The testing ground for the broad generalizations in these studies is to be found in analyses of communities and states. One of the most easily grasped, yet most sophisticated, is Robert Gross’s The Minutemen and Their World (1976), a study of Concord, Massachusetts. A number of other studies are roughly congruent with Gross’s approach, likewise finding significant social tensions within the Revolutionary movement. These include, working from north to south, the following: Richard L. Bushman, From Puritan to Yankee (1967),[31] on Connecticut; Edward Countryman, A People in Revolution, 1760-1790 (1981),[15] on New York; Rhys Isaac, The Transformation of Virginia, 1740-1790 (1982);[14] Ronald Hoffman, A Spirit of Dissension (1973),[32] on Maryland; and Jerome J. Nadelhaft, The Disorders of War: The Revolution in South Carolina (1981).[34] Gary B. Nash, The Urban Crucible (1979)[14] discusses and compares developments in the three major cities.
Against these, however, should be read studies which do not find that social tensions contributed significantly to the Revolutionary movement. The most seminal is Robert E. Brown, Middle-Class Democracy and the Revolution in Massachusetts (1955).[6] This can be read in conjunction with Patricia U. Bonomi, A Factious People: Politics and Society in Colonial New York (1971);[16] Sung Bok Kim, Landlord and Tenant in Colonial New York: Manorial Society, 1664-1775 (Chapel Hill: Univ. of North Carolina Press, 1978); Jere R. Daniell, Experiment in Republicanism: New Hampshire Politics and the American Revolution, 1741-1794 (Cambridge, Mass.: Harvard UP, 1970); and A. Roger Ekirch, “Poor Carolina”: Politics and Society in Colonial North Carolina, 1729-1776 (1981).[16] A view of urban development that conflicts with Nash’s can be found in G.B. Warden, “Inequality and Instability in Eighteenth Century Boston: A Reappraisal,” Journal of Interdisciplinary History, 6 (1975-76), 585-620. In all of these studies, conflict is ascribed to tensions within middle-class society rather than—as in this pamphlet—to the working-out of fundamental contradictions.
The radicals of the Revolution are considered in a number of places. Pauline Maier supplements From Resistance to Revolution (1972)[11] with her The Old Revolutionaries: Political Lives in the Age of Samuel Adams (1980).[25] Eric Foner’s study of Tom Paine and Revolutionary America (1976)[38] places its subject firmly in social context. The Revolutionary committees are considered, in different ways, not only in Gordon Wood’s Creation of the American Republic and Countryman’s A People in Revolution, but also in Richard D. Brown, Revolutionary Politics in Massachusetts: The Boston Committee of Correspondence and the Towns, 1772-1774 (Cambridge, Mass.: Harvard UP, 1970), and Richard Alan Ryerson, The Revolution Is Now Begun: The Radical Committees of Philadelphia, 1765-1776 (1978).[35] Crowd action is a major subject of the books by Maier, Foner, Nash, Countryman, Ekirch, Bonomi and Kim noted above, but the most important study is Dirk Hoerder, Crowd Action in Revolutionary Massachusetts, 1765-1780 (1977).[26]
Three main studies now encompass the debate on women and the Revolution. They are: Joan Hoff Wilson, “The Illusion of Change: Women and the American Revolution,” in Young, ed., The American Revolution,[32] pp. 383-445; Mary Beth Norton, Liberty’s Daughters: The Revolutionary Experience of American Women, 1750-1800 (1980);[18] and Linda Kerber, Women of the Republic: Intellect and Ideology in Revolutionary America (1980).[48] The superb anthology edited by Ira Berlin and Ronald Hoffman, Slavery and Freedom in the Age of the American Revolution (1983),[49] represents the most advanced thinking on the experience of Black Americans in the era. The subject is discussed at length in David Brion Davis, The Problem of Slavery in the Age of Revolution, 1770-1823 (Ithaca: Cornell UP, 1975), Duncan J. MacLeod, Slavery, Race and the American Revolution (Cambridge, England: Cambridge UP, 1974) and Benjamin Quarles, The Negro in the American Revolution (Chapel Hill: Univ. of North Carolina Press, 1961).
It is perhaps not surprising, given the attention that modern scholarship has paid to the ideology and the social process of the Revolution, that very little work has considered the American colonies in the light of contemporary debates on colonialism, underdevelopment and dependency. There is nothing to match the analysis of Stanley and Barbara Stein in The Colonial Heritage of Latin America: Essays on Economic Dependence in Perspective (New York: Oxford UP, 1970). Three recent books, however, present a suggestive start, in different ways: Paul G.E. Clemens, The Atlantic Economy and Colonial Maryland’s Eastern Shore: From Tobacco to Grain (Ithaca: Cornell UP, 1980); Joseph A. Ernst, Money and Politics in America, 1755-1775: A Study in the Currency Act of 1764 and the Political Economy of Revolution (Chapel Hill: Univ. of North Carolina Press, 1973), and Michael Kammen, Empire and Interest: The American Colonies and the Politics of Mercantilism (Philadelphia: Lippincott, 1970). Possibilities for further study along these lines are discussed in Edward Countryman and Susan Deans, “Independence and Revolution in the Americas: A Project for Comparative Study” (1983).[46] The problem of unequal colonial relations lurks behind the rich analysis in Edmund S. Morgan, American Slavery, American Freedom: The Ordeal of Colonial Virginia (New York: Norton, 1975).
The Bicentennial decade produced a sizable number of excellent anthologies on the Revolution. In addition to Alfred F. Young’s The American Revolution (1976),[32] these include: Stephen G. Kurtz and James H. Hutson, eds., Essays on the American Revolution (1973);[12] Erich Angermann et al., eds., New Wine in Old Skins: A Comparative View of Socio-Political Structures and Values Affecting the American Revolution (1976);[39] Jack P. Greene and Pauline Maier, eds., Interdisciplinary Studies of the American Revolution (Beverly Hills: Sage, 1976); Richard Maxwell Brown and Don E. Fehrenbacher, eds., Tradition, Conflict and Modernization: Perspectives on the American Revolution (1977);[24] Richard M. Jellison, ed., Society, Freedom and Conscience: The American Revolution in Virginia, Massachusetts and New York (1976); and Ronald Hoffman and Peter J. Albert, eds., Sovereign States in an Age of Uncertainty (1982).[41]
Finally, Alfred F. Young traces the Revolutionary experience of one Boston shoemaker in his prize-winning essay “George Robert Twelves Hewes: A Boston Shoemaker and the Memory of the American Revolution,” William and Mary Quarterly, 38 (1981), 561-623. Hewes’s career led him through so many of the Revolution’s facets and transformations that it comes as close as anyone’s to epitomizing what the era meant for Everyman.
* For full bibliographical details, see the appropriate reference in the Notes, as indicated.
8: Notes
- E.g., Staughton Lynd, “Abraham Yates’s History of the Movement for the United States Constitution,” in his Class Conflict, Slavery and the United States Constitution (Indianapolis: Bobbs-Merrill, 1967), pp. 217-46. Back
- George Bancroft, History of the United States, 6 vols. (1834-76; rev. edn., N.Y.: Appleton, 1883-85). Back
- For the most succinct statements, see Charles M. Andrews, Colonial Background of the American Revolution (New Haven: Yale UP, 1924) and L.H. Gipson, “The American Revolution as an Aftermath of the Great War for Empire, 1754-1763,” Political Science Quarterly, 65 (1950), 86-104. Back
- Carl L. Becker, The History of Political Parties in the Province of New York, 1760-1776 (Madison: Univ. of Wisconsin Press, 1909). Back
- Charles A. Beard, An Economic Interpretation of the Constitution of the United States (N.Y.: Macmillan, 1913). Back
- The most important is Robert E. Brown, Middle-Class Democracy and the Revolution in Massachusetts (Ithaca: Cornell UP, 1955). Back
- Forrest McDonald, We the People: The Economic Origins of the Constitution (Chicago: Univ. of Chicago Press, 1958). Back
- Carl L. Becker, The Declaration of Independence (New York: Knopf, 1922). Back
- E.S. Morgan, “Colonial Ideas of Parliamentary Power, 1764-66,” William and Mary Quarterly, 5 (1948), 311-41. Back
- Clinton Rossiter, Seedtime of the Republic (N.Y.: Harcourt, Brace and World, 1953); D.G. Adair, “Experience Must Be Our Only Guide: History, Democratic Theory, and the United States Constitution,” in Ray Allen Billington, ed., The Reinterpretation of Early American History: Essays in Honor of John Edwin Pomfret (San Marino, Cal.: Huntington Library, 1966); Cecelia Kenyon, ed., The Antifederalists (Indianapolis: Bobbs-Merrill, 1966); J.G.A. Pocock, The Machiavellian Moment: Florentine Political Thought and the Atlantic Republican Tradition (Princeton, N.J.: Princeton UP, 1975). Back
- Gordon S. Wood, The Creation of the American Republic, 1776-1787 (Chapel Hill: Univ. of North Carolina Press, 1969); Pauline Maier, From Resistance to Revolution: Colonial Radicals and the Development of American Opposition to Britain, 1765-1776 (N.Y.: Knopf, 1972). Back
- See esp. Bernard Bailyn, The Ideological Origins of the American Revolution (Cambridge, Mass.: Harvard UP, 1967); and, for a summary of the “ideological” interpretation, his essay on “The Central Themes of the American Revolution,” in Stephen G. Kurtz and James H. Hutson, eds., Essays on the American Revolution (Chapel Hill: Univ. of North Carolina Press, 1973), pp. 3-31. Back
- Merrill Jensen, The Articles of Confederation (Madison: Univ. of Wisconsin Press, 1940), The New Nation: A History of the United States During the Confederation, 1781-1789 (N.Y.: Knopf, 1950), The Founding of a Nation: A History of the American Revolution, 1763-1776 (N.Y.: Oxford UP, 1968). For a summary of Main’s position, see his The Social Structure of Revolutionary America (Princeton, N.J.: Princeton UP, 1965). Back
- Gary B. Nash, The Urban Crucible: Social Change, Political Consciousness, and the Origins of the American Revolution (Cambridge, Mass.: Harvard UP, 1979); Rhys Isaac, The Transformation of Virginia, 1740-1790 (Chapel Hill: Univ. of North Carolina Press, 1982). Back
- Edward Countryman, A People in Revolution: The American Revolution and Political Society in New York, 1760-1790 (Baltimore: Johns Hopkins UP, 1981). Back
- For an overview of colonial political life, see Bernard Bailyn, The Origins of American Politics (N.Y.: Vintage, 1970). This chapter draws heavily on the following state studies, in rough geographical order: Michael Zuckerman, “The Social Context of Democracy in Massachusetts,” William and Mary Quarterly, 25 (1968), 523-44, reprinted in Stanley N. Katz, ed., Colonial America: Essays in Politics and Social Development (Boston: Little, Brown, 1971), pp. 466-91; Patricia U. Bonomi, A Factious People: Politics and Society in Colonial New York (N.Y.: Columbia UP, 1971), and Countryman, A People in Revolution, ch. 3; Gary B. Nash, Quakers and Politics: Pennsylvania, 1681-1726 (Princeton, N.J.: Princeton UP, 1968); Isaac, Transformation of Virginia; A. Roger Ekirch, “Poor Carolina”: Politics and Society in Colonial North Carolina, 1729-1776 (Chapel Hill: Univ. of North Carolina Press, 1981); R.M. Weir, “‘The Harmony We Were Famous For’: An Interpretation of Pre-Revolutionary South Carolina Politics,” William and Mary Quarterly, 26 (1969), 473-501. Back
- James A. Henretta, The Evolution of American Society, 1700-1815: An Interdisciplinary Analysis (N.Y.: Heath, 1973). Back
- Laurel Thatcher Ulrich, Good Wives: Image and Reality in the Lives of Women in Northern New England, 1650-1750 (N.Y.: Knopf, 1982); Mary Beth Norton, Liberty’s Daughters: The Revolutionary Experience of American Women, 1750-1800 (Boston: Little, Brown, 1980), Part 1. Back
- Nash, Urban Crucible, pp. 233-63, 312-38. Back
- Kenneth Lockridge, “Land, Population and the Evolution of New England Society, 1630-1790,” Past & Present, No. 39 (1968), 62-80, reprinted with an afterthought in Katz, Colonial America, pp. 466-91. Back
- For the most recent analysis of the administration of the empire, see James A. Henretta, “Salutary Neglect”: Colonial Administration Under the Duke of Newcastle (Princeton, N.J.: Princeton UP, 1972); also Jack P. Greene, “An Uneasy Connection: An Analysis of the Preconditions of the American Revolution,” in Kurtz and Hutson, eds., Essays on the Revolution, pp. 32-80. For the colonial elite, see Stanley N. Katz, Newcastle’s New York: Anglo-American Politics, 1732-1753 (Cambridge, Mass.: Harvard UP, 1968); Charles S. Sydnor, Gentlemen Freeholders: Political Practices in Washington’s Virginia (Chapel Hill: Univ. of North Carolina Press, 1952). Back
- Pauline Maier, “Popular Uprisings and Civil Authority in Eighteenth-Century America,” William and Mary Quarterly, 27 (1970), 3-35, and in her From Resistance to Revolution, ch. 1; Jesse Lemisch, “Jack Tar in the Streets: Merchant Seamen in the Politics of Revolutionary America,” William and Mary Quarterly, 25 (1968), 371-407. Back
- Nash, pp. 129-36; Countryman, pp. 56-58; E.P. Thompson, “The Moral Economy of the English Crowd in the Eighteenth Century,” Past & Present, No. 50 (1971), 76-136. Back
- For an overview, see R.M. Brown, “Back Country Rebellions and the Homestead Ethic in America, 1740-1799,” in Richard Maxwell Brown and Don E. Fehrenbacher, eds., Tradition, Conflict and Modernization: Perspectives on the American Revolution (N.Y.: Academic Press, 1977). Back
- Maier, From Resistance to Revolution and The Old Revolutionaries: Political Lives in the Age of Samuel Adams (N.Y.: Knopf, 1980). Back
- Dirk Hoerder, Crowd Action in Revolutionary Massachusetts, 1765-1780 (N.Y.: Academic Press, 1977), ch. 2; Maier, From Resistance to Revolution, ch. 3; Countryman, ch. 2. Back
- R.L. Bushman, “Massachusetts Farmers and the Revolution,” in Richard M. Jellison, ed., Society, Freedom and Conscience: The American Revolution in Virginia, Massachusetts and New York (N.Y.: Norton, 1976); Richard D. Brown, Revolutionary Politics in Massachusetts: The Boston Committee of Correspondence and the Towns, 1772-1774 (Cambridge, Mass.: Harvard UP, 1970); Robert A. Gross, The Minutemen and Their World (N.Y.: Hill & Wang, 1976). See also John M. Murrin’s creative “Review Essay” on the early New England social literature, History and Theory, 11 (1972), 226-75. Back
- For an argument that the era saw a general social crisis in rural America, see Henretta, Evolution of American Society, ch. 4. Back
- Countryman, chs. 2, 4, 5. Back
- J.R. Pole, Political Representation in England and the Origins of the American Republic (London: Macmillan, 1966), pp. 285, 294-95. Back
- Isaac, Transformation of Virginia, chs. 7-11; E.J. Hobsbawm, Primitive Rebels: Studies in Archaic Forms of Social Movement in the 19th and 20th Centuries (Manchester: Manchester UP, 1959). For the impact of the Great Awakening, see Richard L. Bushman, From Puritan to Yankee: Character and the Social Order in Connecticut, 1690-1765 (Cambridge, Mass.: Harvard UP, 1967). Back
- J.P. Whittenburg, “Planters, Merchants and Lawyers: Social Change and the Origins of the North Carolina Regulation,” William and Mary Quarterly, 34 (1977), 215-38; Ronald Hoffman, A Spirit of Dissension: Economics, Politics and the Revolution in Maryland (Baltimore: Johns Hopkins UP, 1973); Edward Countryman, “‘Out of the Bounds of the Law’: Northern Land Rioters in the Eighteenth Century,” in Alfred F. Young, ed., The American Revolution: Explorations in the History of American Radicalism (DeKalb, Ill.: Northern Illinois UP, 1976), pp. 37-69; Stephen A. Marini, Radical Sects of Revolutionary New England (Cambridge, Mass.: Harvard UP, 1982). Back
- For the continuity argument, see Clinton Rossiter, The First American Revolution: The American Colonies on the Eve of Independence (N.Y.: Harcourt, Brace and World, 1956); Daniel J. Boorstin, The Genius of American Politics (Chicago: Univ. of Chicago Press, 1953); and Benjamin F. Wright, Consensus and Continuity, 1776-1787 (Boston: Boston UP, 1958). Back
- Isaac, ch. 11; Jerome J. Nadelhaft, The Disorders of War: The Revolution in South Carolina (Orono, Maine: Univ. of Maine Press, 1981), ch. 1. Back
- Richard A. Ryerson, The Revolution Is Now Begun: The Radical Committees of Philadelphia, 1765-1776 (Philadelphia: Univ. of Pennsylvania Press, 1978). Back
- Countryman, A People in Revolution, pp. 131-60. Back
- Ibid.; Ryerson, The Revolution Is Now Begun, pp. 122-24. Back
- Eric Foner, Tom Paine and Revolutionary America (N.Y.: Oxford UP, 1976). Back
- Countryman, pp. 162-63. See also Ryerson, chs. 4, 8; and Dirk Hoerder, “Socio-Political Structures and Popular Ideology, 1750s-1780s,” in Erich Angermann et al., eds., New Wine in Old Skins: A Comparative View of Socio-Political Structures and Values Affecting the American Revolution (Stuttgart: Ernst Klett, 1976), pp. 41-65. Back
- See Foner, esp. ch. 5, and Countryman, ch. 6. For the origins of free-market thought, see Joyce O. Appleby, Economic Thought and Ideology in Seventeenth-Century England (Princeton, N.J.: Princeton UP, 1978); C.B. MacPherson, The Political Theory of Possessive Individualism: Hobbes to Locke (Oxford: Clarendon Press, 1962). Back
- The following paragraphs draw heavily on the following state studies: R.A. Ryerson, “Republican Theory and Partisan Reality in Revolutionary Pennsylvania: Toward a New View of the Constitutionalist Party,” in Ronald Hoffman and Peter J. Albert, eds., Sovereign States in an Age of Uncertainty (Charlottesville: Univ. Press of Virginia, 1982), pp. 95-133; for Maryland, Hoffman, A Spirit of Dissension, chs. 8, 9; for Massachusetts, David P. Szatmary, Shays’ Rebellion: The Making of an Agrarian Insurrection (Amherst: Univ. of Massachusetts Press, 1980), and Pole, Political Representation, pp. 226-43; Countryman, chs. 7-9, on New York, and Isaac on Virginia; and, for South Carolina, Nadelhaft, The Disorders of War. Back
- J.T. Main, “Government by the People: The American Revolution and the Democratization of the Legislatures,” William and Mary Quarterly, 23 (1966), 391-407. See also Main’s Political Parties Before The Constitution (Chapel Hill: Univ. of North Carolina Press, 1973). Back
- Wood, Creation of the American Republic, Parts 4 and 5. Back
- Countryman, chs. 9 and 10; Alfred F. Young, The Democratic Republicans of New York: The Origins, 1763-1797 (Chapel Hill: Univ. of North Carolina Press, 1967), pp. 101, 387. For the artisans’ progress from protest through self-assertion to active citizenship, see the materials in the Appendix. Back
- See Alan Kulikoff, “The Progress of Inequality in Revolutionary Boston,” William and Mary Quarterly, 28 (1971), 375-412. Back
- See Edward Countryman and Susan Deans, “Independence and Revolution in the Americas: A Project for Comparative Study,” Radical History Review, No. 27 (1983), 144-71. Back
- D.G. Adair, “That Politics May Be Reduced to a Science: David Hume, James Madison and the Tenth Federalist,” Huntington Library Quarterly, 20 (1957), 343-60. Back
- Linda K. Kerber, Women of the Republic: Intellect and Ideology in Revolutionary America (Chapel Hill: Univ. of North Carolina Press, 1980); Norton, Liberty’s Daughters. Back
- Ira Berlin and Ronald Hoffman, eds., Slavery and Freedom in the Age of the American Revolution (Charlottesville: Univ. Press of Virginia, 1983). Back
- For such an argument in the context of a particular community, see Gross, The Minutemen and Their World. Back
- Wood, Creation of the American Republic, ch. 12. Back
- “A Tradesman,” broadside (Philadelphia, 1770), no. 11892 in Charles Evans, American Bibliography: A Chronological Dictionary of All Books, Pamphlets and Periodical Publications Printed in the United States of America, 1639-1800, 14 vols. (Chicago, 1903-59). Back
- Blacksmiths’ minutes, 8 Sept., 8 Nov. 1774, Massachusetts Collection, American Antiquarian Society, Worcester, Mass. Back
- “Tanners, Curriers and Cordwainers to the Inhabitants of Pennsylvania,” broadside (Philadelphia, 1779), Evans no. 16547. Back
- Boston Independent Chronicle, 13 Dec. 1787. Back
- Pennsylvania Gazette, reported in Connecticut Courant, 21 July 1788. Back
Edward A. Abramson, The Immigrant Experience in American Literature
BAAS Pamphlet No. 10 (First Published 1982)
ISBN: 0 946488 00 2
- Immigrants and Literature
- The Frontier and Scandinavian Pioneers
- Rolvaag and Moberg
- Lesser Lights – and Willa Cather
- The City and the Jewish Tradition
- Irish, Italians and Jews
- The Earlier Jewish-American Writers
- The Jewish-American Novel Since 1945
- Literature and Immigrants
- Guide to Further Reading
- Notes
British Association for American Studies All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: Immigrants and Literature
He is an American, who, leaving behind him all his ancient prejudices and manners, receives new ones from the mode of life he has embraced, the new government he obeys, and the new rank he holds. He becomes an American by being received in the broad lap of our great Alma Mater. Here individuals of all nations are melted into a new race of men, whose labours and posterity will one day cause great changes in the world.
Hector St. John de Crevecoeur, 1782 [1]
The immigrant experience constitutes a central shaping force of American culture. Many of the myths which form an inherent part of the idea of America originate in the dreams and experiences of the millions who migrated from the Old to the New World. These have been related and analysed not only by historians but also by literary artists, who have presented a more subjective, more humanized, sometimes more didactic, depiction of the immigrant past. As Henry Seidel Canby has written,
Literature can be used, and has been magnificently used by Americans, in the service of history, of science, of religion, or of political propaganda. It has no sharp boundaries, though it passes through broad margins from art into instruction or argument. The writing or speech of a culture such as ours which has been so closely bound to the needs of a rapidly growing, democratic nation, moves quickly into the utilitarian, … in a literature which is most revealing when studied as a by-product of American experience.[2]
In much literature written by Europeans as well as in that of their American descendants, there is a wealth of writing about the immigrant experience. It is not always the product of direct observation, quite often being written by those who stayed behind or by the descendants of the actual immigrants. Different national groups have been associated with different periods of immigration and specific themes. In British literature it mostly takes the form of writing about the problems of settlement and colonization. In French writing it is more often associated with a romantic dream of freedom as in the agrarian optimism of Crevecoeur or the sentimentalization of the noble savage in Chateaubriand.
In the nineteenth century the experience itself was broadened as it became associated with the migrations of different classes and national types: Germans, Irish, Italians, East and Middle Europeans and Scandinavians, to name the most important groups. The phenomenon also assumed massive proportions, with thirty-five million immigrants arriving in the United States between 1800 and 1924 – the largest voluntary migration of people in history. The literature produced by this wave of immigration – whether autobiographical or fictional, whether written by participants or their descendants or by those whose imagination was fired by contact with immigrants – tends to concentrate rather on the problems of adjustment to a new environment and assimilation into a society that had already established a definite white ‘Anglo-Saxon’ and Protestant (or WASP) character.
There have been two distinctive traditions in this last stream of immigrant writing, each of them focusing on one of the two essential destinations of the immigrant – the land or the city. It is of course no accident that the Scandinavians were primarily responsible for developing the literature about immigrant settlement on the land, especially in the West, nor that Jewish immigrant writers took for their preserve the American city with its promise of freedom and problems of assimilation and secularization. The contribution of these two literatures to American thought and culture is incalculable, and it is with them and associated works that this essay will primarily concern itself.
2: The Frontier and Scandinavian Pioneers
The West and the frontier, by the mid-nineteenth century, had come to occupy a central place in the American imagination. Just as generations of Europeans had long believed that a land of opportunity lay across the Atlantic, so Americans saw great opportunities beckoning them to cross the Appalachians into the great Mississippi valley. Not merely was the West the route to fabled wealth, but this fertile heartland could be transformed into an agrarian paradise, a new Garden of Eden tilled by yeoman farmers, economically self-sufficient and spiritually enriched by their closeness to Nature. This Jeffersonian attitude found its expression in literature, in James Fenimore Cooper’s Leatherstocking, the epitome of the “natural gentleman,” and in Walt Whitman’s aspirations for a thrusting, democratic people. These optimistic views of frontier life were expressed, however, at a time when most pioneers were American-born and the area being settled lay primarily east of the Mississippi. Not until after the Civil War did immigrants, mainly from Northern Europe, contribute significantly to westward expansion, on the prairies and plains west of the great river. The imaginative writers marked by this immigrant experience significantly changed the way the conquest of the wilderness gained expression in works of literature.
Scandinavians, in particular, played a major role in the settlement of the Far West. Neither Norway nor Sweden could support its population because of the nature of its laws of inheritance and a dearth of productive soil. “Between 1825 and 1914 … Norway lost a higher proportion of her people through emigration to America than any other European country except Ireland.”[3] In Sweden, the 1860s saw both an economic crisis and three years of crop failures which came to be known as “The Great Famine” (1865-68). Thousands of Swedes left for the New World at this time, attracted by the cheap or free land provided by the Homestead Act of 1862, or offered by the railroad companies to intending settlers.
Rolvaag and Moberg
Although many novels were written during the nineteenth century by Scandinavian-American authors, especially Norwegians, very few have been translated and the literary value of those that have is not high. It is largely in the twentieth century that fiction of literary distinction describing Scandinavian settlement in America has been created. Most notably, Ole Edvart Rolvaag and Vilhelm Moberg both wrote trilogies describing the experiences of their forebears in America – Rolvaag for the Norwegians, Moberg for the Swedes. In both trilogies, it is the first novels – Giants in the Earth (1927) and The Emigrants (1951) – which are most successful as literature and deserve most attention. Both authors wrote about immigrant pioneers, and stressed the frustrations of the journey from Europe or across America or both, the difficulties of actually settling on the frontier, the importance of the family for survival, and the conflict which developed between the first and second generations. The issue of assimilation into American society was also an important concern to Rolvaag and to some other Scandinavian authors.
Though concerned with the fate of individual characters, Rolvaag and Moberg both stress the importance of the survival and success of the group in forming outposts of the Old World in the New and eventually relinquishing Europe to embrace America. Giants in the Earth opens with Per Hansa and his party already pushing across the American prairie toward their new home; The Emigrants begins in the Old World and ends with the Swedish emigrants setting foot on American soil for the first time. Sophus Winther comments:
It is of value to note that the emigrant novel has a longer history than the picaresque story, that it is more profoundly rooted in human experience, and it concerns itself with the struggle of man for the primary needs of existence, with a representative group, sometimes a whole people. But whether the characters are few or many, their adventures grow out of a desire to escape from bondage into a new world where the spirit of man may achieve the freedoms necessary to a full life.[4]
This struggle is vividly depicted throughout both Rolvaag’s and Moberg’s trilogies in what is basically a realistic form: the meticulous presentation of characters, environment and their interrelation.
Giants in the Earth and The Emigrants successfully evoke the pioneer experience because of their authors’ powers of description and characterization. Rolvaag’s evocation of the prairie presents both its startling beauty and its horrific aspects. Eschewing understatement, he presents a winter evening in highly metaphorical and visually imagistic phrases:
Evenings … magic, still evenings, surpassing in beauty the most fantastic dreams of childhood! … Out to the westward – so surprisingly near – a blazing countenance sank to rest on a white couch … set it afire … kindled a radiance … a golden flame that flowed in many streams from horizon to horizon.
The numerous ellipses slow one’s reading of the passage and give a sense of things unsaid, of the ineffable quality of the experience described. Set against this description is one of a far different sort:
Monsterlike the Plain lay there – sucked in her breath one week, and the next week blew it out again. Man she scorned; his works she would not brook. … She would know, when the time came, how to guard herself and her own against him!

This sentience of the prairie pervades the novel to such an extent that the setting becomes an active element in the depiction of the characters, who must pit themselves against it.[5]
Moberg’s descriptive powers in The Emigrants are seen most clearly in his documentary depiction of conditions in Sweden for poor farmers. We are told that, in preparing to leave for America, “Kristina packed eight rye-meal loaves and twenty of barley, a wooden tub of strongly salted butter, two quarts of honey, one cheese, half a dozen smoked sausages, a quarter of smoked lamb, a piece of salt pork, and some twenty salted herrings.” But alongside this matter-of-fact listing of items, one also finds a highly sensuous prose:

The fire sparkled and all enjoyed the coziness of the inn after the cold road. Their senses as well as their limbs thawed. There was an odor of food and brannvin, snuff and chewing tobacco, greased leather and warm, wet wadmal [coarse woollen clothing], there was a fragrance of mothers’ milk as the women suckled the children.
The details are preserved, but are evocatively presented as they impinge on the consciousness of the characters. Such creative use of detail is more characteristic of Rolvaag’s style; Moberg more frequently resorts to the mere piling up of facts.[6]
Rolvaag presents American nature as inimical to man, and the characterization reflects the settlers’ abilities to respond to the challenges of this hostile environment. Thus, Beret mirrors Old World attitudes in her fatalistic and superstitious fear of this new environment; whereas, as Joseph Baker puts it, “her husband, Per Hansa, is a man of the West; he glories in the fact that he is an American free-willer, self-asserting. He rebels against destiny and tries to master nature.” Baker even sees the struggle presented in cosmic terms: “Here the pioneer struggle with the untamed universe may serve as a symbol for the condition of man himself against inhuman destiny.”[7]
The characterization in The Emigrants elicited some hostile criticism from Swedish critics on account of the dialogue of the peasants. The earthy language which they employed, particularly that of Ulrika, the parish whore, and Jonas Petter, who enjoys telling vulgar stories, scandalized some readers, despite Moberg’s insistence that this was the way they actually spoke. Indeed, it is the characterization which primarily accounts for the novel’s realism. Moberg manages to combine a great deal of factual information with a warmth of characterization, and by making empathy with the characters possible, the facts are humanized and dramatized.
One must be careful in appraising the literal truthfulness of the factual information presented in The Emigrants. All of Moberg’s aunts and uncles had gone to America, and there were many letters and photographs from them to which he had access. He relied upon answers to direct questions which he posed in letters to individuals who had been involved in the migration, and upon historical documents. Thus, the information was decidedly of second-hand origin. He did finally decide to emigrate in 1948, having almost done so over thirty years earlier. Indeed, part of The Emigrants was written in Sweden and part in America, the remaining volumes of what in Swedish was a tetralogy being written in the United States. Philip Holmes points out that

The documentary elements of the novel are partly based on historical evidence, and are partly Moberg’s own creation. So much is real that the remainder, the merely realistic, takes on the illusion of reality. For example Gunnar Eidevall made the surprising discovery that the population figures given in the novel for Ljuder on 1 January 1846 have no basis in reality.[8]
The essential truth of the novel, however, lies in Moberg’s depiction of the combination of historical and cultural forces which drove the emigrants from their homeland (poverty, a rigid class structure, religious persecution) and the emotional effects of these forces and of the uprooting to which they led. One critic has noted that “Moberg does not always succeed in subordinating his data to the lives of his people. At times he writes a sort of romantic history.”[9] At other times Moberg drifts into the style of a history where documentation takes over from the characters rather than providing support for them. This is why The Emigrants is ultimately not as successful artistically as Rolvaag’s earlier Giants in the Earth, where the characters are always kept in focus and the facts of their experiences are seen in terms of and never detached from their own inner lives.
The struggle for survival is dramatized in Giants in the Earth through a series of concretely realised events which stress rising action leading to dramatic tension. One instance of this occurs when Per Hansa appears to have planted his wheat too early and firmly believes that the frost has killed it all. When the first green shoots appear above the earth, Per Hansa’s boundless relief and thanksgiving is vividly dramatized:
There he stood spellbound, gazing at the sight spread before him. His whole body shook; tears came to his eyes, so that he found it difficult to see clearly. And well he might be surprised. Over the whole field tiny shoots were quivering in the warm sunshine. Store-Hans was standing now by his father’s side; he looked at him in consternation. “Are you sick, father?” No answer. “Why you’re crying!” “You’re. . .so – foolish, Store-Hans!”[10]
Rolvaag wrote this novel at a time when a shift was taking place in American literary styles. At the end of the nineteenth and beginning of the twentieth centuries Naturalism, with its stress upon determinism and concern with the excesses of an industrial civilization, tended to avoid a close scrutiny of characters as individuals and concerned itself rather with the impersonal forces that shaped them. By the time Giants in the Earth was being written (1923-24), a change had already occurred in the literary approach to character, “a shift from the sociological to the psychological. It is no longer the world of objective fact that obtrudes as the significant reality, but the subtler world of emotional experience.” While one is constantly aware of the prairie as a force, the focus of attention is directed more upon the moods and emotions of the characters. It is through this depiction of the emotional reactions to the pioneer experience that Rolvaag transcends mere documentary. As V. L. Parrington has written,
With the growth of a maturer realism we are beginning to understand how great was the price exacted by the frontier; and it is because Giants in the Earth, for the first time in our fiction, evaluates adequately the settlement in terms of emotion, because it penetrates to the secret inner life of men and women who undertook the heavy work of subduing the wilderness, that it is – quite apart from all artistic values – a great historical document.[11]
Rolvaag wrote all three novels of the trilogy in Norwegian, in America. He was not a pioneer himself, having arrived in the United States in 1896 and worked on his uncle’s farm in South Dakota for three years. However, he did possess a knowledge of the psychology of ordinary Norwegian farmers. He understood, also, what it meant to be an immigrant – artistic imagination could provide the rest. Thus, despite the fact that Giants in the Earth is set in the 1870s and early 80s, almost twenty years before Rolvaag set foot on American soil, Julius Olson’s comment is still valid: “For a novel of our pioneers may, if well done, present the history of our pioneers in a nutshell. It will be history which never happened quite as it is presented, but which is essentially true nevertheless.”[12]
In the later works of their respective trilogies, neither Rolvaag nor Moberg produced novels of literary quality equal to Giants in the Earth or The Emigrants. In Peder Victorious (1929) and Their Fathers’ God (1931), where Rolvaag is concerned with the generational, cultural, religious, and social issues confronting an established community, the quality of description, characterization, and dramatic tension is not as impressive as that in Giants in the Earth. In Peder Victorious, Beret’s desire to preserve her son’s “Norwegianness” founders as Peder becomes more and more American in speech and customs. This pseudo-sociological approach, incorporating inadequately fictionalized materials, continues in Their Fathers’ God, where the conflicts are religious and social, with the negative results of assimilation being stressed. One of Rolvaag’s basic beliefs was that a strong ethnic culture made for a strong and stable America, and he worked all his life to preserve Norwegian culture in America.
Moberg is not so concerned with issues of assimilation. In Unto a Good Land (1954) and The Last Letter Home (1961), the main themes revolve about the problems of arriving at the place of settlement, the settlement itself, and the growth of prosperity. Half of Unto a Good Land is concerned with inland travel in America, presented through a great deal of detail; the other half describes the place and process of settlement. The continent itself provides the unifying factor: its vastness and the effect of this upon the consciousness of the travellers contrasts with a focus upon one spot. The Last Letter Home traces the story of the settlers for almost forty years, ending in 1890 with the death of the protagonist, Karl Oskar, who expresses satisfaction in his emigration. Although both novels contain individual scenes which possess a high degree of dramatic tension, neither is as satisfying as The Emigrants.
In this trilogy, when Moberg’s style works well, as it often does, the sharpness of characterization is not blurred by the documentation which fills all three works. The main characters are presented vividly in thought, action and language, Moberg’s commitment to an overall realistic approach remaining strong throughout. The trilogy stands as a major statement of the Swedish immigrant experience in America, and along with Rolvaag’s treatment of Norwegian settlers, graphically depicts the opening and settlement of the American frontier by these two important groups.
Lesser Lights – and Willa Cather
The issue of assimilation is one which occupied a number of Scandinavian writers besides Rolvaag. Hjalmar Hjorth Boyesen and Hans Mattson both believed, unlike Rolvaag, that the best course for their respective peoples was complete assimilation into American society. Boyesen was Norwegian and had emigrated to the United States in 1869. His novel Falconberg (1878) is set almost entirely in America and is concerned with immigrants in the Norwegian settlement at Pine Ridge, Wisconsin (called “Hardanger” in the novel). Through the protagonist, Einer Falconberg, Boyesen states his attitude toward Americanization:
Here in this wondrous land a new and great people is being born; a new and great civilization, superior to any the world has ever seen, is in the process of formation. It would be a foolish and ineffectual labor if we were to cling to our inherited language and traditional prejudices, and endeavor to remain a small isolated tribe.[13]
The similarity to Crevecoeur’s viewpoint, expressed in the epigraph, is clear: Americanization demands the giving up of Old Country traditions for the benefits of being completely involved in the creation of a new society. That this ‘melting’ process did not take place as thoroughly as Crevecoeur thought it would or as Boyesen desired does not lessen the forcefulness of their sense of the nature of the American experiment: the belief that the American was a man with new principles and ideas who would have a profound effect upon the rest of the world.
This exhortation to assimilate can also be seen in Hans Mattson’s autobiography, Reminiscences: The Story of an Emigrant, published in English in 1892. Mattson traces his life from his childhood in Sweden through a spectacularly successful career in America, where he became not only a Union colonel in the Civil War but Consul General of the United States in India. He states at the outset that he “owes whatever he has accomplished in life to the opportunities offered by the free institutions of this country.” While full of praise for America, he retains some affection for his birthplace, “its people and institutions.” But the attitude of the Swedish aristocracy toward the common people and work appals him, and he feels that it is the respect for what an individual can do, rather than for his background, that makes America great. Citing experiences from his travels in many countries, he concludes that “the only desirable immigrants to this country are those who cease to be foreigners, and merge right into the American nation.”[14]
The style of the two works is very different. Boyesen’s is stilted and literary, tricked out with allusions and references in Latin, Greek, and Italian which seem out of place in the wilderness of “Hardanger.” By describing events in his life in simple, non-figurative prose, Mattson, on the other hand, creates a heightened sense of reality. In spite of the selection of incidents which any autobiographer makes, Mattson’s account illustrates how historical events can be personalized and humanized through autobiography.
Like Mattson, Boyesen was an immigrant. He lived in and studied Norwegian communities in the Midwest, and through many stories, poems and the novel Falconberg he attempted to present the Scandinavian to the American reader through discussion of the scenery, traditions, and characteristics of Scandinavia, and of what immigration meant to both the new and the old country and to the individual Scandinavian himself. Despite its stylistic weaknesses, Falconberg does illustrate some of the problems faced by Norwegian pioneers. Boyesen was “the first Norwegian to use English successfully as a literary language, and to find a place in American literature.”[15] Both he and Mattson, like Rolvaag and Moberg, depict in their writing the struggles, both physical and emotional, of Scandinavian pioneers trying to come to grips with the New World.
Willa Cather’s insights into the physical and psychological problems faced by pioneers and their offspring, and her ability to create effective novels from these insights, justify her inclusion here despite the fact that she was neither a pioneer nor an immigrant. In O Pioneers! (1913), The Song of the Lark (1915) and My Antonia (1918) can be seen the influence of a decade spent in a rural community in Nebraska, which had been settled primarily by Norwegians and Bohemians. When she arrived in Nebraska in 1883, at the age of ten, her family was surprised to discover that “native-born Americans were in the minority on the divide… . During the 1870s Nebraska grew by 310 percent, and it was no accident that foreign settlement comprised the majority of this growth. In fact, twenty-three nations were represented among the purchasers of Burlington land.”[16] Her first novel concerned with pioneers (the second novel she wrote) contains references to Swedes, Norwegians, Germans, Russians, Bohemians, and Frenchmen.
O Pioneers! (the title borrowed from Walt Whitman’s paean to the settlers of the American frontier) and The Song of the Lark both concern courageous women, but stress very different aspects of the frontier experience. In O Pioneers! Alexandra Bergson dreams of the world beyond the farm but is devoted to the land, which provides the major unifying factor in the novel. Cather presents her “as a kind of Earth Mother or Corn Goddess, a Ceres who presides over the fruitful land, symbol of the success of the pioneers in taming the reluctant but immensely promising soil.”[17] She occupies a place as central as that of Per Hansa or Karl Oskar, both of whom are forward-looking and, along with a most necessary sense of practicality, can view the land in a non-materialistic and spiritual sense.
By contrast, Thea Kronborg, in The Song of the Lark, like Alexandra a second-generation Swedish-American, is most concerned with leaving the small Colorado town where she was raised, to achieve artistic success in the wider world as a singer. In the preface Cather states that she had originally intended to call the novel Artist’s Youth, to “tell of an artist’s awakening and struggle; her floundering escape from a smug, domestic, self-satisfied provincial world of utter ignorance.”[18] This attitude parallels that expressed in Sinclair Lewis’ novel Main Street (1920), where the Midwest has become a place of limited vision, materialism, and conventional morality. Rolvaag’s and Moberg’s heroic male pioneers have no place in The Song of the Lark and Main Street; in O Pioneers!, however, Cather shows how this heroism can be carried on by a woman.
In My Antonia (1918) Cather presents the quintessential pioneer woman. Antonia Shimerda was born in Bohemia and arrived in Nebraska as a young girl. She develops the ability to overcome hardships and retain her sense of life’s worth. Not only does she represent the prairie country to Cather and Jim Burden, the narrator, but “She lent herself to immemorial human attitudes which we recognize by instinct as universal and true. . . . She was a rich mine of life, like the founders of early races.”[19] Like Alexandra Bergson and Thea Kronborg, Antonia must overcome many difficulties. The main difference between her and the previous two heroines is that while they are working toward a specific career goal – Alexandra, the prosperity of her farm; Thea, artistic achievement – Antonia is concerned just with living fully.
H.L. Mencken claimed to “know of no novel that makes the remote folk of the western prairies more real than My Antonia makes them, and I know of none that makes them seem better worth knowing.” And Granville Hicks pays a similar tribute to her blend of observation and imaginative skill when he reminds us that, “After all, Miss Cather saw at first hand the Nebraska of the eighties and nineties, and her accounts of the life there are not without authenticity. However much she emphasizes the heroism and piety of the pioneers, she does not neglect the hardships and sacrifices.”[20]
An important difference between My Antonia and the other two novels lies in the choice of a different point of view. Cather tells the story through the eyes of Jim Burden, thus distancing herself from the material. We see Antonia in terms of what she means to Jim: as someone with whom he grew up and who, after he has left her for a larger world, remains imprinted on his memory. For the reader, Jim Burden’s life in the larger world only serves to heighten the admiration felt for the one who stayed at home, for Antonia’s struggle for a place in life and her ultimate discovery of the right niche for her – rearing a large family on a prairie farm.
Stylistically, My Antonia marks an improvement over The Song of the Lark in that Cather does not indulge in the large-scale use of unimpressive detail which she used to depict the various stages of Thea Kronborg’s career in the earlier novel. Similarly, My Antonia exhibits a greater literary skill than O Pioneers! in which “there is much description and elucidation of character; in My Antonia comparatively little, the people being so solidly set before us that little is needed.”[21] All three novels contain, however, strong characterizations, and all three present heroines who symbolize the drive and determination of the best of the pioneers. Similarly, all three novels condemn the materialism of those individuals who have lost the ability to dream and who desire only a sterile conformity which Cather sees as greatly at odds with the spirit which brought the first generation from the Old to the New World.
Although there were great differences between the experiences of the pioneers on the frontier and those of immigrants in cities, there were important similarities as well. Immigrants who settled in both environments had been uprooted from the Old World and were determined to succeed in the New – something which many in both areas failed to do. As we shall see, the struggle for survival in the cities as presented in autobiography and fiction may have lacked some of the romance of the struggle of the pioneers on the frontier; however, dreams existed in both environments, as did courage and the fortitude necessary to overcome often hostile surroundings. City immigrants had a ‘frontier’ of their own with which to contend.
3: The City and the Jewish Tradition
Mass immigration to the American city coincided with – and largely contributed to – a massive growth in the size of cities. Immigrants settled in the great cities because of the many opportunities they offered, not least the multitude of unskilled jobs provided by the industrial revolution of the age. However, the rapid growth of the city brought with it many difficulties: economic life became more structured and impersonal, and social divisions widened as rich and poor came to inhabit different districts of the city. The uncertainties of the new industrial order were compounded by the worst consequences of rapid urban growth – low wages, appalling housing conditions, bad sanitation, and high levels of morbidity and mortality amid the squalor – and the immigrants, themselves coming mainly from peasant communities, suffered the hardship, degradation and disillusionment undergone by many migrants, American and foreign, who moved into industrial metropolises. In the city, the immigrant experience was complicated by involvement in the fundamental social and economic changes that were transforming America in the late nineteenth and early twentieth centuries.
Irish, Italians and Jews
This experience was shared by many different ethnic groups, and many of them have left some record of it in literature. The Irish were the first to establish themselves strongly in the burgeoning cities, but have not produced a major author to chronicle their experiences. The most skilful Irish-American author to come close is James T. Farrell in his Studs Lonigan trilogy (1932-35), which focuses on second and third-generation immigrants. This impressive naturalistic series, filled with detailed social observation, traces the decline and, finally, the destruction of a young man who, in spite of his parents’ early hopes, gives in to the wrong influences in his urban environment and becomes a brutal barbarian. Some immigrants, like Studs’ housepainter father, do improve themselves socially and economically, but it is too easy for lower-class Irish children in Chicago to adopt twisted versions of the American dream – which, in Studs’ case, finally destroys him.[22]
Like the Irish, the Italians came to the United States in large numbers (over four million in both cases), settled mainly in the great cities, and produced impressive, if minor, novelists interested in the immigrant experience. For example, Guido D’Agostino, born in New York City in 1906, was particularly concerned with the problems of cultural assimilation. In Olives On The Apple Tree (1940), he contrasts Emile, a doctor who wishes to gain acceptance in upper-middle-class society, and Marco, a failure who cannot accept the harsh competitiveness of America. Marco sees Americans as never satisfied, and himself prefers the old Italian warmth and sense of community, “the conversation, the laughing and everything else that makes life worthwhile.” Italians who try to leave their old culture in a headlong rush to become American not only deprive American society of something valuable but cheat themselves:
No more the Italian but a bastard Italian. Quick he forgets everything from the old country to make money and have a car and buy food in cans and become just like the American he is working for. But he does not become the American and he is no more the Italian. Something in the middle – no good for himself and no good for the country. A real bastardo.
Even Emile eventually achieves real peace of mind only when he returns to live with other Italian-Americans.[23]
The best known Italian-American novel is undoubtedly The Godfather (1969), written by Mario Puzo, who was born in New York City in 1920. However, his skilful and dramatic novel, The Fortunate Pilgrim (1964), is more revealing, for it deals with the problems of a first-generation immigrant family in the 1930s. The family is held together by a powerful, domineering mother, Lucia Santa, a matriarch determined to set her children on the road to a successful life in the United States. Around them are irresistible pressures towards Americanization, and a society that rewards unethical practices; as Lucia Santa says of one local Italian family who are successful criminals, “What animals. And yet when they have money they dare look everyone in the eye.” Many characters are culturally adrift, and the combination of new values and the depression becomes too much for them. Lucia Santa insists that the family stay together and provide mutual support through all their difficulties; if the family can endure, then there are hopes her children will do something more than just survive and improve themselves materially: they may find other values to respect.[24]
Though such problems of adjustment to a new society and culture have been common to all urban immigrant groups, there can be no doubt that Jewish writers have done more than those of any other group to transform that experience into literature. Indeed, with the award of Nobel prizes for Literature to Saul Bellow in 1976 and Isaac Bashevis Singer in 1978, Jewish-American writing has taken its place on a world stage. Of course, by no means all the works, and in some cases very few indeed, of the major Jewish-American authors deal specifically with the immigrant experience. But it is true to say that much Jewish-American literature contains the essence of the immigrant experience in its stress upon the individual at odds with the values of the world in which he finds himself. The success of this literature rests upon its awareness that life is difficult and problematical for all men and has always been so. In the twentieth century, however, there are the added strains of alienation related to the pressures of an urban, fragmented society in which man is very small indeed and in which the old sureties of widely accepted value systems no longer exist. Abraham Chapman comments that Jewish-American writers present

an underlying attitude toward life that derives somehow from the core of the Jewish experience: learning how to live and cope with the continuous expectation of uncertainty, contradictions, the unpredictable, the unanticipated and the unfathomable, with the realization that adversity, trouble, grief, and sorrows, all embodied in the Yiddish word tsuris, are the normal conditions of life. Calamities are not the end of the world but realities in the struggle for survival.[25]

Despite this rather pessimistic description, this literature also contains a strongly optimistic note, a feeling that while the world may be a difficult place, man’s task is to attempt to understand his role in it and, more importantly, to have compassion for the difficulties of his fellow men who are in the same position as he. Perhaps this attitude is best summed up in a remark attributed to the former Israeli prime minister Golda Meir: “Pessimism is a luxury which a Jew has never been able to afford.”
The majority of Jews in America today are descended from poor and religiously orthodox Russian and East European Jews who fled from harshly restrictive laws and pogroms after 1880 and settled in American cities, particularly New York. They had been preceded by much smaller groups of more cosmopolitan German-speaking Jews, who began arriving around 1836, and by Portuguese and Spanish Jews, who began entering America after 1654. Large numbers of these German-speaking and ‘Sephardic’ Jews were traders, with many of the former moving around the country and advancing from peddling to the ownership of large department stores or becoming financiers and entrepreneurs. However, as they comprised the largest migration of Jews in history, it is the Russian and East European Jews who provided most of the inspiration and the authors for the development of Jewish-American literature.
The Earlier Jewish-American Writers
One touchingly expressive member of the “huddled masses” was Mary Antin, who arrived in Boston in 1891 from Polotzk, in the Russian Pale of Settlement. Her autobiography The Promised Land (1912) is a moving document of the significance of America to one who experienced at first hand the persecution of Jews in Russia and their freedom in the United States. Jews could not leave the area known as the “Pale” in order to live in the rest of Russia – they were, in fact, prisoners in a part of the country not large enough to provide a living for them. They were taxed far beyond the level of their gentile neighbours and ran the risk of being forced into the Czar’s army and of being forcibly baptized. Antin herself was resigned to being spat upon by a gentile boy, since there was nothing she could do about it. Within this “exile,” however, she notes that “A poor scholar would be preferred in the marriage market to a rich ignoramus. In the phrase of our grandmothers, a boy stuffed with learning was worth more than a girl stuffed with bank notes.” This devotion to learning was the most important asset which the Jews brought to the New World, for it assured their success in a free society. For her father, education for his children was “the essence of American opportunity, the treasure that no thief would touch, not even misfortune or poverty.”[26]
Shortly after arriving in the United States, Mary is sent to the free public school. It opens up an entirely new world to her and instils in her a profound love for America. She even wins praise for a poem about George Washington, which is printed in a local newspaper to the elation of her parents. The family’s Americanization rapidly accelerates, her father relinquishing orthodox religious practices as he feels they will hamper the assimilation process. The solid base of accepted values is removed, and the parents learn from the children what American mores are, even as their own authority as parents is undermined. Perhaps the price of Americanization is too high, but Antin believes that even immigrant parents found some joy in the process as they saw their children becoming real Americans.
What America meant to her, Mary Antin thought it also meant to immigrant Jews as a whole. In her Introduction, she wrote:
Although I have written a genuine personal memoir, I believe that its chief interest lies in the fact that it is illustrative of scores of unwritten lives. I am only one of many whose fate it has been to live a page of modern history. We are the strands of the cable that binds the Old World to the New.[27]
Like Hans Mattson, Mary Antin saw herself as a representative of her people. Both authors saw the countries in which they were born as places of injustice and America as a truly ‘promised land.’ The greater harshness which Antin experienced in Russia may help explain the more emotional approach which she takes in her autobiography as compared to Mattson’s “plain recital” of his life. America meant opportunity to Mattson; it meant life itself to Antin along with opportunity. Both were successes in the United States: the many failures by definition tended not to get their autobiographies published!
A somewhat less affirmative attitude was expressed by Abraham Cahan, who arrived in America from Vilna in 1882 and, in that year, gave the first socialist lecture in Yiddish in the United States. He had an enormous influence upon Russian and East European Jews through his editorship of The Jewish Daily Forward, which became the most important of many Yiddish language newspapers in the United States and provided an important aid to the immigrant Jewish population in its attempt to understand and become a part of the new country. While Cahan’s short stories [28] highlight the problems of the immigrant and depict the process of Americanization, his major work is undoubtedly The Rise of David Levinsky (1917), a classic novel of the Jewish immigrant experience.
Like Mary Antin, Cahan describes the harsh conditions under which Jews lived in Russia, in this case in the fictional town of Antomir. David Levinsky’s mother is murdered by anti-Semites, and the period around 1881, with its virulent anti-Jewishness, is described, as is the great poverty of most Jews. Levinsky is a budding Talmudic scholar but is attracted by secular books and un-Jewish thoughts even before arriving in America. When he considers going to America, he is told that it is a country in which Jews cease to be observant of the commandments, but he believes that it is possible to be a good Jew there. This is shown, in his case, to be a vain hope. Levinsky becomes a clothing manufacturer, employs cheap labour to increase his profits, and avoids union regulations. He is violently anti-socialist, and little survives in the wealthy cloak manufacturer of the poor scholar from Europe, who at least had tried to preserve his sense of ethics.
Cahan saw himself as a Realist and consciously dedicated himself and his work to the overthrow of the romantic “Genteel Tradition.” In Cahan’s opinion, American capitalism had created an unjust and corrupt society. He therefore believed that the best way to further the ends of socialism was to depict society as it was. In 1917, when The Rise of David Levinsky was published, literature of a sentimental nature was still enormously popular. Despite the writings of the Naturalists the American public was not interested in fiction which presented the real problems of an industrial society.
The Rise of David Levinsky was based upon a series of articles entitled “The Autobiography of an American Jew,” which Cahan published in McClure’s Magazine. Jules Chametzky comments that Cahan knew well the type of businessman portrayed in Levinsky; he adds: “That book, as John Higham perceptively notes, combines the distinctly American theme of success with a Jewish subject-matter and a Russian artistic sensibility. Into it Cahan put all of his rich experience, all he had learned about life and writing.”[29] Unfortunately, Cahan’s style remains flat and often unconvincing because Levinsky, his narrator, never manages to communicate a sense of true emotional reaction even when we suspect that he must be feeling deeply. He relates rather than dramatizes, in a rather distant, rhetorical style.
The comments of Hector St. John de Crevecoeur, relevant as always to the American immigrant experience, certainly apply to David Levinsky in particular: Crevecoeur talks about “the new man” who leaves “behind him all his ancient prejudices and manners” in order to receive “new ones from the new mode of life he has embraced.” In achieving the language and customs of the new land, Levinsky gives up, chiefly, his mother-tongue and the older Jewish values.[30]
In The Rise of David Levinsky, Abraham Cahan depicts not only the intense love for America which those immigrants who are successful there feel but also the sense of loss which many of them, whether or not they are successful, experience. The final paragraph of the novel presents David Levinsky bemoaning his lost past: “I cannot escape from my old self. My past and my present do not comport well. David, the poor lad swinging over a Talmud volume at the Preacher’s Synagogue, seems to have more in common with my inner identity than David Levinsky, the well-known cloak-manufacturer.”[31] We must take Cahan’s word for this as he never depicts this state of mind through Levinsky’s actions. Nonetheless, the novel provides both a fascinating history of the New York garment industry in the decades around the turn of the century and a realistic account of the Americanization of a Russian Jew.
By contrast, it was the barriers to assimilation that impressed Ludwig Lewisohn, who was born in Germany and arrived in the United States in 1890, at the age of seven. German Jews had been the most assimilated Jews in Europe, developing a form of worship, Reform Judaism, which had eliminated those differences in dress, liturgy, and language which so separated Jews visibly from their gentile neighbours in some other European countries. They were modernists rather than traditionalists. Lewisohn experienced a deep sense of frustration upon discovering that subtle restrictions and social attitudes against Jews existed in the United States. Disenchanted with the possibilities for assimilation, he became convinced that the only way in which Jews could remain well-adjusted individuals was to return to a sense of Jewish peoplehood. He wrote of his disappointment in America in Up Stream (1922) and to greater effect in his best work, The Island Within (1928).
The Island Within is concerned with the psychological effects of anti-Jewishness on first and second-generation American Jews. Although the first fifth of the book is set in Europe, it is merely preparation for the issue of assimilation which occurs in the main body of the work. The protagonist is Arthur Levy, who assumes that he is as American as the next fellow, but an awareness of his Jewish origins is slowly impressed upon him from without. However, because he has not been brought up as a Jew, he does not know how Jews behave, and lacking any real identity he loses all confidence in himself. He wonders, “How was it that, before they went to school, always and always, as far back as the awakening of consciousness, the children knew that they were Jews?… There was in the house no visible symbol of religion and of race.” He wonders how both he and his sister have developed the awareness of being Jews, but the answer is not difficult to find – non-Jews have not let them forget who they are.
Arthur begins to question the nature of Jewishness, as in conversation with his friend:
“Can we be less Jewish than we are?” Arthur asked. “Isn’t it only that we’re not honest about it? What is specifically Jewish about you and me?” Joe’s eyes were suddenly veiled by their deepest melancholy. “I don’t know. I’ll be damned if I know. And yet – Oh, for Christ’s sake let’s talk about something else. I’m sick of it.”
He looks at his father and notices how Jewish he looks, then realizes how absurd this is: “Fancy an Irish-American boy saying to himself: How Irish my father looks!” Thus, from its not being an issue, Arthur’s Jewishness becomes the central factor in his life in his attempts to come to grips with American ideas of equality for all. He starts moving more and more toward the idea that Jews must preserve their sense of peoplehood if they are to remain whole and not become self-hating through doomed attempts at gaining access to gentile society. Ironically, he finally appreciates his father’s point of view that people should stay with their own kind, that there is a limit to how far Jews can assimilate into American society.[32]
The period in which the major portion of this novel is set, the late nineteenth and early twentieth centuries, was a time when there was resistance to the rise of Jews in American society, first the German and then the East European and Russian. Lewisohn experienced great difficulty in obtaining a university post, simply because he was Jewish. Much of The Island Within is semi-autobiographical and traces Lewisohn’s own increasing concern with Jewish consciousness. Like Lewisohn, other second-generation American Jews continued to experience discrimination and wrote from their preoccupation with the marginal status they felt their Jewishness conferred on them; but they also developed larger, more socially aware concerns.
The poverty and hardship on the Lower East Side of Manhattan were extreme, as many fine photographs demonstrate. Not that the Jews living there were unaccustomed to poverty, but in Europe they had had cohesive communities and a set of shared beliefs which had helped to cushion their problems; in America the cohesiveness of the communities was nothing like it had been in Europe, and orthodox beliefs tended to disappear under the pressures of earning a living and of Americanization. One not uncommon reaction of second-generation Jews, especially in New York and Chicago, was to turn to a new orthodoxy to help them explain and confront their situation: that of the Communist Party. Jews made a significant and disproportionate contribution to its leadership and activists in general; they also served as many of its most articulate spokesmen and ideologues, using social realist fiction for political purposes. One of the best examples of this genre – and a vivid description of the seamy side of immigrant East Side life – is found in Michael Gold’s autobiographical novel Jews Without Money (1930).
Gold was born Irwin Granich, in 1893, of Roumanian Jewish parents, on the Lower East Side. To Gold, Jewish messianism came to mean communism, and he became editor of The New Masses in 1928. Jews Without Money depicts the experiences which led Gold along this path. He points out that pious Jews had to tolerate the prostitutes and sweat-shops because they had no choice. The severe overcrowding of a big city ghetto made any waste bit of land a valuable playground, despite the possible presence of “perverts, cokefiends, kidnapers [sic], firebugs, Jack the Rippers.” The East River provided swimming facilities:
Our East River is a sun-spangled open sewer running with oily scum and garbage. It should be underground, like a sewer . . . . Often while swimming I had to push dead swollen dogs and vegetables from my face. In our set it was considered humor to slyly paddle ordure at another boy when he was swimming.
In this sort of environment, there were “Thousands of tuberculars and paralytics; a vast anemia and hunger; a world of feebleness and of stomachs, livers, and lungs rotting away. Babies groaning and dying in thousands: insomnia – worry.” Gold presents anti-union employers such as David Levinsky as the arch-enemies of humanity. A young doctor tells a patient that no medicine can cure him: “You slave too many hours in your lousy sweatshop; you need food and rest, brother. That’s what’s wrong with you! Join a labour union.”
The doctor’s advice is the sort that Gold took to heart, and then took further. It was not just the formation of unions which was important, but the transformation of the entire capitalist system. Gold recounts his father’s bitterness at his own failure to succeed: there is no gold in the streets; he has to work extremely hard just to make a living and comes to believe that money is everything in America. “It is all useless. A curse on Columbus! A curse on America, the thief! It is a land where the lice make fortunes, and the good men starve!”[33]
The book was the first important work of proletarian literature in the United States, a piece of social protest which at the very end provided an answer to the East Side’s problems in a workers’ revolution. It is not a great work of art, being more of a collection of remembered scenes interlaced with authorial comment. But it certainly possesses the ring of truth to observed reality, and it reflected contemporary Jewish concern with problems of class and status. Jews in the thirties, for all their involvement in proletarian problems, did produce a number of works that transcend a particular time and achieve a true artistic success – most notably, Henry Roth’s Call it Sleep (1934).
In Leslie Fiedler’s opinion, Call it Sleep is “the best single book by a Jew about Jewishness written by an American, certainly through the thirties and perhaps ever.”[34] While one can argue about whether the novel is concerned with ‘Jewishness’ as such, its artistic quality is unquestionable. Like Jews Without Money, Call it Sleep is concerned with poor immigrant Jews, but it would not be entirely correct to refer to it as a “proletarian novel.” It is, if anything, a psychological novel about a six- to eight-year-old boy’s struggle to cope with his father and with a new environment. It was strongly criticized by the Communist press for not being committed enough to social revolution, but that may well be the reason why it achieved the artistry it did: Roth created a work of art, not a polemic. He later found himself pulled in different directions – art for art’s sake or literature as a means to further social justice – and he has not yet completed another novel.
The novel is semi-autobiographical and presents the immigrant experience as an important element in David Shearl’s psychic problems. In the Prologue, Roth describes the arrival of David and his mother in America. David is dressed in his European clothes, and his father cannot bear to be thought a “greenhorn,” to be mocked. He throws the boy’s blue straw hat into the bay, symbolically throwing off a European attachment. The entire arrival scene is presented not with hopefulness but with despondency. Genya comments that “this is the Golden Land,” but Albert just grunts and says nothing. Even the Statue of Liberty is presented as a frightening object.
They go to the Brownsville section of Brooklyn to live, later moving to Manhattan’s Lower East Side. Henry Roth himself was born in Galicia in 1906 and, like David Shearl, came to the United States with his mother in order to join his father. As an immigrant child, Roth enjoyed a certain amount of security in the Jewish Lower East Side, but lost it when the family moved to Harlem and he had to face Irish anti-Semitism. He says that he changed his depiction of the East Side in Call it Sleep to one far more negative than he had, in fact, found it: “Call it Sleep is set in the East Side, but it violates the truth about what the East Side was like back then…. In reality, I took the violent environment of Harlem – where we lived from 1914 to 1928 – and projected it back onto the East Side.” For David, the streets of the East Side are threatening because of gangs of gentile youths, and his apartment dangerous because of his unpredictable, paranoid father, who is not certain that David is his own child. The ‘truth’ of Call it Sleep is, as in Willa Cather’s depiction of immigrants in Nebraska, an artistic truth. Roth said later, “I was working with characters, situations and events that had in part been taken from life, but which I molded to give expression to what was oppressing me. To a considerable extent I was drawing on the unconscious to give shape to remembered reality.”[35]
Roth presents the tensions between the various cultures. A Jew and an Italian are seen hurling epithets at each other; gentile gangs attack David because he is a Jew. One also sees the loss of religious life amongst the second-generation Jews, who grow up on the streets and have no use for the cheder [Hebrew school]. One cannot say that Albert Shearl’s problems are due solely to his uprooting and the problems inherent in adjusting to a new society. Albert’s problems are more deeply embedded. But certainly part of his difficulty lies in the necessity of coming to grips with a new urban society, which makes far different demands on him than did his native Austria, where he lived in the countryside.
Bertha, Genya’s sister, has mixed reactions to America. On the one hand, she loves the clothes and the excitement. Albert points out that America requires more effort than it is worth, but Bertha comments, in her usual colourful language: “True I work like a horse and I stink like one with my own sweat. But there’s life here, isn’t there! There’s a stir here always… . Veljish [back in Austria] was still as a fart in company. Who could endure it?” Yet, at other times she bemoans her fate: “Why did I ever set foot on this stinking land? Why did I ever come here? Ten hours a day in a smothering shop – paper flowers! Rag flowers! Ten long hours, afraid to pee too often because the foreman might think I was shirking.” America provided an escape from persecution in Europe, but for most first-generation Jewish immigrants the streets were not paved with gold.
The urban immigrant experience was made much harsher for those whose native language was not English. Until he could understand English and make himself understood, an immigrant was at a severe disadvantage in the New World. Neither Genya nor Bertha can speak English, only Yiddish. Albert’s English is extremely limited, as can be seen when he tries to communicate with a policeman at the end of the novel, “My sawn. Mine. Yes. Awld eight. Eight en’ – en’ vun mawnt’. He vas bawn in -.” David speaks Yiddish in the house and a mixture of broken English with Yiddish inflections in the street. The speech patterns of the gentile children in the streets and of the neighbours are also suggested. Influenced by James Joyce, Roth also uses stream-of-consciousness, which he presents through a choppy sentence structure. He shows great skill in his presentation of these various dialects and in his general command of literary technique.[36]
Call it Sleep is a major American novel both for its artistic quality and for its sensitivity in presenting the fears and problems of a Jewish immigrant child. In this way the novel marked a move away from social concern and group awareness towards an obsession with the self, with the problems of the individual surrounded by an unfriendly, impersonal society to which he or she has difficulty relating. This was to be a major feature of Jewish-American writing after World War II, though there have remained some writers concerned above all with essentially Jewish problems.
The Jewish-American Novel since 1945
Chaim Potok is that rarity amongst Jewish-American authors, a writer who sees orthodox Judaism as relevant and necessary in the twentieth century. He is a rabbi, and his five novels to date each stress moral and religious themes. His first and arguably best novel, The Chosen (1967), is concerned with Russian and East European Hasidic Jews, most of whom are immigrants. Potok reflects reality when he presents this group as not being interested in assimilating into American society. Quite the contrary: their goal is to recreate in America the religious and social forms which had existed in Europe. Unlike those individuals, Jewish or not, who began assimilating into American society as soon as possible, the Hasidic characters in The Chosen do their best to isolate themselves, in Brooklyn, from the rest of America. Indeed, the basic conflict in the novel revolves about whether Danny Saunders, the genius son of the sage-like leader of a particular Hasidic group, will refuse to take on the inherited role of “rebbe” (or rabbi) after his father or leave the group and move out into the non-Hasidic world.
Potok’s style combines weak dialogue, in which he fails to individualize his characters, with intrusive though interesting accounts of Jewish history and religious lore. The Chosen, like his other novels, works because of his story-telling ability and the fact that his characters are still interesting as people despite the technical weakness in his presentation of them. Also, they fulfil certain concepts inherent in the American dream – concepts which are at the root of the immigrants’ ideal America in that the story is played out by an improbable but possible “only in America” cast of Hasidic and orthodox Jews, who demonstrate that people can still make good through hard work, and that severe difficulties can be overcome by pluck, integrity, and dedication. At the story’s end the novel’s two young heroes are about to realize the reward they have earned: a limitless future. In sum, The Chosen can be interpreted from this standpoint as an assertion of peculiarly American optimism and social idealism. Very simply, it says Yes.[37]
As a novel, then, The Chosen is not entirely successful, but as a social document, a work which encapsulates some of the problems of an immigrant group having beliefs and practices which are very different from those of the majority society and battling with the problem of assimilation, the book has much to recommend it. Potok seeks to demonstrate that this group’s values fit them for success in America, in that both the extreme and somewhat less extreme levels of Jewish orthodoxy inculcate honesty, loyalty and respect for hard work and learning. Thus, the novel presents an ideological defence of the relevance of orthodox Judaism to modern, secular American life as well as an accurate picture of the physical and social aspects of a particular group of people in a particular place and time.
In The Chosen Potok stresses religious and cultural reasons for his characters’ difficulties in feeling at peace in the American environment; but for most post-Second World War Jewish writers there is another experience, besides immigration, which lends them a critical distance from the everyday routines of American life: the Holocaust. In Isaac Bashevis Singer’s novel Enemies: A Love Story (1972) and Edward Lewis Wallant’s The Pawnbroker (1961), the protagonists’ difficulties stem from their being haunted by memories of Nazi persecution. The heroes are both Jewish immigrants among the 150,000 who managed to drag themselves to America after World War II with little more baggage than their lives. Both are haunted men for whom America means not so much freedom as a place where they can be left alone to try to forget – unsuccessfully. Singer’s Herman Broder was never in a concentration camp, but spent the war years in Europe hiding and being hunted. He walks about Brooklyn remembering the hayloft in Lipsk where he eluded the Nazis, but he is still looking for hiding places in case the Nazis come to Brooklyn to hunt him there. His frenetic sexual life confirms his inability to form a human relationship. Wallant’s Sol Nazerman has succeeded in stifling all human emotion in his pawnshop in Harlem. But as the anniversary of the death of his family in the camps approaches, he finds that memories which he thought he had thoroughly repressed begin to come to consciousness again. As in his other three novels, Wallant slowly and agonizingly moves the protagonist to a point where he can rejoin the human race, where he can feel for his suffering fellow man again. In both of these novels the immigrant experience is peculiarly a nightmare; the protagonists live in their memories more than in an actual place called America.
Singer has published a number of short stories which contain characters who are immigrants in the United States. However, because of his orientation to Eastern European myths and Judaic practices, these tales are often very similar in characterization, dialogue, and atmosphere to those set in shtetls (or Jewish villages) in Europe. When he uses an American setting, the tales are rarely as satisfying as those with an Eastern European milieu. Singer is a brilliant writer, but his best work does not concern the immigrant experience in America; it is rooted in a European context.[38]
Unlike Singer, the major writers of Jewish-American literature since World War II – Saul Bellow, Bernard Malamud, and Philip Roth – are primarily concerned with presenting various aspects of American life. Though none can be said to be overwhelmingly concerned with the immigrant experience, each presents aspects of it in various novels and short stories. It is, of course, very much the case that the further away from the era of large-scale immigration one moves, the less direct concern there is with the immigrant and the more with the problems of ethnicity in later generations. These problems, however, are not dissimilar from those faced by newly arrived immigrants in terms of Jewish identity and assimilation. Most often the problems of physical survival – jobs and money – have been solved, but the problem of the nature of an individual’s Jewishness in a society which grants equality of opportunity has not.
In Saul Bellow’s novel Mr. Sammler’s Planet (1970), the protagonist has, like Sol Nazerman and Herman Broder, escaped from the Nazis – but only barely. Artur Sammler, a Polish Jew, has dragged himself out of a mass grave. Like Herman Broder, he spent the war years in hiding, though he was also a member of a guerrilla band for a time. Before the war he was a journalist in London. Now he is living in New York City, and unlike Sol Nazerman his experiences have not made him indifferent to the society around him – quite the contrary. Sammler delivers a series of criticisms of American society in the 1960s which stem directly from his European attitudes and experiences: the memory of civilized order in Bloomsbury and of the civilized world falling apart during the war.
With his one good eye (the other smashed by a rifle butt), he views what he perceives as the decay of Western civilization’s values. He deplores the youth cult, the new liberal intellectual attack upon what he believes to be true values, the lack of restraint, the self-centredness of people, and the lack of respect for the humanity of others:
Like many people who had seen the world collapse once, Mr. Sammler entertained the possibility it might collapse twice. He did not agree with refugee friends that this doom was inevitable, but liberal beliefs did not seem capable of self-defence, and you could smell decay. You could see the suicidal impulses of civilization pushing strongly.
Viewed by most of the younger characters as out of touch with contemporary approaches to important issues, Sammler suffers from extreme culture shock. Bellow, Singer, and Wallant all present protagonists who have survived the Holocaust and arrived as immigrants in America as peculiarly unable to deal with American society. Only Bellow, however, creates a protagonist who is a vibrant social critic – though stylistically the novel is overburdened by interior monologues and “lectures” presenting this criticism – and who also preserves his faith in mankind despite his recognition of all man’s negative aspects: “There are still human qualities. Our weak species fought its fears, our crazy species fought its criminality. We are an animal of genius.”[39]
In his earlier novel, The Victim (1947), Bellow is concerned with responses to immigration on the part of a ‘WASP.’ Kirby Allbee believes not only that Asa Leventhal, a second-generation Jewish American, is responsible for the loss of his job, but that the Jews have taken over New York City from the earlier English Protestant settlers:
“Do you know, one of my ancestors was Governor Winthrop. Governor Winthrop!” His voice vibrated fiercely; there was a repressed laugh in it …. “It’s really as if the children of Caliban were running everything …. The old breeds are out. The streets are named after them. But what are they themselves? Just remnants.”
The rise of the Jews from their ghettos to positions of success in American society was resented by many of the older, more settled Americans. As Ludwig Lewisohn’s The Island Within demonstrated, this resentment often took the form of social exclusion before World War II, though since then the more blatant exclusionary practices against Jews in universities, clubs, hotels, and society at large have disappeared. Allbee is, indeed, a “remnant” – but of pre-World War II ethnocentrism.
Now a tramp, Allbee is desperate to show that his fall is not his own fault, that these relative newcomers cannot fully understand the American culture of which he is so integral a part:
“I saw a book about Thoreau and Emerson by a man named Lipschitz …” “What of it?” “A name like that?” Allbee said this with great earnestness. “After all, it seems to me that people of such background simply couldn’t understand …” “Of all the goddamned nonsense!” shouted Leventhal. “Look, I’ve got things to attend to.”
Although Bellow takes further the issue of how much responsibility any man has for another, the direct implication of these passages is that the established group must find a scapegoat for any loss of power or prestige in order to remove the burden of failure from itself.[40] Thus, the Jews assume their classic role. Being too successful, they bring down upon themselves the wrath of those with whom they are competing.
By contrast, Morris Bober, the protagonist of Bernard Malamud’s novel The Assistant (1957), never learned how to compete successfully in materialistic America. A first-generation Jewish immigrant, he came to America from Russia with great hopes and found himself stuck in a poor grocery store. He bemoans his lost youth and castigates himself for not making the right decisions at the right time. For Morris “America had become too complicated. One man counted for nothing. There were too many stores, depressions, anxieties. What had he escaped to here?” Morris’ problem is that his values clash with those of America. Though he is not a religious Jew, he believes that all men are responsible for the welfare of others; he believes in the Jewish law: “This means to do what is right, to be honest, to be good. This means to other people.” At Morris’ funeral, the rabbi reiterates the truth of the dead grocer’s beliefs. Unfortunately, these are not the values which make for success in the United States, and by American (and his wife’s) standards Morris is a failure.[41]
Walter Shear points out that “In The Assistant two cultures, the Jewish tradition and the American heritage (representing the wisdom of the old world and the practical knowledge of the new), collide and to some degree synthesize to provide a texture of social documentation which is manifested in a realistic aesthetic.”[42] The sense of the Old World is conveyed through the heavy Yiddish inflections of Morris and his wife, shot through with an ironic, humorous tone: “You should sell long ago the store,” she remarked after a minute. “When the store was good, who wanted to sell? After came bad times, who wanted to buy?”
The basis of this style “owes something to the wile of Yiddish folklore, the ambiguous irony of the Jewish joke. Pain twisted into humor twists humor back into pain.”[43] There is a bitter-sweet quality of hope encased in pain in this style, and the ultimate paradox of the novel lies in the fact that Morris’ failure in materialistic America is the cause of his success as a human being. Because he refuses to forsake high standards of honesty and goodness, he cannot succeed in the timeless, depressed neighbourhood where fate has placed him. The novel documents not only the life and death of a good man and the effect for good which he has upon another failure by New World standards, but also the tension between certain Old and New World values and the culture shock which many immigrants had to endure. Most Jewish American fiction would endorse Malamud’s pessimistic conclusion on this score rather than Potok’s exceptionally affirmative attitude.[44]
A man who is not a failure by American standards is Ben Patimkin, in Philip Roth’s novella Goodbye, Columbus (1959). He has left the area of first settlement in Newark, the area to which his European-born parents had come, and made it to wealthy Short Hills. Neil Klugman, the young protagonist, relates some of the history of the place: “The neighborhood had changed: the old Jews like my grandparents had struggled and died, and their offspring had struggled and prospered, and moved further and further west. . . . Now, in fact, the Negroes were making the same migration, following the steps of the Jews.” Unlike Morris Bober, Ben is willing to be a bit of a “thief” in order to succeed – and America rewards him.
The novella concerns a choice amongst values: the materialism of the Patimkins, living on the cool heights of Short Hills, as compared to a somewhat obscure set of non-materialistic, more human values held or aspired to by the protagonist, Neil Klugman. Neil is living with his Aunt Gladys, a first-generation immigrant who has not escaped Newark, but he is sexually drawn to Brenda Patimkin and, initially at least, to her lifestyle. At the very beginning of the novella Roth makes the difference between these two generations clear. The streets of Short Hills are named after eastern colleges, and as he drives through them to meet Brenda, Neil muses:
I thought of my Aunt Gladys and Uncle Max sharing a Mounds bar in the cindery darkness of their alley, on beach chairs, each cool breeze sweet to them as the promise of after life, and after a while I rolled onto the gravel roads of the small park where Brenda was playing tennis.
The differences are great, but the elder Patimkins still retain their old Newark furniture in the attic, an old refrigerator (now full of fruit), and a somewhat moralistic preference for good eaters and strong fabrics in clothes. It is almost as though, as Brenda puts it in talking of her mother, “Money is a waste for her. She doesn’t even know how to enjoy it. She still thinks we live in Newark.”[45]
Although Roth’s depiction of the conflict between the American success ethic and more human values clearly stresses, as did Malamud in The Assistant, the danger of the loss of the more basic values as American materialism takes over, some critics feel that he is dishonest in his presentation of the characters. Certainly the Patimkins and Aunt Gladys, like the assimilated Jews of Woodenton in his story “Eli, the Fanatic,”[46] are not presented as having any real insight into the deeper issues which their actions imply; at times they become mere caricatures. Yet the problem of having to contend with the “swamp of prosperity,” to use Saul Bellow’s phrase, is vividly presented in Goodbye, Columbus. The protagonist finally rejects it, but what, as a third-generation American Jew, he is left with remains in doubt.
These works represent only a small sample of the scores of works which Jewish-Americans have produced. Why have they been so prolific? First, Jews have long been known as “the people of the book,” not only for their devotion to the Old Testament but because of the great value which they have always placed upon sheer literacy, even if only for males and only in Hebrew and Yiddish. The poorest European ghetto might manage to have its Talmud students, who would be supported by the community to study the ancient commentaries of great sages. Wives would work so that their husbands could study the holy books. As Mary Antin’s autobiography showed, scholarship was held in extremely high regard, and it was but a short step from the devotion to religious texts to a similar devotion to secular literature, as can be seen in the experience of David Levinsky. In a broader sense this explains, in part, the great success of Jews entering the professions.
Second, Jews found that they had an audience. Originally it was a Yiddish-speaking one; later it became English-speaking. Jews are great consumers of books (Israel has one of the highest per capita rates of book purchase in the world), and a Jewish writer or journalist writing in New York around the turn of the century (like Abraham Cahan) knew that he had readers. As the United States became more and more urban, non-Jews began finding that some Jewish writers were speaking for them as well as for Jews. Jews were the experts on ‘marginality,’ which became an increasingly important topic in the twentieth century; Jews were experts on urban life and its problems, again becoming cultural spokesmen for all Americans, of whatever ethnic origin, who felt ill at ease in the new urban American culture. Thus, the Jewish writer’s audience grew rapidly.
Other immigrant groups had possessed some interest in literary matters, but none had the combination of centuries of great respect for words and learning, centuries of experience as marginal outsiders, a large segment with long experience in cities, and a fervent messianic hope which could be transmuted into optimism for the future in spite of the nature of the present reality. These elements made the Jews ideal interpreters of the twentieth-century American experience.
4: Literature and Immigrants
The writings over which this survey has glanced offer many realistic glimpses of the manners, the speech, the ethnic prejudices, and the exotic imagination of Scandinavian or Jewish immigrants as they confront, adapt to, succumb to, or are corrupted or strengthened by the American environments, frontier or urban, into which their uprooting had thrown them. These works usually depicted transitional modes of life and even employed (with honourable exceptions like the traditionalist Singer) transitional modes of literary expression, halfway between their own traditions and those of the host society. This is clear in their adaptation of speech patterns and mannerisms from languages other than American English, and their use of imported folk-naivety and folk-wisdom for stereotyping characters. Traditional themes persisted, too, in the form of Christian heroism or Jewish suffering, though, as the mid-twentieth century approached, their characters fitted increasingly well into a noticeably more American landscape.
This literature relating to the immigrant experience, all in all, proved a most welcome injection into the American literary tradition. By the late nineteenth century the traditional currents of American literature no longer represented the mainstream of American experience. The so-called “Genteel Tradition,” America’s version of British Victorianism, did not adequately reflect the main developments of the time – the settling of the Far Western frontier and the industrialization of the cities. Immigrants were at the heart of these processes, and provided an intricacy of social and cultural mores different from those of native Americans. Thus the subject-matter available to authors concerned with writing a peculiarly American literature was increased enormously, and the work of most late nineteenth-century literary Naturalists, whether relating to farm or factory, responded to the plight of people who may not have been presented as immigrants but who were often probably recent arrivals in the United States. Moreover, immigrant writers themselves often consciously worked for a shift in the direction of American culture, as was most obvious in the socially conscious and politically aware young Jewish socialist writers who became so prominent during the depression of the 1930s. Whereas the writers of, say, the expatriate tradition had represented an educated, literate elite, able to travel abroad and experience an older, fuller, more confident culture, the writers who portrayed the immigrant experience marked a ‘proletarianizing’ influence which in some ways reflected more accurately what was happening in the United States itself.[47]
After World War II, however, immigrant literature lost much of its distinctiveness. Most Scandinavian writing by the 1950s seemed to become just another variation of American regional literature, which no doubt reflected the comparative ease with which most North European Protestants were assimilated and began to adopt American cultural perceptions and attitudes. By contrast, American Jews, though they have largely accepted American cultural values, have yet produced major novelists who appear to represent a distinctive tradition. Even so, their preoccupations have ceased to be purely Jewish; instead they have become effective literary spokesmen for educated urban men in contemporary America. There is an obvious compatibility between the witty, sophisticated temperament which marks many modern Jewish fictions and the sense of smart alienation of the bourgeois American intelligentsia – so that the Jewishness of a character like Bellow’s Henderson is not a matter of ethnicity or religion but of outlook and personal tone. Some Jewish-American writers have fused older European forms and perceptions with American literary traditions and cultural experience, and the Jewish literary mode has interpenetrated with the American without becoming indistinguishable. In the process it has made American literature more universal, more accessible to other peoples, more significant for modern educated men everywhere.
5: Guide to Further Reading
There are a number of general works on American immigration, each of which offers a suitable starting point. In American Immigration (Chicago: Chicago UP, 1960), Maldwyn A. Jones neatly surveys all aspects of the subject, while his Destination America (1976) concentrates more upon the various immigrant groups and the problems of assimilation. Philip A.M. Taylor’s The Distant Magnet: European Emigration to the U.S.A. (New York: Harper and Row, 1971) is a carefully documented account of the immigration process from both the European and American vantage points; it is particularly informative about the Atlantic passage, as also is Terry Coleman’s lively Passage to America: A History of Emigrants from Great Britain and Ireland in the Mid-Nineteenth Century (London: Hutchinson, 1972). Oscar Handlin’s The Uprooted: The Epic Story of the Great Migrations that Made the American People (1951; rept., Boston: Little, Brown, 1973) is a minor classic, written in a readable, almost novelistic style, which stresses the difficulties of adjusting to America. More recent writing has drawn a less gloomy picture, as notably in Stephan Thernstrom, The Other Bostonians: Poverty and Progress in the American Metropolis, 1880-1970 (Cambridge, Mass.: Harvard UP, 1973), while Josef J. Barton, Peasants and Strangers: Italians, Rumanians and Slovaks in an American City, 1890-1950 (Cambridge, Mass.: Harvard UP, 1975) splendidly demonstrates the different patterns of assimilation possible even within the same ethnic group in Cleveland, Ohio. John Higham’s Strangers in the Land (1955) remains the most impressive analysis of American reactions to mass immigration.
Leonard Dinnerstein et al.’s useful text, Natives and Strangers: Ethnic Groups and the Building of America (New York: Oxford UP, 1979), shows the contribution of a large number of immigrant groups throughout American history to the country’s economic growth. Carl Wittke’s The Irish in America (Baton Rouge: Louisiana State UP, 1956) and William V. Shannon’s The American Irish (1963; rev. ed., New York: Collier, 1966) deal with the problems faced after arrival. Joseph Lopreato, Italian Americans (New York: Random House, 1970) discusses this group’s development and problems with assimilation within America. Charlotte Erickson, Invisible Immigrants: The Adaptation of English and Scottish Immigrants in Nineteenth-Century America (London: Weidenfeld and Nicolson, 1972) reveals the tensions even British settlers faced in the nineteenth century. Nathan Glazer and Daniel P. Moynihan, Beyond the Melting Pot: The Negroes, Puerto Ricans, Jews, Italians and Irish of New York City (1963; 2nd ed., Cambridge, Mass.: M.I.T. Press, 1970) presents a sociological study of each of these groups.
Maurice Davie includes in his World Immigration (New York: Macmillan, 1936) a useful bibliography listing immigrant biographies and fiction. David Bowers presents a series of essays concerned with immigrant and American culture and institutions in Foreign Influences in American Life (New York: Peter Smith, 1952). In The Rediscovery of the Frontier (New York: Cooper Square, 1970), Percy Boynton discusses various literary aspects of the treatment of the frontier in fiction and includes a chapter on “The Immigrant Pioneer in Fiction.”
A useful book on the Scandinavian immigrant in literature is Dorothy Skardal’s The Divided Heart: Scandinavian Immigrant Experience through Literary Sources (Oslo: Universitetsforlaget, 1974), which follows the implications of its title in terms of social history. Theodore Blegen’s Norwegian Migration to America, 1825-1860 (Northfield: Norwegian American Historical Association, 1931) is a standard work on its subject. For critical comment on O.E. Rolvaag, see Robert Steensma, “Rolvaag and Turner’s Frontier Thesis,” North Dakota Quarterly, 27 (1959), 100-04, which discusses Rolvaag’s attitude toward the frontier as “safety valve” and inspiration of democracy. Benjamin Wells discusses the work of H.H. Boyesen in terms of style and theme in his biographical essay, “Hjalmar Hjorth Boyesen,” Sewanee Review, 4 (1896), 299-311.
An interesting and enjoyable book relating to Swedes is by H. Arnold Barton, ed., Letters from the Promised Land: Swedes in America, 1840-1914 (Minneapolis: Minnesota UP, 1975), which presents both background and letters. Discussion of the work of Vilhelm Moberg can be found in Gerhard Alexis, “Moberg’s Immigrant Trilogy: A Dubious Conclusion,” Scandinavian Studies, 38 (1966), 20-25, in which he discusses the publisher’s inappropriate excisions and changes to the final two Swedish volumes of the tetralogy in order to make them into one English volume and so produce a trilogy.
The quality of the literature produced in America by German-speaking groups was not as high as that of the Scandinavians. Worthy of mention, however, is the Austrian Charles Sealsfield (Karl Anton Postl), whose best work (though not concerned with immigrants as such) is a western adventure novel, The Cabin Book; or Sketches of Life in Texas, trans. C.F. Mersch (New York: J. Winchester, 1844). An interesting comparison of literary approaches to, among other things, frontier characters can be found in Karl J. Arndt, “The Cooper-Sealsfield Exchange of Criticism,” American Literature, 15 (1943), 16-24. The Pennsylvania ‘Dutch’ (a German group who settled in the seventeenth and eighteenth centuries) produced a literature describing their customs. Helen Reimensnyder Martin’s Tillie: A Mennonite Maid, A Story of the Pennsylvania Dutch (1904; rept., Ridgewood, New Jersey: Gregg, 1968) is critical of many of the group’s folkways.
A wide variety of Italian-American authors and their works is discussed in Rose Green’s useful text, The Italian-American Novel: A Document of the Interaction of Two Cultures (Rutherford: Fairleigh Dickinson UP, 1974). The Italians produced several good minor novelists besides Guido D’Agostino and Mario Puzo. Pietro DiDonato, in his autobiographical novel Christ in Concrete (New York: Bobbs-Merrill, 1939), documents the difficulties of Italian immigrants in the corrupt construction industry. Later novels, This Woman (New York: Ballantine Books, 1958) and Three Circles of Light (New York: Julian Messner, 1960), depict social and religious themes as they apply to immigrants. In The River Between (New York: E.P. Dutton, 1928), Louis Forgione presents the clash between first and second generations in America. John Fante stresses the psychological problems of Italian-Americans in such works as Wait Until Spring, Bandini (New York: Stackpole Sons, 1938) and in the stories in Dago Red (New York: Viking, 1940). The most useful single text concerning Jewish life in America is Oscar Handlin’s Adventure in Freedom: Three Hundred Years of Jewish Life in America (1954; rept., New York: Kennikat, 1971). Nathan Glazer’s American Judaism (1957; rept., Chicago: Chicago UP, 1970) presents a short history of Jewish immigration to America and the changes to Jewish life which took place there. Louis Wirth’s The Ghetto (1928; rept., Chicago: Chicago UP, 1975) describes both the history and the psychological effects of the ghetto upon Jews in Europe and America. A most readable social history is The American Jews: Portrait of a Split Personality (New York: Paperback Library, 1969), by James Yaffe. A study combining immigration history and excerpts from first-person accounts of experience as an immigrant is Abraham J. Karp, ed., Golden Door to America: The Jewish Immigrant Experience (Harmondsworth: Penguin, 1977).
The most useful study concerning Jewish life in its major American centre is Irving Howe’s The Immigrant Jews of New York: 1881 to the Present (London: Routledge and Kegan Paul, 1976); the American title is The World of Our Fathers. Moses Rischin, The Promised City: New York’s Jews, 1870-1914 (Cambridge, Mass.: Harvard UP, 1962) is a much-respected scholarly study. A graphic text dealing with Jews in New York is Allon Schoener’s Portal to America: The Lower East Side (New York: Holt, Rinehart, Winston, 1967), which contains newspaper items, letters, and pictures, as well as authorial comment. Hutchins Hapgood’s The Spirit of the Ghetto (1902; rept., Cambridge, Mass.: Belknap Press, 1967) is a classic – but over-selective – look at the Lower East Side by a non-Jew. Jacob Riis is another observer of the slums, whose How the Other Half Lives (1890; rept., New York: Dover, 1971) deals with all of the major ethnic groups living on the Lower East Side. Two autobiographies presenting useful insights into the Jewish immigrant experience are Charles Reznikoff’s Family Chronicle (1929; rept., London: Norton Bailey, 1969), a fascinating account of the experiences of a father, mother, and son in Russia and New York, and Alfred Kazin’s A Walker in the City (New York: Harcourt, Brace, 1951), tracing the author’s movement from Brooklyn to Manhattan. Each of the following novels depicts the reaction of second-generation American Jews against the Judaism of their fathers. Haunch, Paunch and Jowl: An Autobiography (New York: Garden City Publishing, 1923), by Samuel Ornitz, is the “autobiography” of the protagonist and, like Meyer Levin’s The Old Bunch (1937; rept., New York: Citadel, 1942), shows this second generation’s intense desire for Americanization. In Summer in Williamsburg (1934), Homage to Blenholt (1936), and Low Company (1937), published in one volume as The Williamsburg Trilogy (1934-37; rept., New York: Avon, 1972), Daniel Fuchs presents first and second-generation American Jews struggling to succeed. Clifford Odets’ play Awake and Sing (New York: Modern Library, 1939) is a folk drama of a Jewish-American middle-class family, containing both first and second generations, trying to cope with the Depression. A portrait of an amusing and endearing Jewish immigrant attempting to learn English can be found in The Education of Hyman Kaplan (1937; rept., New York: Harcourt, Brace and World, 1965) and The Return of Hyman Kaplan (1938; rept., Harmondsworth: Penguin, 1968), by Leo Rosten (pseud. Leonard Q. Ross). A recent tour de force of Jewish and other humour seen in terms primarily of second-generation American Jews is Joseph Heller’s Good As Gold (London: Jonathan Cape, 1979).
The most useful single text of literary criticism of Jewish-American literature which has relevance to the immigrant experience is Allen Guttmann’s The Jewish Writer in America: Assimilation and the Crisis of Identity (New York: Oxford UP, 1971). Also useful, because of its stress upon the Jewishness of the literary works while remaining relevant to the immigrant experience, is Irving Malin’s Jews and Americans (Carbondale: Southern Illinois UP, 1966), while Bernard Sherman’s The Invention of the Jew: Jewish-American Education Novels, 1916-1964 (New York: Thomas Yoseloff, 1969) stresses novels of initiation in terms of different generations and has a strongly immigrant orientation. Postwar Jewish-American novels are discussed critically in the context of modern fiction as a whole in Stan Smith, The Comic Self in Post-war American Fiction (1981), the fifth pamphlet in this series.
6: Notes
- Hector St. John de Crevecoeur, Letters From an American Farmer (1782; rept., London: J.M. Dent, 1962), p. 43.
- Henry Seidel Canby et al., “Address to the Reader,” in R.E. Spiller et al., eds., Literary History of the United States, 3rd rev. edn. (1946; rept., New York: Macmillan, 1965), pp. xx-xxi.
- Maldwyn A. Jones, Destination America (1976; rept., Glasgow: Fontana, 1977), p. 98.
- Sophus K. Winther, “Moberg and a New Genre for the Emigrant Novel,” Scandinavian Studies, 34 (1962), 172.
- O.E. Rolvaag, Giants in the Earth, trans. L. Colcord and the author (1927; rept., New York: A.L. Burt, 1929), pp. 212, 249.
- Vilhelm Moberg, The Emigrants, trans. G. Lannestock (Stockholm: Albert Bonniers Forlag, 1951), pp. 154, 186.
- Joseph E. Baker, “Western Man Against Nature: Giants in the Earth,” College English, 4 (1942), 24, 19.
- Philip Holmes, Vilhelm Moberg: Utvandrarna (Hull: Studies in Swedish Literature, 1976), pp. 26-27. See also idem, Vilhelm Moberg (Boston: Twayne, 1980).
- Gerhard Alexis, “Sweden to Minnesota: Vilhelm Moberg’s Fictional Reconstruction,” American Quarterly, 18 (1966), 87.
- O.E. Rolvaag, pp. 297-98.
- Vernon Louis Parrington, Main Currents in American Thought (1927; rept., New York: Harcourt, Brace and World, 1958), vol. 3, pp. 393, 387.
- Julius Olson, “Rolvaag’s Novels of Norwegian Pioneer Life in the Dakotas,” Scandinavian Studies and Notes, 9 (1926), 48.
- Hjalmar Hjorth Boyesen, Falconberg (1878; rept., New York: Charles Scribner’s Sons, 1899), p. 23.
- Hans Mattson, Reminiscences: The Story of an Emigrant (St. Paul: D.D. Merrill, 1892), pp. i, 311.
- George L. White, Jr., “H.H. Boyesen: A Note on Immigration,” American Literature, 13 (1942), 366; Carl Wittke, “Melting Pot Literature,” College English, 7 (1945-46), 193.
- Philip Gerber, Willa Cather (Boston: Twayne, 1975), p. 21.
- David Daiches, Willa Cather: A Critical Introduction (Ithaca, N.Y.: Cornell UP, 1951), p. 28.
- Willa Cather, The Song of the Lark, new rev. edn. (1915; rept., Boston: Houghton Mifflin, 1937), p. vi.
- Willa Cather, My Antonia (1918; rept., London: Hamish Hamilton, 1962), p. 353.
- H.L. Mencken, “Four Reviews,” and Granville Hicks, “The Case Against Willa Cather,” both in James Schroeter, ed., Willa Cather and Her Critics (Ithaca, N.Y.: Cornell UP, 1967), pp. 9, 141.
- T.K. Whipple, “Willa Cather,” ibid., p. 40. See also Edward and Lillian Bloom’s useful Willa Cather’s Gift of Sympathy (Carbondale: Southern Illinois UP, 1962) and James Woodress’s excellent Willa Cather: Her Life and Art (New York: Western Publishing, 1970).
- James T. Farrell, Young Lonigan (1932), The Young Manhood of Studs Lonigan (1934), and Judgment Day (1935); published in one volume as Studs Lonigan (rept., New York: Avon, 1977).
- Guido D’Agostino, Olives on the Apple Tree (New York: Doubleday, Doran, 1940), pp. 26, 294.
- Mario Puzo, The Fortunate Pilgrim (New York: Atheneum, 1964), p. 77. See also The Godfather (New York: G.P. Putnam’s Sons, 1969).
- Abraham Chapman, ed., Jewish-American Literature: An Anthology (New York: New American Library, 1974), pp. xlvii-xlviii.
- Mary Antin, The Promised Land (Boston: Houghton Mifflin, 1912), pp. 37, 186.
- Ibid., p. xiii.
- Abraham Cahan, Yekl and the Imported Bridegroom and Other Stories of the New York Ghetto (rept., New York: Dover, 1970). This volume contains the novella Yekl (1896) and five of Cahan’s stories.
- Jules Chametzky, From the Ghetto: The Fiction of Abraham Cahan (Amherst: Massachusetts UP, 1977), p. 128. See also John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (New Brunswick, N.J.: Rutgers UP, 1955).
- Chametzky, p. 142.
- Abraham Cahan, The Rise of David Levinsky, introd. by J. Higham (1917; rept., New York: Harper and Row, 1966), p. 530.
- Ludwig Lewisohn, The Island Within (New York: Harper and Brothers, 1928), pp. 103, 147, 148.
- Michael Gold, Jews Without Money (1930; rept., New York: Avon, 1972), pp. 39, 24, 162, 168, 79.
- Leslie Fiedler, “The Jew in the American Novel,” in his To the Gentiles (New York: Stein and Day, 1972), p. 96.
- David Bronsen, “A Conversation With Henry Roth,” Partisan Review, 2 (1969), pp. 267, 268.
- Henry Roth, Call it Sleep (1934; rept., New York: Avon, 1965), pp. 153, 158, 437.
- Sheldon Grebstein, “The Phenomenon of the Really Jewish Best Seller: Potok’s The Chosen,” Studies in American Jewish Literature, 1 (1975), 25; Chaim Potok, The Chosen (New York: Simon and Schuster, 1967). See also Robert H. Fossum and John K. Roth, The American Dream (1981), the sixth pamphlet in this series.
- Isaac Bashevis Singer, Enemies: A Love Story (1972; rept., Harmondsworth: Penguin, 1977); Edward Lewis Wallant, The Pawnbroker (1961; rept., New York: Manor, 1973).
- Saul Bellow, Mr. Sammler’s Planet (New York: Viking, 1970), pp. 37, 308.
- Saul Bellow, The Victim (1947; rept., Harmondsworth: Penguin, 1971), pp. 121, 122.
- Bernard Malamud, The Assistant (1957; rept., Harmondsworth: Penguin, 1971), pp. 183, 112-13.
- Walter Shear, “Culture Conflict,” in Leslie A. and Joyce W. Field, eds., Bernard Malamud and the Critics (New York: New York UP, 1970), p. 208.
- Malamud, p. 20; Ihab Hassan, “The Qualified Encounter,” in Field, eds., Malamud, p. 200.
- Though Malamud’s later novels move away from the subject, he also explores immigrant themes in a number of his short stories: see his collections entitled Idiots First (1963) and The Magic Barrel (1958), both reprinted (Harmondsworth: Penguin, 1966 and 1968).
- Philip Roth, Goodbye, Columbus and Five Short Stories (1959; rept., London: Corgi, 1971), pp. 64, 6, 18.
- Ibid. For the critics, see Jeremy Larner, “The Conversion of the Jews,” Partisan Review, 27 (1960), 761-68, and Irving Howe, “Philip Roth Reconsidered,” Commentary, 54 (1972), 69-77.
- Malcolm Bradbury, The Expatriate Tradition in American Literature (1982), the ninth pamphlet in this series.
David Murray, Modern Indians
BAAS Pamphlet No. 8 (First Published 1982)
ISBN: 0 9504601 8 4
- Indians, Real and Imagined
- Outlasting Government Policies
- The Attack on Tribalism, 1887-1934
- The Coming of a New Deal
- The New Deal in Operation
- Termination and its Termination, 1946-1970
- Urban Indians and Internal Colonialism
- Pan-Indian Movements
- Religious
- Political
- The Cult and Culture of Native Americans
- Epilogue: Tribalism and the Future
- Guide to Further Reading
- Notes
British Association for American Studies All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: Indians, Real and Imagined
In the popular imagination, supplied with its images by fiction, film and television, the North American Indian effectively disappears at the end of the nineteenth century, with the end of armed resistance to white encroachments on his land. The massacre at Wounded Knee in 1890 has become for many a symbolic event, signalling the end of the Indians as a proud and independent people. It has become an occasion now for regrets about the ferocity and confusion which accompanied westward expansion and for moralizing about the blinkered and racist attitudes of settlers and government officials alike.[1]
But this elegiac mood, comfortable as it may be, is dangerous if it obscures the fact that the “vanishing American,” as he was known, did not in fact vanish. It is true that many tribes had been wiped out, and the total Indian population in the United States had been drastically reduced from a figure variously estimated at between one and ten million before white contact to 248,253 in 1890.[2] By the 1920s, however, the figure was increasing again, until in 1970 it was 827,108, including the Aleuts and Eskimos of Alaska.[3] The United States Government, through its Bureau of Indian Affairs (BIA), today recognizes almost five hundred separate tribal entities, over three hundred of which still function as quasi-sovereign nations under treaty status. These range in size from tiny groups of a dozen or two right up to the Navajo, with a population of some 132,000 inhabiting a sixteen-million-acre reservation. In addition, modern Indians are among the fastest-growing ethnic or racial groups in the United States, with the Navajo growing fastest of all.
But if they have not yet disappeared, it can still be argued that they have never gained any identity in the public mind to replace the earlier image of the proud but doomed warrior. The result has been a lack of public awareness which has had disastrous cultural and economic consequences. On every available indicator—poverty, illness, life-expectancy, educational attainment—Indians are the most deprived group in the United States. A modern Indian spokesman has complained that “to be an Indian in modern American society is in a very real sense to be unreal and ahistorical.”[4] This pamphlet is an attempt to make modern Indians real and historical and, since they affect Indians so fundamentally, to trace the changing attitudes and policies of whites towards them. While some of the analysis applies equally well to Canadian Indians, the differences between Canadian and United States policies have been substantial enough to make it impractical to deal with them together here.[5]
The fixing of Indians in a single historical role, as simple savages overwhelmed by the relentless march of progress, persistently presented not only in the entertainment industry but in standard history text-books,[6] has had the effect of obscuring the great diversity and richness of the original Indian cultures. These included the highly developed agricultural societies of the Pueblos in the Southwest, the fishing communities of the Northwest, and the organized confederacies of the Northeast Woodlands—as well as the nomadic hunters of the Plains, on whom the stereotype of the Indian came to be based. The aristocratic society, based on lineage and the accumulation of wealth, found amongst the Natchez tribe in the Southeast and the fishing tribes of the Northwest, would have been inconceivable to the hunters and gatherers of the Great Basin, who had few personal possessions and little centralized leadership. The torture of enemies practised by the Iroquois would have been even more alien to Pueblo Indians than to Europeans, and the organized and restrained religious ceremonial of the Pueblos was itself very different from the vision-seeking and shamanistic ecstasies found elsewhere.
Similarly diverse were the reactions and adjustments of the various tribes to white contact. The Hopi of Arizona, shielded by their remoteness from Spanish and then American interference, have maintained their original culture largely intact. Many other Pueblo communities, subject to four centuries of white control, have developed a complex combination of Christian and traditional ceremonial and social life. Similarly, the Five Civilized Tribes of the Southeast[7] adapted promptly to new conditions in the early nineteenth century, becoming efficient farmers, producing their own newspaper – even owning black slaves. The hypocrisy of many white justifications for land-grabbing was revealed when they were compelled to remove to Oklahoma in the 1830s, for instead of just roaming over the land, these Indians had ‘used’ it properly for settled agriculture, as effectively as their white successors would. Once in the West these particular tribes adjusted successfully once more, but other tribes resisted the transition bitterly. The pattern of Indian response to the new situations created by whites has continued to be a complex one, varying between tribes and between individuals within tribes, but this complexity has been obscured by the stereotype of the “vanishing American,” of a people whose demonstrated military weakness supposedly indicated a more general unfitness to survive in the face of progress – a stereotype which has proved one of the most important and insidious legacies of the past.
Contemporaries assumed that once their traditional way of life was destroyed the Indians would either die out altogether from disease, demoralization and alcohol, or become gradually absorbed, if lucky and talented enough, into white society. Reservations and lands set aside for Indians were accordingly seen as temporary arrangements rather than independent and self-sustaining units. An absolute distinction between a doomed but coherent Indian society and a demoralized remnant, vanishing either literally or culturally, persists in the white view of Indians up to the present, with damaging consequences. It ignores the actuality of cultural continuity and of creative adaptation to another culture. As Jeanne Guillemin argues, “when a people survives over generations, the first questions asked should be about continuity, not discontinuity.”[8] At the heart of the contrast between past cultural coherence and present degradation lies the idea of Indianness as essence,[9] with the corollary that if the essence is diluted by acculturation it disappears altogether. This makes an intriguing contrast with the traditional definition of a Negro, where as little as one-sixteenth Negro blood could be decisive and the style of life was totally irrelevant. Race is clearly not the only consideration in the definition of Indianness, as it is for negritude.
Concern with the essential or pure Indian has always been at the heart of much pro-Indian feeling and activity, and it is significant that early anthropologists, in their attempts to collect material from cultures they thought were disappearing, paid little attention to the cultures that were surviving and adapting. Anthropology for a long time was disinclined to examine culture-change, being more concerned with reconstructing the pre-contact ‘ideal’ culture from salvaged material. One result was the use of an “ethnographic present tense”—a way of writing about an abstracted, reconstructed culture as if it was existing in the present. This methodological assumption that one can deal with a culture without regard to history is crucial for much anthropology, in that the opposition between civilized and primitive has largely revolved around it. Consequently, in anthropologists’ eyes, once an Indian society was affected by white contact, once it had entered history, it was contaminated, ceased to be primitive, but also ceased to be anything else. As a result, while the study of blacks in America developed as sociology and the black community produced its own eminent sociologists, the Indian experts were anthropologists, and many Indian intellectuals were closely involved in anthropological work. The effect of this has been to strengthen the polarization of past and present, of pure and adulterated, and to draw attention away from the contemporary actualities of Indian society.
A more important factor in the establishment of this ‘essentialist’ idea in the popular mind, though, was the development of the ‘Western’ as a genre, in fiction and then in film, which effectively ‘froze’ the image of the Indian. Since the Western deals with a fixed period of the past, often taking on the ahistorical qualities and the fixed categories of allegory, Indians have had a clearly defined role – usually savages, but occasionally noble ones. While silent films included some sympathetic portrayals and dealt with a number of different groups of Indians, sound Westerns, whether feature films or the serials which reached their peak in the late 1930s, have offered almost exclusively hostile portrayals, relying on the stereotype of Plains Indians—feathers, horses, scalping and all. In this genre Indians operate as an undifferentiated group, almost an element of the landscape, rather than as individuals.[10] Ability to deal with them and understand them is on the same level as ability to cope with natural dangers and is the hallmark of the experienced scout. As mere landscape they do not have an articulate voice—in fact, since most Indians were played by whites or Mexicans, very few film Indians spoke in any actual Indian language. Indeed, one film serial solved the problem by reversing the film, thus transforming the actors’ English into something alien while preserving the synchronization.[11]
All genres operate within conventions, of course, and we need not expect Westerns to operate as realism, or condemn them for using stereotypes rather than historically accurate and differentiated portrayals. In literature or film, the Indian has been seen as a repository of a set of values, both negative and positive, rather than as an individual. Genres like the Western develop and feed upon themselves and their own conventions, exploring issues at an abstract level, using stock characters and types. The trouble is that, while cowboys, sheriffs and gunfighters live only in the mythic past and hardly threaten the identity of modern whites, modern Indians are made invisible by the presence of their mythic predecessors. In this way, it has been argued, they are different from the inhabitants of other genres, like Transylvanians or vampires, and the use of stereotyped Indians unfortunately has a social and political dimension it does not have in some other cases.[12]
The most recent revisions of the genre have reversed and subverted its conventional values in spoof and anti-heroic Westerns. They have challenged many of its own assumptions, have often changed the role of the Indians by giving them fuller and more sympathetic roles and by presenting them individually as well as in groups. At its worst this approach produces a sentimental portrayal of noble savagery which just reverses the stereotypes rather than abolishes them. At the time of American involvement in Vietnam, the revisionist Western was also used to examine white American policies and attitudes towards Indians as part of a larger attitude towards other races, and to present Indian policy as the first part of a developing imperialism. Indians and others have pointed out the frequent descriptions of the war in Vietnam in terms borrowed from Westerns.[13] It is as part of a re-examination by Americans of their own history that this new type of Western and the huge success of Dee Brown’s Bury My Heart at Wounded Knee (1971) must be seen, rather than as a proper recognition of present-day Indians and their situation. The more relevant contemporary parallel, as far as Indians are concerned, is not between nineteenth-century Indians and the twentieth-century Vietnamese but more importantly, and more complexly, between contemporary Indians and underdeveloped colonial peoples in South America and Africa. In the following chapters which outline white policies since 1887 and Indian responses to them, the key issues are not armed confrontations, genocide or physical suppression, so much as assimilation, paternalism and control of economic resources. The effects may be equally serious, but are harder to evaluate, particularly if we are hampered by images appropriate only to the earlier period of military conflict, expropriation and forcible resettlement.
2: Outlasting Government Policies
i. The Attack on Tribalism, 1887-1934
If the massacre at Wounded Knee in 1890 remains a symbol of the end of Indian armed resistance and, popularly, the end of the ‘real’ Indian, a lesser-known event, the passing of the Dawes Act in 1887, was equally significant, and certainly more important for the new terms of Indian existence. In particular it anticipated what was to become a permanent characteristic in this century—that bewildering mixture of the philanthropic and the predatory whereby all measures, however harmful in effect, are presented as being in the best interests of the Indians.
Senator Dawes’ proposals were seen by philanthropists as an attempt to plan a possible future for Indians which would be better than the neglect and demoralization of reservation life. Many tribes had been removed from their original homelands to the alien land of a reservation and once there were expected to become farmers rather than hunters. The combination of inexperience, poor land, inadequate provision of equipment, and a deep antipathy to work considered demeaning for warriors and hunters, ensured the failure of farming and a consequent dependence on the government agent for the distribution of rations. This meant that the agent became the ultimate authority, subverting the role of traditional leaders and regulating behaviour by handouts or punishments. In addition, the Indians’ claim to the land they occupied was still liable to attack by settlers and railroads using legal or illegal means, a state of affairs which had prompted General Sherman to define a reservation as “a tract of land set aside for the exclusive use of Indians, surrounded by thieves.”[14]
Rapid changes in the Indians’ situation, from independence to something between objects of charity and prisoners of war, had left them extremely vulnerable, not least because of the complexity of their legal status. As original sovereign nations, Indian tribes had always dealt with the federal government rather than with state governments or individual citizens. In particular, treaties or agreements over land could be made only by the federal government. Defeated tribes were made subject to federal authority exercised through the Bureau of Indian Affairs, which was part of the War Department until 1849, when it was transferred to its present location in the Department of the Interior. The BIA’s role was, and remains, to administer federal programmes as directed by Congress, and to act as trustee for Indian resources, mainly land. It has always been a position of considerable power, whether used despotically or paternalistically, and since most Indians on reservations in the nineteenth century were neither United States citizens nor members of another independent nation they came increasingly to be seen as wards of the BIA. Chief Justice Marshall’s description in 1831 of Indian tribes as “domestic dependent nations” whose “relation to the United States resembles that of a ward to his guardian” was an early attempt to describe the newly developing status of Indian tribes, and subsequent legal decisions tended to fix upon this idea of dependency and play down the idea of the inextinguishable sovereignty of Indian nations which Marshall developed in the later important case of Worcester v. Georgia in 1832. As a result the Indians had the worst of both worlds. As a recent re-evaluation of Indian legal history points out, Indians were “enough ‘within’ the United States to be as much subject to Congress as citizens, but enough ‘outside’ the United States to lack constitutional protection. A more ideal legal status for tribes could not have been demanded by those bent on forcing them to be white.”[15]
The Dawes Act (or, more properly, the General Allotment Act of 1887) proposed to end this situation by the simple but Draconian method of making Indians into American citizens. If Indians in their original form could play no part in an expanding and dominant white civilization, then they must either die out in poverty and demoralization on reservations or join the mainstream of American life, and the ways in which they were to be encouraged to become Americans rather than Indians reflected clearly some of the ideological assumptions which had generated much past conflict between whites and Indians.
Under the Act each head of a household was to be allotted 160 acres of land, with 80 acres being given to single persons over 18 and to orphans. For twenty-five years the land would be held in trust by the Secretary of the Interior and could not be sold. After that time it would become fully the Indian’s property, and he would then be subject to all the normal state and federal laws. This was seen as a way of encouraging Indians to see themselves, and be accountable for themselves, in individual and nuclear family units rather than as a tribal entity holding its land in common—anathema to traditional American individualism. Theodore Roosevelt, in his 1901 State of the Union message, promised to split up and allot tribal funds as well as tribal lands. He saw the original Act as a “mighty pulverizing engine to break up the tribal mass. It acts directly upon the family and the individual.”[16] It was presented in general as something beneficial to both Indians and whites. “Shall he remain a pauper savage, blocking the pathway of civilization, an increasing burden upon the people?” asked one proponent of the bill. “Or shall he be converted into a civilized taxpayer?”[17]
The Act, it was felt, would at least give the Indians a chance of becoming civilized, guaranteeing them a livelihood, some land, and the rights and protections of full citizenship. What exactly being a civilized American entailed is well revealed in a ritual for admission to citizenship specially designed by whites for Indians. The Indian was forced to give up his name and take a white name. He then shot an arrow, and was told “you have shot your last arrow. . . . Take in your hand this plow [for women it was a work-bag]. This act means that you have chosen to live the life of the White man—and the White man lives by work.” He was then given a purse since “the wise man saves his money so that when the sun does not smile and the grass does not grow, he will not starve,” and swore allegiance to the flag.[18] Clearly revealed here are the assumptions about land, work and thrift which the Dawes Act hoped to impose on Indians whose traditional cultures notably lacked them. For Indians land was not a commodity to be owned, divided and exploited so much as a source of spiritual power, held in a kind of trust by the tribe as a whole as an expression of the benevolence of a supernatural power or culture hero. The faith in the interlocking strengths of the tribe, the land and supernatural power must have made the American idea of individual self-sufficiency through thrift appear very alien.
In retrospect, the failure of the Act to create American citizens in one generation was inevitable, but even in practical terms the scheme was misconceived. Most of the reservations were grasslands too dry or infertile for farming, and even though there was provision within the Act to allow for larger, more economically feasible allotments for grazing in these cases, most allotments were still too small to be viable. Many Indian farmers went into debt, mortgaged their land to obtain tools or seed, and, when the land was fully theirs, were forced to sell or lease it to whites. So the overall effect of the Dawes Act was not to create self-sufficient citizens but to release to whites large areas of land. Furthermore, one of the key provisions of the Act authorized the federal government to buy up the remainder of the land after the allotments had been made and use the money, or keep it in trust, for the benefit of the tribe. By this means and by the sale of individual allotments by Indians, Indian land holdings were reduced by two-thirds, from 138 million acres in 1887 to 47 million acres in 1934, when the policy was changed. In addition, the land most in demand by white buyers was of course the most fertile or mineral-rich, with the consequence that by 1934 much of the land remaining in Indian hands was of poor quality. Some warning voices were raised at the time of passage predicting this loss of Indian lands, and it is now hard to distinguish how much of the support for the Act was benevolent optimism and how much hard-headed land-grabbing.
In one area at least, though, reformers and exploiters were united, and that was in the assumption that the tribal Indian must go. In consequence the attack on reservations and common ownership of land was paralleled by a determined attack on Indian values in education. Boarding-schools where Indian children were forbidden to speak their own language or maintain their traditional style of dress and hair-length were seen as means of creating American citizens. Traditional religions—which were not even recognized as such and were dismissed as pagan or heathen superstition—were replaced by Christianity, and the ideal product of a Christian education was an Indian who renounced his own family and background, and succeeded in the white world. Education thus became synonymous with civilization and competence in white society, and it was assumed to be unfair to a group not to equip its members fully in the ways of the dominant society and eradicate their ‘backward’ ways.
This insistence on assimilation at all costs was partly an expression of a missionary impulse to civilize and Christianize the savages, but it was also related to an ideal of America as a melting-pot. Whereas most European immigrants chose to become Americans and had a great deal to gain from citizenship, for Indians the benefits were less obvious. Neither the Dawes Act nor the extension of citizenship to all Indians in 1924 released Indians from economic dependence on the federal government. Nor did they manage to destroy Indian tribes as communities. Perhaps in the end the most significant assimilation to take place in this period was the ‘assimilation’ of a great deal of Indian land into white ownership.
ii. The Coming of a New Deal
The assimilationist assumptions underlying the Dawes Act were increasingly being challenged by the 1920s. Partly in enforced recognition of the fact that the Indians who were supposed to disappear had manifestly failed to do so, the Secretary of the Interior commissioned an independent report, published in 1928 as The Problem of Indian Administration. This large and painstaking study, usually known as the Meriam Report after its chief architect, showed clearly both the failure of government policies, including the General Allotment Act, and the desperate problems facing the Indians in the fields of economics, health and education. Reflecting the changing climate of opinion, the Report envisaged the possibility that Indian culture should continue in its own right, and while it still assumed that the aim of government policy should be to help Indians adjust to white society, it drew attention to the developing tools of social science as instruments to make this possible.
The Meriam Report needs to be seen against a background of increasing activity on the part of reformers and intellectuals in support of Indian claims to retain their own culture and autonomy. From the early years of the century there had been an increasing interest in the possibilities of cultural pluralism, and as a result the values rather than the wretchedness of Indian life began to be emphasized. Seeing American culture as thin and bland, many intellectuals increasingly valued the older ethnic characteristics being brought to America by immigrants, and began to give them primacy. Rather than seeing the ideal society as based on voluntarist individualism, they began to see the values of conservative institutions and heritages as supportive and creative of personality. A society or culture was more than just a collection of economically motivated individuals.
This organicist approach to cultures and societies, with its view of a culture as a self-sustaining whole, had important ramifications in attitudes towards immigrant and urban communities,[19] but it was also developing in anthropological theory, largely through the anti-evolutionist work of Franz Boas and his many followers. The supportive and integrated nature of traditional cultures was contrasted with the anomie of modern society (though Boas himself avoided such generalizations) and Indians became subject to a different sort of attention. Precisely those characteristics which had previously marked them off as primitive and inferior—their lack of a historical dynamic, their lack of individualism, their intuitive and religious rather than rational mode of thought—became admired. As a result it was natural that the intellectuals would look to the Southwest and to the least assimilated or, to see it in the new way, least disintegrated cultures.
The agricultural cultures of New Mexico and Arizona had managed to escape the effects of allotment, and although affected by successive white cultures they had maintained a traditional culture to a remarkable degree. It was this culture, as well as those of South America, which greatly influenced D.H. Lawrence, through his stay at Mabel Dodge Luhan’s artists’ colony at Taos in New Mexico. His reactions were characteristic of the intellectual climate, both in their idealization of the culture and, equally important, their denigration of Indians who failed to correspond to this idealized picture.
The Indian who sells you blankets on Albuquerque station or who slinks around Taos plaza may be an utter waster . . . . He may have broken with his tribe, or his tribe itself may have collapsed finally from its old religious integrity and ceased, really, to exist. Then he is fit for rapid absorption into white civilization, which must make the best of him. But while a tribe retains its religion and keeps up its religious practices, and while any member of the tribe shares in those practices, then there is a tribal integrity, and a living tradition.[20]
Once again, it is all or nothing. Proper, pure Indians are noble. Anything less does not deserve to exist. Eventually many of the Indians subjected to this idealization tired of their role. One publicly offered to exchange his home for Mabel Luhan’s, with its mod. cons., when she tried to block the modernization of the Taos Pueblo by the introduction of sanitation.
Excesses apart, at least this approach encouraged the survival of Indian cultures, and it was turned into concrete practice in the work of another visitor to Taos, John Collier. Having worked at developing and sustaining communities among immigrant groups in New York City, he became increasingly pessimistic about modern white society and found in the Pueblo Indians something of what Lawrence found. In his autobiography Collier describes the impact of their communities:
The discovery that came to me there in that tiny group of a few hundred Indians, was of personality-forming institutions, even now unweakened, which had survived repeated and immense historical shocks, and which were going right on in the production of states of mind, attitudes of mind, earth-loyalties and human loyalties, amid a context of beauty which suffused all the life of the group . . . . It might be that only the Indians, among the peoples of this hemisphere at least, were still the possessors and users of the fundamental secret of human life—the secret of building great personality through the instrumentality of social institutions. And it might be, as well, that the Indian life would not survive.[21]
During the 1920s Collier had fought against the federal government in support of Indian groups, and in particular against the Bursum Bill, which attempted to deprive Pueblo Indians of some of their land and water rights. He developed a deep and genuine commitment not only to Indians and their cultures but to a view of human society which was at odds with the atomized and commercial society he saw around him. These views, combined with his experiences in the Southwest, influenced him to place an exaggerated stress on communal as opposed to individual life. To some extent he already knew what he wanted to find among the Indians and was determined to find it. Where earlier observers working from a Social Darwinist perspective had seen Indians as doomed, Collier saw them in the context of his reading of Kropotkin’s Mutual Aid as an example of how to survive by community rather than be destroyed by individualist competition. In 1933 Collier took office as Commissioner of Indian Affairs under Roosevelt, and the following year the idea of culture as an organic whole was given its most influential anthropological rendering in Ruth Benedict’s Patterns of Culture. Here the communal Pueblo culture was treated much more sympathetically than the apparently possessive individualism of the Northwest coast fishing cultures. Benedict’s views have since been challenged, and it is now obvious that her book reflected her own intellectual climate as well as the cultures she described.[22] This sort of idealization of one group had serious consequences, though. It led Collier to generalize about Indians from the single model of the Pueblo Indians, with their tight social structure, highly organized religious hierarchy and freedom from the effects of allotment. He then tried to apply his generalizations to other groups like the Plains Indians, whose traditional culture inclined more towards individual action and expression, and who had in many cases made fairly full adjustments to a system of individual land-holdings. In trying to develop policies which would repair the damage done by allotment, in terms both of loss of land and break-up of cultures, he tended to force Indians into his preconceived mould. Nevertheless, his policies were perhaps the first serious and generous attempt to help the Indians which was not also a pretext for taking their land.
iii. The New Deal in Operation
Collier’s sweeping proposals for an Indian New Deal became, in an adapted and curtailed form, the Indian Reorganization (or Wheeler-Howard) Act of 1934. It applied to all states except Oklahoma, and was eventually accepted by 192 of the 263 tribes who voted on it. It included:
- the end of the policy of individual allotment and of the subsequent alienation of Indian land;
- comprehensive plans for the setting up of tribal governments, thereby developing a degree of Indian self-determination by granting “certain rights of home rule”;
- the establishment of a revolving credit fund, which, together with an increase in the land base, would help to improve the economic life of the reservations.
These were undoubtedly helpful and well-intentioned proposals, which in the end helped to ease the extreme poverty and demoralization of many Indians, preserving their land-holdings and setting in motion the complex process of achieving a degree of self-determination. However, the Act also envisaged a more active role for the Bureau of Indian Affairs, placing a greater reliance on the expertise of social scientists and planners. This contradiction generated a great deal of friction and animosity. As one historian remarks, “The reconciliation of local democracy at the tribal level with the bureaucratic expertise needed in Washington, D.C., to run a complex colonial policy was a fundamental challenge that Collier failed to meet.”[23]
The most extreme example was Collier’s insistence on trying to reduce the severe over-grazing of Navajo land by forcing the Navajo to reduce their stock. Even though reduction was technically the best remedy and compensation was offered, the seemingly arbitrary slaughter of sheep, which represented their main security, alienated the Navajo. Their attitude to land and livestock involved considerations other than just the economic, and the apparently wasteful killing of animals, based on cold estimates of cost-effectiveness, underlined the difference in attitudes between the government experts and the people they were helping. Collier’s insistence on imposing his policy pointed up the streak of authoritarianism in his approach. Rather than neglect, the Bureau was now offering advice—but since the experts always knew best, the advice had to be taken—and of course the Commissioner had the last word, tribal self-government or not.
In his determination to develop the economic resources of the tribes, Collier sometimes ran roughshod over Indian groups suspicious of the big companies who were all too happy to exploit mineral resources on land leased from the Indians—an issue later to become of central importance. Similarly, his prejudice in favour of tribally-held rather than individually-held land meant that he was accused by some Indians of trying to return them “back to the blanket.” In addition he was committed to the double task of introducing the teaching of traditional native values in the schools while dismantling the boarding school system which had been designed to encourage children to take on white values. These policies, some Indians complained, would produce children and adults ill-equipped to deal with life in modern white America.
One further, and important, irony about Collier’s attempts to develop and sustain genuine community deserves attention. The form of tribal self-government which was encouraged was of course that which seemed to whites most representative. However, the more traditional Indians were suspicious, if not downright dismissive, of elections and constitutions, seeing them as relevant to white rather than Indian forms of government. As a result, those Indians who co-operated and achieved power in tribal councils were individuals who already inclined towards white patterns of life, and in many cases had only small amounts of Indian blood. This became an important issue on reservations in the 1960s and ’70s, when objections were raised to the financial and political power of tribal councils dominated by white-oriented Indians. These “progressives,” as they have become known, were felt to be unrepresentative of traditional Indians, whose interests they ignored and who consequently remained the most deprived group. Ironically, while the New Deal administration was concerned to support traditional ways, the long-term effect of strengthening the BIA was to improve the political and financial position of “progressives” rather than traditionalists.
Clearly Collier’s programme was not lacking in benevolent intentions, in expertise, or, in its earlier stages, in resources. What it lacked was a real understanding of the paternalistic and colonialist nature of its policies and their direct political implications. Steve Talbot’s verdict is perhaps too sweeping, but it highlights the terms in which the Indian New Deal was to be seen by later generations. “In essence the Act marked a shift from the government’s policy of direct rule of reservations as internal colonies to one of indirect rule, a shift from outright colonialism to a system of neo-colonialism.”[24]
iv. Termination and Its Termination, 1946-1970
As government spending increasingly flowed into the war effort after 1941, Collier’s schemes became starved of funds. In addition, a growing opposition to New Deal policies had developed, and by the end of the Second World War the reaction had fully set in. Pressure began to be exerted not to give financial support to Indians as special communities, but to help them to full and speedy independence as American citizens. The formation and operation of the Indian Claims Commission revealed the changes in government thinking. The Commission was set up in 1946 to hear and decide claims against the federal government for broken treaties or agreements, usually about land, with provision for financial recompense if the claim was justified. Previously Indians needed to obtain a special Act of Congress to pursue a claim, and such an Act could severely limit the terms on which the claim could be made to the Court of Claims. The original intention behind the Commission was, in response to recommendations from both Meriam and Collier, to make it easier for Indians to obtain justice. Once it was set up, though, many whites came to see it as a way of clearing up all claims against the government so that, the slate cleared, federal responsibility for the Indians could be completely ended. This was certainly a move away from the New Deal approach, which, while committed to self-determination, also spelled out a long-term role for the BIA. The Claims Commission turned out to be more important and longer-lived than had originally been envisaged, and was even extended until 1978, by which time some 484 of the 615 claims had been decided, with awards totalling $669,200,000. Remaining cases were then transferred to the Court of Claims.
Ironically, an agency apparently established to settle injustices has come to be seen as part of a conspiracy against Indian rights. In this way scepticism arises about the rhetoric of benevolence in which all actions concerning the Indians have been presented, nowhere seen to better effect than in the important policy of Termination developed in the early 1950s. In 1953 House Concurrent Resolution 108 aimed to terminate all federal services to the Indians “at the earliest possible time.” One of the architects of termination policy, Senator Watkins of Utah, saw it as a “return to the historic principles of much earlier decades” after the deviation of the New Deal. The long-term movement was towards “full freedom,” and he saw the Claims Commission’s role as “to clear the way toward complete freedom of the Indian by assuring a final settlement of obligations—real or purported—of the federal government to the Indian tribes and other groups.” The emphasis here, for Watkins, was on the word “final,” and it reveals the rationale behind termination policy. After years of subjection to bureaucracy and government, went the argument, the Indians were now being given their full rights, and Watkins’ terms were similar to those used earlier in support of the Dawes Act:
Self reliance is basic to the whole Indian-freedom program. Through our national historic development the Indian was forced into a dependent position, with federal government more and more, as America advanced westward, tending to sublimate his natural qualities of self-reliance, courage, discipline, resourcefulness, confidence, and faith in the future. Congress has realised this and has steadily acted more positively to restore to the Indian these qualities.[25]
Translated into practice, this meant that twelve tribes considered to be ready for independence were to have all special federal services discontinued. Indians would be liable for taxation on their land and would lose their specially provided health and education services. In return for being divested of all treaty rights, each Indian would be given a cash settlement, representing a share of the liquidated assets of the tribe—since the tribe as a property-holding entity would cease to exist.
However unsatisfactory the paternalism and bureaucracy of the BIA, it turned out to be preferable to this sudden “freedom.” Most of the tribes were ill-prepared for such a sudden transition. Individuals given large sums of money often used it unwisely. Land and other assets were sold off, often at low prices, and those who did try to hold parts of the original lands together and operate as a business, as in the case of some of the Paiutes of Utah or the Klamath of Oregon, found themselves administered, not always paternalistically, by banks.[26] Welfare provisions which should have been available at a state level once federal services were discontinued were difficult to obtain because in many cases the Indians did not have the relevant documents, like birth certificates.
The well-documented experience of the Menominee Indians of Wisconsin reveals many of these intractable problems. Owed payments of $1,500 apiece by the Treasury for an earlier claim, they were told they would be paid only if they accepted termination. Having accepted, they received only half of this sum, but this was the least of their problems. Occupying a 234,000-acre reservation, the Menominee had for some time been able to finance many of their activities from their substantial tribal assets, but they were unprepared for the enormous expense involved in termination. To preserve themselves as a unit they became a county of Wisconsin, and the business operations became incorporated as Menominee Enterprises, Inc. This, however, made them liable to the same taxes and liabilities as all other counties, and since it was a very poor county, the Menominee tax-base was totally inadequate for what was required of it. In addition, they lost their previous exemption from hunting and fishing restrictions, an important additional source of food, especially to the poor. The unemployment rate was 50 per cent and badly needed welfare programmes had to be reduced. In this case, as in others, the “freedom” promised amounted to freedom to sell tribal land and assets, resulting in yet another reduction in the economic base.
Termination policy fell into disfavour even with white politicians by the 1960s, when it was clear that only tribes that were fully ready, and genuinely willing, to terminate their relationship with the federal government should do so. Between 1972 and 1976 Congress passed several acts improving its provision of educational, health and financial assistance, and even returned lands to Indians, including the previously terminated Menominee reservation in Wisconsin. This followed President Nixon’s important statement in 1970 of his policy of “self-determination without termination.” As he pointed out, enforced termination was practically disastrous and, more important, morally indefensible:
Termination implies that the Federal government has taken on a trusteeship responsibility for Indian communities as an act of generosity toward a disadvantaged people, and that it can therefore discontinue this responsibility on a unilateral basis whenever it sees fit. But the unique status of Indian tribes does not rest on any premise such as this. The special relationship between Indians and the Federal government is the result instead of solemn obligations which have been entered into by the U.S. Government . . . . To terminate this relationship would be no more appropriate than to terminate the citizenship rights of any other Americans.
The effect of termination policy has been long-lasting, and not only on those groups directly involved. Long after it was discontinued a mistrust of federal attempts to encourage independent action on the part of Indians remains. It created a situation where the more successful and independent a tribe became, the more vulnerable it was—or felt it was—to sudden and arbitrary termination. There seemed no way out of the trap of paternalism and dependence as long as the alternative was the threat of the abrupt withdrawal of special status. The task of successive recent administrations has been, as Nixon put it, to “make it clear that Indians can become independent of Federal control without being cut off from Federal concern and Federal support.”[27]
3. Urban Indians and Internal Colonialism
Traditionally we think of Indians as living on reservations, and so far this study has concentrated on these groups, but during the twentieth century the number of urban Indians has been steadily increasing until it now comprises about half of the entire Indian population. Partly through relocation programmes and partly as a result of individual actions, Indian communities have developed in many major cities, with the largest numbers going to Los Angeles (an estimated 50,000, drawn from tribes throughout the United States), San Francisco, Chicago, Dallas, Denver and Minneapolis-St. Paul. While some urban areas, like Minneapolis-St. Paul, have fairly homogeneous Indian populations, most have a mixture of groups, and the development of pan-Indian thinking is one important result of this mingling of tribes.
On moving to the city, Indians find themselves having to survive unaided and resourceless in an unsympathetic white society, and so they fall back on these new communities of similarly uprooted tribesmen for immediate help with work and accommodation. They are frequently invisible in many surveys: they neither make much contact with official bodies, nor do official bodies seem very interested in making much contact with them. Urban Indians in effect experience the abrupt loss of the protection and administration of the Bureau of Indian Affairs with which termination policy threatened reservation communities, since BIA services cover only Indians on or near reservations.
In establishing the pattern of services for Indians in the 1930s, the BIA followed the Meriam Report findings of 1928, which seemed to suggest that Indians moving to cities assimilated rapidly and achieved living standards comparable to those of whites. Later evidence clearly suggests otherwise, but only relatively recently has the BIA shown much awareness of urban Indians. Thus, in spite of concerted efforts by the government to relocate young Indians in cities, there was a lack of real planning and provision for them. The relocation policy of the 1950s can be seen as complementary to termination. Establish Indians in cities, the argument went, and you have effectively solved the problem. Like all other immigrants, they will find their place. Looked at more generously, it can be seen as a reasonable response to the chronic unemployment problems on reservations caused by lack of industry and investment—itself often the result of the isolation and lack of good communications on many reservations. In addition, an increasing population put additional strain on the often shrinking land base. Unfortunately, since many of those who went to the cities were untrained and unprepared they failed to find work, and the unemployment problem was merely transferred from reservation to city.[28]
The actual living standards of urban Indians are not always easy to compare with those of Indians on reservations. Certainly their average wage is higher—in 1969 it was almost double the reservation average and was closely comparable with average black income—that is, about 60 per cent of average white income. At this rate, “about 20% of urban Indian families had incomes below the poverty line in 1969: the proportion was more than twice that high among rural Indian families.”[29] By itself, this would suggest that, while Indians as a whole are the poorest group in the country, rural Indians are markedly worse off than their urban counterparts, but set against this is the fact that free medical care is available on reservations, while in many cases Indians are living on their own allotted or tribal land and so have no rent. In addition, prices are often higher in cities. While welfare services are theoretically available in cities to all who need them, Indians are often reluctant to use them, and administrators often feel that they should be dealt with on the reservation anyway. The result is that Indians commonly work in cities until they become ill or unemployed and then return to the reservation, if only temporarily, for medical treatment.
The average school-leaving age of Indian migrants is higher than that of the on-reservation Indians, suggesting that the more able young people leave the reservations. But most of them are unskilled, at least on their arrival in the city, and so tend to operate at the bottom of the labour market, vulnerable to fluctuations in employment levels. This predominance of casual unskilled work combined with the tendency to return periodically to the reservations or to move to another town leads one authority to categorize the majority as “an unstable lower working-class group which is marginal to the economy and social structure of the metropolis.”[30] When one combines this with the high incidence (and especially high visibility) of drunkenness, it is perhaps easy to see urban Indians as alienated, marginal individuals cut off from community and Indianness altogether.
Some recent work, however, suggests a more positive way of viewing this latest development of Indian cultures, while not minimizing or condoning the real deprivations involved. In suggesting the idea of a network of relations rather than a fixed community, Jeanne Guillemin in a study of Micmac Indians in Boston argues for a more positive view of mobility and marginality:
The network concept permits a definition of community that can put aside the usual concern with place and property and instead consider enduring patterns of culture spread over time and space. In societies like our own, minorities have been urbanized for generations, yet remain a people apart, without the establishment of conventional land-based communities . . . . An urban minority community, whether or not the label “tribal” is attached to it, is inevitably a network of relationships among the propertyless, among people for whom the city is a back-drop, a setting, and for whom survival often means maintaining a high rate of mobility beyond any initial migration to the city. The urbanization of minorities has failed to be the transformation of individual country bumpkins into alienated cosmopolitans: it has been typified instead by the development of a variety of social networks which have defensive characteristics as well as an internal social organization.
What distinguishes Indians from other minority groups similarly locked into the ‘secondary labour market’ is the relation to the reservation, and Guillemin argues that this network model allows us to see the two environments as an inter-related whole, with the reservations providing “a conglomeration of ‘home bases,’ that is, extended families which will host individual adults and children for longer or shorter periods of time.” One big advantage of this functionalist approach is that it avoids seeing Indians as either rural and ethnically pure or urban and deracinated, and stresses a continuity of culture and community based on kinship and tribal ties. As Guillemin says, “the seemingly random organization of urban households and social networks reveals itself as an efficient means of maximizing the participation of adults in a cash economy while providing for the care of children.” She compares it to the so-called matrifocal black family, which can also be seen as a logical and “efficient way for a group to divide its cultural labor, given the demands of economic marginality.”[31]
Does this sort of comparison mean that urban Indians and the smaller groups of rural Indians, particularly in the East and Midwest, constitute primarily what has come to be called a “culture of poverty”—which has so much in common with other disadvantaged groups that it ceases to be useful to call it Indian? It could be argued that what are seen as characteristically Indian social traits are products of their status in relation to white society rather than traditional norms or values:
American Indian composite households and family household cycles are not retentions of aboriginal customs, but are products of their meager and unstable incomes, lack of skills, and lack of control over resources . . . . Indian family households change from composite to nuclear to composite as their economic conditions change, making the Indian family similar to other families living in poverty in the Western world.[32]
An alternative way of putting it is to say that traditional Indian culture had usually involved minimal resources, and this state comes to be seen as poverty only in relation to the dominant white society. As Murray L. Wax expresses it, “Hardship becomes expressed as ‘poverty’ only when it is linked in a socio-economic system with those who are better off—the rich—thus establishing an asymmetric relationship.”[33] If it were just a question of different and separate life-styles, freely chosen, it would perhaps not matter that Indians had less formal education, sanitation, etc., but of course it is a fact that the endemic poverty of Indians is closely related to the systematic expropriation of their resources by white society. Crucially, too, this poverty has shown no signs of decreasing, at least in relative terms, in spite of large increases in BIA staffing and funding.
This situation has prompted more far-reaching analyses and hypotheses which examine the situation of the Indians in terms of colonialism. According to this sort of hypothesis, “the conditions of the ‘backward’ modern American Indians are not due to rural isolation nor a tenacious hold on aboriginal ways, but result from the way in which United States’ urban centers of finance, political influence and power have grown at the expense of rural areas.”[34] Joseph Jorgensen elsewhere uses the terms metropolis and satellite in preference to urban and rural to suggest that the centralizing and conglomerating tendencies of modern business and government are not just embodied in the city itself. The problems of the inner cities, where most urban Indians find themselves, are precisely the product of the concentration of power outside the inhabitants of those areas. This model is clearly applicable to the larger areas of colonialism,[35] and its importance is that it attacks at base the assumption that under-development is just a temporary stage on the way towards integration and acculturation. As Jorgensen says, “Indian development is the product of the full integration of U.S. Indians into the United States political economy—albeit as super-exploited victims of that society.”[36]
Just as in the nineteenth century land was taken away under the pressure of agricultural and industrial expansion, so today small farming, under pressure from the growth of “agribusiness,” has become uneconomic. With land often split up into small units because the complexities of the inheritance system cause it to be sub-divided among many descendants, Indians have found it difficult to succeed, but it would be wrong to think that most Indian land is undeveloped. White interests regularly take two-thirds of the total agricultural product of Indian lands, which they rent, while large industrial concerns have arranged very cheap mining leases. So Indian land and mineral resources are being exploited—but not for or by Indians.
The political position of the BIA is crucial here. It is responsible for the Indians, but answerable to the Secretary of the Interior, whose brief includes both Indian administration and the development of national resources. The creation of the post of Assistant Secretary for Indian Affairs in 1977 was meant to assure the BIA of a strong voice in the Department of the Interior, but pressure from big energy corporations still prevails. Nevertheless, with a budget of over one thousand million dollars in 1981, and extensive legal and economic control over the people it administers, the BIA has a great deal of power. From the Indian standpoint this power can seem tyrannical. BIA officials have ultimate control of almost all tribal operations and undertakings, although since the Economic Opportunity Act of 1964 there has been an increase in ventures initiated and run by Indians themselves, where the government’s role is limited to financial support. The BIA decides who qualifies as an Indian eligible for services and whether he is responsible or capable enough to control his own assets or dispose of them in his will. The combination of a stifling bureaucratic paternalism and erratic changes in policy has created a situation where, it has been said, Indians are being implicitly taught three lessons: “Self-realization is frustrated . . . . Dependency is a virtue . . . . Alienation is rewarded.”[37]
If this were the whole picture, and if little could be hoped for from the very organizations set up to help Indians, the outlook would be gloomy indeed. But what this view ignores, in its emphasis on white actions and views, are the changes constantly taking place within the Indian communities themselves, and in particular the growing awareness of the situation in political terms on the part of at least some young Indians. For Indians themselves are now increasingly taking a hand in determining their future, in defiance of big business and government alike.
4. Pan-Indian Movements
Implicit in the idea of self-determination is the recognition of what Indians have known and accepted all along—that there are many ways of being Indian. The older broad distinction between “traditional” and “progressive” which correlated roughly with degrees of Indian blood has become less useful as relations between different groups have become more complex. In addition tribal identity, which has always been paramount for traditionalists, has been supplemented by a broader conception of Indian unity. The newer generation of Indians, and particularly urban Indians, has had to develop new ways of being Indian, and if a ‘new Indian’ seems a contradiction in terms it is useful to consider why.
Attempts early in the century to develop Indian organizations tended to be dominated either by white philanthropists or by Indians who had become to a large degree ‘civilized.’ As a result these Indians were in the position of having no-one really to represent but themselves. What they had in common was their loss of tribal identity and their achievement of a questionable role and status.[38] This is not to say that Pan-Indian developments were not taking place in other spheres. In earlier times Indians made political alliances, some of them large-scale and enduring like the great Iroquois confederacy of the Northeast or the Creek confederacy of the Southeast, without surrendering their tribal or cultural identities. There were also religious movements which spread through wide areas, such as that stemming from the nameless Delaware prophet in the eighteenth century, which was partly responsible for the uprisings under Pontiac, or the Handsome Lake religion still practised among the Iroquois today. The nineteenth century saw the development of a generalized Plains culture, initially because of the increased use of the horse by the many tribes attracted on to the Plains by the relative affluence offered, later because of the need to combine to fight white encroachment. It was really not until Indians were confined to reservations, though, that the new conditions forced upon them produced common responses, both religious and political.
i. Religious Movements
The earliest of these was the Ghost Dance, a millennialist religious movement that spread quickly through Western tribes at the end of the nineteenth century. The main element was the belief that “the time will come when the whole Indian race, living and dead, will be reunited upon a regenerated earth to live a life of aboriginal happiness, forever free from death, disease and misery.”[39] By taking part in the dance one could hasten this state—from which, of course, whites would be excluded. Scholars have debated how far the movement was a novel and direct response to deprivation, and how far it was a continuation of an older religion which was now adapted to nativistic and revivalistic ends as conditions under acculturation became intolerable. At the time white authorities saw it as potentially dangerous, and, interpreting it as a threatened insurrection, the Seventh Cavalry over-reacted in the infamous exercise of force which led to the massacre at Wounded Knee in 1890. Later commentators have agreed that many manifestations of the movement offered less threat to the whites than was feared. The Ghost Dance religion offered a substitute for the ritual and spiritual structures that were breaking down under the new conditions and, more importantly, it gave a key role to the power of visions. Traditionally Plains tribes placed great weight on the significance of individual visions, sought or induced by physical privation.[40] Though their importance was mainly to the individual, their truth was accepted by the society as a whole.
The Ghost Dance did involve a specific nationalist element, which was what led the authorities to suppress it. What succeeded it was a movement equally adapted to fill the gaps left by cultural breakdowns, but representing a withdrawal from the actual political world rather than a messianic confrontation with it. The Peyote Cult, later the Native American Church, managed to incorporate both the traditional individual search for a vision and the recognized need for community and solidarity among Indians—a sharing of spiritual power. The solidarity, though, was spiritual only—an end in itself: “Dreaming both symbolized withdrawal from the world of white men, and was its realization. Peyote was the agency through which such introversion could be manifested.”[41] Again, there are disputes as to the provenance of the different elements, but it seems accepted that it was from the Kiowa, Comanche and Wichita reservations that the Peyote Cult spread, until it became, as it remains, a major element in many Indian cultures and one of the most important non-tribal ways of being Indian.
The worshippers meet at night, around a crescent-shaped mound of earth, eat four or more peyote buttons, and then, while singing, pass around a special drum, carved staff and rattle. The peyote buttons are the tops of a cactus, Lophophora williamsii, which grows mainly in northern Mexico and southern Texas. The complex and often unpredictable effects of peyote, ranging from intense exhilaration and visions to nausea, as well as the fact that the meetings were held at night, led to suspicion from whites and other Indians. Early in this century there were strenuous attempts to ban the use of the drug and the ceremony itself, and partly as a defence against these efforts, practitioners of the religion formed themselves into the Native American Church in 1918. Centred in Oklahoma, it gradually developed its intertribal nature. In defending itself as a respectable Church, it described the use of peyote as a sacrament, and certainly the Christian element was not just propaganda aimed at securing public acceptance. The belief in a ‘Heavenly Father’ could be accommodated with the Plains belief in a generalized Supreme Being, but peyotists also believed that peyote contained part of the Holy Spirit, and its use was granted to Indians as the equivalent of the white ‘use’ of Jesus Christ. The ingestion of the peyote button is, like the taking of the consecrated bread and wine of Christianity, a way of acquiring the Holy Spirit within oneself. As one of the developers of the Christian element explained, “The Peyote is a part of God’s body, and God’s Holy Ghost is enveloped in it.”[42] This was not enough to appease the Christians of the BIA, though, who charged the cult with orgies and condemned peyote as a narcotic, although it is not in fact habit-forming. Eminent anthropologists weighed in for the Indians (probably the first time anthropologists were used as expert witnesses in support of Indians), but it was not until the 1930s and the advent of a sympathetic Commissioner of Indian Affairs in Collier that the Church was protected from persecution, leading to yet another of those anomalies which surround Indians—that peyote is prohibited as a dangerous drug except to adherents of the Native American Church.
While the forms it takes and the reasons for its emergence differ from area to area, peyotism clearly offers a communal experience which is easier to sustain in changing conditions than the elaborate traditional structures. With its basic ingredients of a ban on alcohol, an emphasis on moral and peaceful behaviour and on Indianness, it is seen as more relevant to Indians whose earlier sense of community has been threatened. For instance, David Aberle, who studied Navajo peyotism, saw a direct link between the rise of peyotism and the seemingly arbitrary policy of slaughtering sheep to conserve grazing which the federal government forced on the tribe in the 1930s. He describes peyotism as “a mode of expressing rejection of the traditional system and of the American system, a mode of coping with feelings of helplessness, and a way of engendering a total reorientation which assists in adjusting to wage-work and cash-cropping.”[43] This move away from the tribal past is by no means an abandonment of all Indian community, but it does represent a retreat from any engagement with white society.
ii. Political Movements
It was only after the New Deal policies, and perhaps as a product of the same sort of thinking that produced them, that the National Congress of American Indians (NCAI)—an organization which is at the same time national, political, and yet strongly tribal—came into being. Many of those who founded it in 1944 were important tribal leaders, and the Congress has remained, certainly until the 1960s, the most powerful and representative national body. Many factors, of course, were influencing Indians’ sense of their identity. The 25,000 Indian servicemen involved in World War II returned with a new sense of the world outside the reservation, and while for many it was a negative and bewildering experience, they were at least made aware of their situation. The shock of the threat posed by termination policies also forced a more urgent approach and led in 1961 to a “Declaration of Indian Purpose” aimed at achieving a national consensus on priorities and goals that could be worked for politically. The actual Declaration asked for self-determination (“the inherent right of self-government and sovereignty”), the protection of existing lands (“each remaining acre is a promise that we will still be here tomorrow”), and continued federal aid. Behind the agreement reached over this statement, though, were substantial disagreements over tactics, reflected in the founding, in the same year, of the National Indian Youth Council (NIYC) by radical college-educated young Indians. Like an increasing number of Native American college students, unwilling to let white education “de-Indianize” them, they were dissatisfied with what they saw as the unrepresentative and ‘establishment’ nature of the NCAI, since it rejected activism of the sort being developed by civil rights workers, and concentrated on acting as a Washington pressure group. Many of the NCAI’s major figures were wealthy and successful “progressive” tribal leaders, often very sympathetic towards white business and government procedures and therefore, in the eyes of the young radicals, unrepresentative of the majority of Indians.
What was produced through NIYC, and later through the more controversial American Indian Movement (AIM) formed in 1968, was a concept of tribal nationalism which had a national network of co-operation and information, giving rise to a new situation, where “radical” and “traditional” Indians found more in common with each other than with the “progressives” of the NCAI. Differences arose partly over ultimate goals and partly over methods of exerting political pressure. Building on the model of black civil rights, but often exhibiting a flair and wit less evident in black demonstrations, Indian radicals have used tactics of non-violent confrontation and passive resistance with an eye to the symbolic and public nature of their actions, both to raise Indian awareness of what could be done and to publicize their case to a white public.
The original occupation of Alcatraz exemplifies these tactics. In 1969 a small group of Indians occupied the former prison site and claimed “ownership by right of discovery.” Continuing the parody of white colonialism, they offered to buy it from the government for $24, to be paid in glass beads, and promised to “give to the inhabitants of this island a portion of that land for their own, to be held in trust by the American Indian government—for as long as the sun shall rise and the rivers go down to the sea—to be administered by the Bureau of Caucasian Affairs.” With more contemporary relevance they explained their action by pointing out that Alcatraz already had “all the necessary features of a reservation: dangerously uninhabitable buildings; no fresh water; inadequate sanitation; and the certainty of total unemployment.”[44] The tone of the whole action was in keeping with Vine Deloria’s witty and scathing attack on white attitudes and action in his Custer Died for Your Sins, published in the same year.
The “Trail of Broken Treaties” organized by the American Indian Movement in 1972 was an angrier and more explosive event, though originally planned as a peaceful protest march involving groups from all over the country which would converge on Washington just before the presidential elections. Angered at inadequate accommodation and arrangements, the protesters occupied the BIA headquarters. Amid threats of forced eviction, protracted negotiations produced an undertaking from the authorities to consider at least parts of the twenty-point proposal drawn up by AIM. The proposal included demands for the restoration to Indian tribes of the status of sovereign nations, which would mean that their relation with the federal government would be by mutually agreed treaty. Following from this came proposals for the abolition of the BIA, the restoration of a substantial land base and the repeal of laws maintaining state or federal jurisdiction over Indians.[45] Partly to secure the building against the police, partly out of frustration and anger, the occupiers caused considerable damage to the building and its fittings by the time they left, but in spite of considerable press coverage of the violent aspects of the event, and condemnation from some other Indian groups, the occupation and its leaders found a considerable degree of support. Some of the AIM leaders like Dennis Banks, Russell Means, and Clyde and Vernon Bellecourt were becoming well-known figures—and attracting the attention of law-enforcement agencies in the process.
However useful this sort of national event and its publicity was, it was a long way from the specific and local concerns of most Indians, as the young activists were well aware. Since 1964 at least they had been developing tactics in concrete confrontations over local issues. In 1964 in the state of Washington the first substantial expression of civil disobedience occurred. It was significant because, although it related specifically to the fishing rights of small local tribes, the operation was aided and given national publicity by activists from outside. A series of illegal “fish-ins” was held to protest against the removal of fishing rights from the Nisqually, Puyallup, Quinault and other tribes. These rights had been originally gained in exchange for land ceded in 1854, but the state Supreme Court, in denying these rights, argued that modern methods of fishing had not then been envisaged and that the survival of the fish supply was at stake. The irony of the white authorities preaching ecology to Indians was further compounded by the fact that fish were being caught at other points on the river by much larger commercial concerns. Here as elsewhere, Indians had long struggled to preserve their own subsistence fishing against the interests of commercial fisheries and of sports fishermen supported by state officials.
A pattern developed of highly publicized and occasionally violent confrontations—whether over fishing, or over selling or exploitation of land without local consent—which led to protracted court battles. In the past, Indian groups, often without adequate counsel, had fared badly in this situation, but a series of decisions has more recently been going their way, even if very slowly. In 1975 the fishing rights of the Washington tribes were finally upheld, for instance, and a number of tribes have either had land returned or gained substantial compensation. In 1970 the Taos Pueblo regained Blue Lake with its surroundings, and in 1975 the Havasupai secured 185,000 acres from the Grand Canyon National Park. Other cases have ended favourably for the Indians, like the Passamaquoddy claims in Maine. The Sioux claims to ancestral land in the Black Hills of Dakota have produced offers of financial compensation, but many Sioux insist on the religious and cultural significance of the land, which cannot be replaced by money. Even when state or federal authorities have eventually yielded to demands, local hostility to Indian actions has often been intense, and the reactions of the authorities have sometimes escalated the difficulties. The most protracted and violent event, and the one with greatest symbolic overtones, was the occupation for seventy-one days in 1973 of the village of Wounded Knee on Pine Ridge reservation in South Dakota.
Pine Ridge, one of the biggest and poorest reservations, home of the Oglala Sioux, was an ideal place to highlight many of the most persistent problems. In spite of owning much of the land of South Dakota the Indians in that state had a per capita income of less than half that of the whites. Half the total Indian population was in fact below the official poverty line, and figures for health and education were comparably bad. Added to this was a political situation which Robert Thomas, in his 1964 anthropological study of Pine Ridge, described as “powerless politics.” He noted that cultural continuity on the reservation was maintained in the traditional religious groups which had, against all the odds, survived. These groups, though, were not represented in the tribal government, since it had been set up in a ‘democratic’ and white-oriented pattern by Collier and was not therefore acceptable to traditionalists. At Pine Ridge in particular the conflict between the tribal council under the leadership of Richard Wilson and the militants (comprising both traditionalists and young activists) had led to considerable violence. Wilson was accused of using the tribal police force to suppress and intimidate opposition to the nepotism and corruption involved in the distribution of BIA funds, tribal assets, and jobs. As one inhabitant remarked, “Jobs are so scarce that a janitorial job becomes a political appointment.”[46] Certainly the large amounts being spent by the government did not seem to be improving the lot of the most deprived. According to Edgar Cahn, “At the Pine Ridge Reservation in South Dakota, the second largest in the nation, $8,040 a year is spent per family to help the Oglala Sioux Indians out of poverty. Yet median income among these Indians is $1,910 per family.”[47] In addition, the one resource the Indians did have, the land, was often leased to white farmers or business interests at low rates, meaning that any profits to be made rarely accrued to the Indians.
This unhappy and long-persisting situation came to a head when floods brought more extreme hardship and tribal authorities overreacted to militants who tried to highlight the problems on the reservation. The occupation was accompanied by demands for investigation of the BIA and its support for Wilson, and for more direct recognition of the traditional Oglala Sioux, who constituted most of the population of Wounded Knee and who as traditionalists wished to see a suspension of the tribal constitution so that they could govern themselves. AIM’s willingness to risk violence was partly a reaction to the constantly high level of violence on Pine Ridge,[48] and partly a consistent development of AIM’s tactics. About 300 Indians were eventually involved in the armed occupation. Two were killed, and two others were wounded, as was one federal marshal. In the end thirty Indians were charged with serious offences. The event left its marks on the reservation in terms of unresolved enmities, adding to an already embittered atmosphere, and it also helped to define for the more revolutionary elements in AIM the terms of its rhetoric. Figures like Dennis Banks and Russell Means gained a great deal of publicity through their trials and their charges of police misconduct, but the almost constant charges and trials have tied up the energies of many leading members of AIM.
Clearly the activities of AIM have much in common with the pattern of black activism—even including the movement from peaceful protest, aimed at the conscience of liberals and appealing to the principles of justice, to armed confrontation intended to demonstrate and challenge the repressiveness of the white establishment. In 1974 Akwesasne Notes, the most important Indian newspaper to support AIM, published an article by Stokely Carmichael setting both Indian and black movements in a context of liberation struggles in the third world.[49] Throughout the seventies this sort of comparison reappeared, with extensive features in the same newspaper on the destruction of native peoples and cultures in South America and Australia, and even an article in support of Iran’s revolution against American imperialism. In addition, Indian groups conscious of their energy resources, and of the political power these may confer, have approached OPEC for help and advice, an act denounced as unpatriotic by some white politicians.
It would be wrong, though, to interpret these actions as representing any growth of solidarity with other minority groups amongst Indians in general. Indians continue to stress their uniqueness, and mistrust analyses which stress the economic similarities of groups rather than traditional cultural differences. Identification with other groups is most likely to occur in future among urban Indians, who do not benefit from special status. The cuts in urban and other special aid programmes by the Reagan administration will hit urban Indians and groups like the Chicanos (i.e., Mexican Americans) equally, and it is perhaps here that common ground will be found in future.
5. The Cult and Culture of Native Americans
White response to the growth of Indian political activism has often been curiously sympathetic in recent years. While whites directly involved in land or fishing disputes, or living in towns near reservations, are often hostile, the general public takes a different view. Even the Pine Ridge occupation in 1973 did not produce the public hostility which might have been expected—and which would certainly have followed similar action by black militants. According to the New York Times, “51% of those questioned supported the Independent Oglala Nation at Wounded Knee.” The breakdown of this figure is revealing. “Most sympathetic to the cause are persons in the East, those who live in the suburbs, young people under thirty, the college-educated, blacks, people with incomes of over $15,000, union members, independent voters, and Catholics.”[50] The overall pattern whereby sympathy increases in proportion to distance from the actual Indians has been well established since the nineteenth century, but the sympathy of young people in the poll reveals a new element, namely the taking up of the Indian by what came to be called the counterculture. In part this was the radical chic which espoused and lumped together all anti-establishment activities as liberatory, but in addition it represented the use of the Indian as a model of the primitive—again!
This particular version of the noble savage had all the basic ingredients—simple but profound religious sentiments, a life-style uncluttered by modern technology, and a corresponding closeness to nature. The idea of ecology enabled white society to see that the interrelatedness of all aspects of existence found in traditional Indian thought had direct political and economic implications which modern society, with its piecemeal exploitation of natural resources, had ignored at its peril. In its concern for spiritual rather than material values and its apparently different approach to what constituted objective reality, Indian traditional society seemed to offer a critique of modern America which was taken up with enthusiasm by individuals and movements concerned to expand their consciousness outside the limits imposed by Western rationalism. The idea of the tribe involves that of consensus and community sustained and held together by ritual and oral performance rather than by the printed word, and Marshall McLuhan’s view of the electronics revolution as “re-tribalizing” society was seen as forecasting the return of the integrated and total vision that had been lost under rationalist and print-oriented Western society.
The attention paid to visions, dreams and other ‘irrational’ states of consciousness in Indian religions was attractive for the same reason. Carlos Castaneda’s best-selling accounts of his encounters with a Yaqui Indian sorcerer, beginning with The Teachings of Don Juan, published in 1968, and continuing in five subsequent volumes, present a highly intellectualized American graduate student in anthropology, anxious for the sort of knowledge that can be recorded, verified and assessed, being confronted by experiences of knowledge and power outside the terms he has at his disposal. By a combination of mental disciplines and skills taught him by Don Juan and the controlled use of psychotropic plants, he is able to experience a “separate reality” where physical laws seem to be inoperative, where a coyote talks and men fly. The books record Castaneda’s attempts to find a way of making room for both realms of experience. In their concern to give full weight to the validity of the primitive view, and in Castaneda’s own full participation in the experiences themselves, the books are a departure from the detached, even condescending tone of earlier anthropology. In the development of the characters of Don Juan and Castaneda, and in the structuring of the cycle of books, Castaneda draws on the techniques of fiction more than of anthropology. This, together with some internal inconsistencies, has raised serious doubts about the veracity of the whole story,[51] and these doubts have been increased by the evasiveness of Castaneda himself. True or not, he clearly said what a generation of students wanted to hear—that answers and alternatives to the impasse of logic and rationality reached in their own repressively rational society could be found within primitive cultures which had previously been despised.
At one level this general concern led to a commercial exploitation ranging from ‘Indian’ styles of dress to the publication and republication of a wide variety of materials by and about Indians. While some of this was spurious[52] or grossly sentimental, some was important ethnography at last given a wide audience. Whether as initiators or followers of this trend, many modern American writers, and particularly poets, were certainly profoundly influenced by American Indian material. The use of Indian material was, of course, not new. Earlier in the century Hart Crane had used Pocahontas as a way of repossessing the Indian origins of America, and twentieth-century novelists such as Hemingway, John Barth, Thomas Berger, William Eastlake and Ken Kesey have used the Indian to represent values, both positive and negative.[53] The recent poets are distinctive, though, in the degree to which they are influenced by Indian ideas and materials rather than just using them within a system of stereotypes. Perhaps the foremost example is Gary Snyder who, after writing a master’s thesis on Indian myths, has taken into his poetry both specific myths and a more general sense of ecological responsibility drawn from Indian materials. Using Asian material too, he forecasts, and attempts to embody in his writing, the breaking up of industrial society and the re-institution of values associated with primitive or archaic cultures.[54] Ed Dorn has also incorporated Indian materials into his work, perhaps with less sentimentality, ranging from an early account of his encounter with modern Indian life in The Shoshoneans to the terse poems of Recollections of Gran Apacheria.[55]
It is not, though, just a matter of using Indian subject-matter, translated into English and into pre-existent poetic forms. One of the most interesting and controversial developments has been the work done on ethnopoetics, where anthropology and poetry meet in the problematic area of translation. In trying to find appropriate forms to convey the full meaning of Indian materials, poets like Jerome Rothenberg and Dennis Tedlock have had to stretch our expectations of what poetry can be. In particular, in stressing the oral and performative aspects of Indian songs, chants and ceremonies, they have contributed to, and perhaps been aided by, a growing interest in performance in American poetry. What they have produced may sometimes run the risk of being artificial, abstracted and unacceptable to both cultures, but at its best it could perhaps reforge the links between ourselves and earlier, more varied forms of poetry. The concern to make ethnographical material relevant and accessible by ‘total translation’ is not just a taste for the exotic or antiquarian. In many cases it differs from earlier ‘salvage’ anthropology in being linked directly with a concern for the living traditions of Indian groups. The aim is to make available to Indians themselves, who may have lost the continuity of an oral tradition, the written records of that tradition.[56] If there is an irony about re-constituting an oral tradition from written texts, it is only one of many in which modern Indians live, and these ironies are equally evident in the state, even the definition, of Indian arts.
For the traditional arts and crafts of Indians there is a steady and well-defined commercial market. Navajo blankets and silver and turquoise jewellery, Pueblo pottery, and wood-carving from the Northwest are recognized and accepted as authentic Indian work. This authenticity seems to be crucial: in some sense one is buying an authentic Indian ‘essence.’ But what about Indian artists and craftsmen working in different styles? What room is there for innovation outside the range of choices available within the traditional forms? The question is a complex one, since it relates once again to the question of how much change can occur before a culture ceases to have recognizable continuity. It is made more difficult in many cases by the fact that ‘Indianness’ as recognized by whites is forced upon the Indians, since whites are the main buyers. This operates particularly at the lower end of the market. Whereas some Indian artists have achieved recognition by distinctive and individual adaptations of Indian themes and styles, at the mass-market level only the clichés of Indianness are required or produced. Through the important and developing area of tourism this can have serious repercussions on Indian communities. For some, tourism provides a substantial part of their income through the manufacture of souvenirs or the performance of dances, and the effect can be to project a stereotyped and ultimately degrading image of themselves. They have been put in the position of having to act out an ahistorical role as reassurance that nothing has changed. In the words of an earlier Indian intellectual, Arthur C. Parker, who himself had to act out this role, they “have to play Indian to be Indian.”[57] As a source of income it is a paradigm of the worst sort of Indian-white relationship: Indians receive money as long as they conform to reassuring white stereotypes of the primitive.
There must be room for both traditional forms and innovation in Indian societies and in their art, and Indian writers present a good example of the possibilities of innovation. Since most Indian languages are not regularly written down, and accurate and accessible conventions for doing so have not been developed, Indian writing is mostly in English, which in itself raises the question of whether fiction written by Indians has more in common with white traditions than with Indian ones. Clearly, in using the novel form, writers like N. Scott Momaday, James Welch and D’Arcy McNickle have taken over assumptions about character and plot together with the form, but in varying ways Indian writers have increasingly tried to find forms to express their particular vision. Leslie Silko, for instance, has tried to blend ritual and fiction, not only in the subject-matter but in the form of her book. In Ceremony the sickness and despair of the hero can finally be overcome only by ritual, but a ritual which has changed in response to the times, and the novel itself attempts to create this new form by using elements of traditional ritual interspersed with the modern narrative. Momaday’s House Made of Dawn is an ambitious attempt to set the cultural and psychological displacement of a young Indian in some historical and social context, and Momaday uses dislocations of narrative and time-sequence in an attempt to express the complex set of ‘times’ in which modern Indians live. James Welch’s accounts of young Indians’ lives are sharply observed, and he avoids falling into the Lawrentian rhetoric to which Momaday is prone when trying to represent a world-view different from the normal white one.
This is partly a stylistic problem inherited from a white tradition of fiction, in which attempts to represent a simple but noble and timeless world-view have been bedevilled by the use of a self-consciously simple and noble style. The problem has been refuelled recently by the renewed popularity of the set-piece speeches of Indians of the past, which in most cases are not so much translations as versions written by whites in the style of eloquent simplicity they felt appropriate to noble savages. Vine Deloria has complained about the spate of books concentrating on the glories of the past and the flood of anthologies of oratory such as Chief Joseph’s surrender speech. To judge from books and films, when the silent Redskin does speak, whites have insisted that he speak in ways they have predetermined. The stereotype of the silent Indian was cleverly exploited in Ken Kesey’s One Flew Over the Cuckoo’s Nest, and now, with literature and polemic being produced in English by Indians and with an increase in collaborative translations, perhaps the supposedly silent and stoic Indian can be heard in his own voice.[58]
That voice may not have the ‘Indianness’ considered requisite by whites, but it does offer to Indians the possibility of self-determination in its fullest sense: Indians in future may define themselves to themselves rather than be defined culturally by stereotypes, and economically and politically by paternalistic administrations. Whereas whites have tended to define Indians by specific attributes and behaviour, research into groups of Indians at apparently very different degrees of acculturation has shown that the sense of being Indian, and of identifying with a continuous culture, can be just as strong amongst those with fewer of the traits that whites identify as Indian.[59] Identity and group-identity may thus be formed from within rather than identified from outside. The various ways of being Indian should not have to include living up—or down—to white stereotypes, and they must rest on Indians’ ability to maintain or develop independently their economic, social and cultural resources.
6. Epilogue: Tribalism and the Future
Indians are unique in being the only group specifically identified in the Constitution, and this has meant that they have been regarded not as a racial or ethnic group like others, but as a distinct political entity or series of entities. Each tribe has historically had a specific relationship to the federal government, and all efforts to obliterate that relationship have been resisted. Now that unilateral termination seems to have been discontinued as a possible course of action, the most urgent task is to define how Indian communities, whether on or off reservations, can best overcome the effects of years of maladministration from outside. “The contemporary problem,” says Deloria, “is one of defining the meaning of tribe. Is it a traditionally-organized band of Indians following customs with medicine-men and chiefs dominating the policies of the tribe, or is it a modern corporate structure attempting to compromise at least in part with modern white culture?”[60] The definition of tribe has also been a relevant issue in recent legal disputes. The argument in the case of the Mashpee Wampanoag Indians’ claim for land on Cape Cod in 1977-78 revolved around whether they were a tribe at key points in their history. They were required to produce a chief and medicine man, as well as expert witnesses about their past.[61] Clearly some fuller definition than this must be found, especially for smaller or less traditional groups, a definition which offers the possibility of a new relationship with white society and government.
Underlying most of the arguments about tribal identity is the fundamental claim to sovereignty, and it is the continued sovereignty of Indian tribes which has become linked with the honouring of the particular terms of treaties made in the past. In their important analysis of the status of Indian tribes Barsh and Henderson point to the dangers implicit in a 1978 Supreme Court opinion that “Indian tribes still possess those aspects of sovereignty not withdrawn by treaty or statute, or by implication as a necessary result of their dependent status.”[62] The last phrase could justify almost anything. In particular it could be construed as making the right to independence consequent upon the present possession of it, thus justifying the withholding of sovereignty from those who have been made dependent by whatever means. Dependence, too, is now often seen in terms of incompetence. Writing in 1980 in the preface to a glossy and uncritical BIA publication outlining its activities, the Commissioner of Indian Affairs stressed self-determination but insisted that “the harsh reality is that you can only be self-determining if you have the abilities necessary to manage your resources.”[63] Meanwhile, it is implied, the BIA must manage them on the Indians’ behalf. Perhaps the best answer to this was given by one of the most eminent jurists to be associated with Indian law, Felix Cohen. Talking specifically about the development of Indian self-determination, he said, “By self-government I mean that form of government in which decisions are made not by people who are wisest, or ablest, or closest to some throne in Washington or in Heaven, but rather by the people who are most directly affected by the decisions.”[64]
Indians are beginning to play a greater role in initiating and administering schemes in which the role of the government is limited to funding rather than total administration. This change has been described as a move from the “administered community” to the “sustained enclave.” In the first system, since “the source of decision-making lies outside the community for whom the decisions are made . . . destructive stress is built into the system.” Under the second system communities would still be economically sustained, since most reservations are not economically self-supporting, but would be allowed to control for themselves the running of their community.[65] Under present schemes this is happening to a limited extent. Increasingly, community schools incorporate traditional skills and languages into the normal American curriculum, hospitals utilize Indian medicine-men as well as white ones, and factories are run by the local community. The more complex and technical the undertaking, of course, the less easy it is for a community without expertise to retain real control, and this has led to some re-examination of the sort of technological development really required. An article in Akwesasne Notes in 1978 headed “Regaining Control Of Our Lives” argued that “Appropriate technology is ‘appropriate’ to Native people only if it returns to them control over their lives. What Native people need to develop are technologies appropriate to the exercise of sovereignty.”[66] Recent refusals to allow mineral resources on Indian land to be exploited, in spite of the promise of financial advantages, reflect this thinking.
The prevailing hostility to public spending on welfare—a hostility reaching a new pitch under President Reagan—has already put Indian communities in cities and on reservations on the defensive. While recent BIA statements have explicitly excluded the possibility of a new policy of termination, the federal government’s sights remain firmly fixed on the objective of ending the Indians’ economic dependency on the public purse. In addition, there are pressures at the state level to deprive Indians of traditional rights. Perhaps the strongest defence Indians can make is that the issues their status raises are not particular to them. After all, “the ideals of local self-government and political diversity” cherished by the Indians are of the utmost importance in all modern societies where local power and individual competence are increasingly being undermined by bureaucratic paternalism and centralized powers.[67]
7. Guide to Further Reading
An important work of general reference is Francis Paul Prucha’s A Bibliographical Guide to the History of Indian-White Relations in the United States (Chicago and London: Chicago UP, 1977), though it is not, of course, confined to the twentieth century. On the ethnographic side a basic guide to materials is George P. Murdock and Timothy J. O’Leary, Ethnographic Bibliography of North America (New Haven, Conn.: Human Relations Area Files Press, 1975), although here again much of the material relates to Indian cultures of the past. Russell Thornton and Mary K. Grasmick, Sociology of American Indians: A Critical Bibliography (Bloomington: Indiana UP, 1981) provides an up-to-date listing of the sociological material.
There are surprisingly few full-length general studies of modern Indians. Perhaps the most useful place to start is Murray L. Wax’s Indian Americans: Unity and Diversity (1977),33* which includes statistical appendices. John A. Price, Native Studies: American and Canadian Indians (1978),5* is curiously organized but full of useful information, an attempt to gather together the materials for Native Studies as opposed to Indian-White relations as part of American history. William A. Brophy and S.D. Aberle, The Indian: America’s Unfinished Business (Norman: Oklahoma UP, 1972), was a provocative report of the Commission on the Rights, Liberties and Responsibilities of the American Indian, reassessing the government’s role.
There are a number of good collections of essays. Howard M. Bahr, ed., Native Americans Today (1971),6* and Stuart Levine and Nancy O. Lurie, eds., The American Indian Today (Baltimore: Penguin, 1972), between them cover many diverse aspects of modern Indian life, and in 1957 the Annals of the American Academy of Political and Social Science (Philadelphia), vol. 311, devoted a special issue to “American Indians and American Life,” edited by George E. Simpson and J. Milton Yinger. When it appears, Volume 2 of the Handbook of North American Indians (Washington, D.C.: Smithsonian Institution, forthcoming) will be devoted to contemporary Indians and is bound to be an indispensable volume.
Most general studies of Indians have a—sometimes rather perfunctory—final section on modern Indians, for instance, Alvin M. Josephy, The Indian Heritage of America (New York: Knopf, 1971); Angie Debo, A History of the Indians of the United States (Norman: Oklahoma UP, 1971); and Edward H. Spicer, A Short History of the Indians of the United States (New York: Van Nostrand Reinhold, 1969), which also includes a good collection of documents, some relating to the modern period. Roger L. Nichols and George R. Adams have edited a good collection of essays on The American Indian: Past and Present (Lexington, Mass.: Xerox College Publishing, 1971). D’Arcy McNickle’s rather misleadingly titled Native American Tribalism: Indian Survivals and Renewals (New York: Oxford UP, 1973) combines a brief history of the Indians with some excellent discussion of modern Indians.
Treatments of specific aspects of modern Indians and their relation to white society are proliferating. The role of the Bureau of Indian Affairs is dealt with critically in Sar A. Levitan and B. Hetrick, Big Brother’s Indian Programs With Reservations (New York: McGraw-Hill, 1971), journalistically in Edgar S. Cahn, ed., Our Brother’s Keeper (1969),37* and with full statistical detail in Alan L. Sorkin’s Indians and Federal Aid (1971).26* The important, sometimes iniquitous, role of education and its relation to government policies is dealt with in Margaret Szasz, Education and the American Indian: The Road to Self-Determination, 1928-1973 (Albuquerque: New Mexico UP, 1974), and the major issue of Indian land is covered in Wilcomb E. Washburn’s Red Man’s Land, White Man’s Law: A Study of the Past and Present Status of the American Indian (New York: Scribner, 1971), and more polemically in Kirke Kickingbird and Karen Ducheneaux, 100 Million Acres (New York: Macmillan, 1973). Russel L. Barsh and James Y. Henderson’s The Road: Indian Tribes and Political Liberty (1980)15* traces expertly the legal changes in Indian status and presents an important new conceptualization of the relation between tribes and the federal government.
Urban Indians, having been long neglected, are now getting academic, if not official, attention. Alan L. Sorkin, The Urban American Indian (1978),29* is ponderous but detailed. Jack O. Waddell and O.M. Watson, eds., The American Indian in Urban Society (Boston: Little, Brown, 1971), is a useful collection of lively essays. One of the most interesting accounts of urban Indians is Jeanne Guillemin’s Urban Renegades: The Cultural Strategy of American Indians (1975).8* She deals with the experience of the small group of Micmac Indians in Boston, but her argument and analysis have a more general reference. This is true of some other fine studies of individual tribes or groups. Examples from a rich and varied field are James F. Downs, The Navajo (New York: Holt, Rinehart and Winston, 1972); Ethel Nurge, ed., The Modern Sioux: Social Systems and Reservation Culture (Lincoln: Nebraska UP, 1970); Malcolm McFee, Modern Blackfeet: Montanans on a Reservation (New York: Holt, Rinehart and Winston, 1972); Edmund Wilson, Apologies to the Iroquois (London: W.H. Allen, 1960); Karen I. Blu, The Lumbee Problem: The Making of an American Indian People (Cambridge: Cambridge UP, 1980); and Elizabeth S. Grobsmith, Lakota of the Rosebud: A Contemporary Ethnography (New York: Holt, Rinehart and Winston, 1981).
Accounts of more recent political developments tend to be impressionistic and partisan, like Stan Steiner, The New Indians (New York: Harper & Row, 1968); Bruce Johansen and Roberto Maestas, Wasi’chu: The Continuing Indian Wars (New York and London: Monthly Review Press, 1979); William Meyer’s brief Native Americans: The New Indian Resistance (1971),13* and Robert Burnette and John Koster, The Road to Wounded Knee (1974).14* Vine Deloria’s Custer Died For Your Sins: An Indian Manifesto (1970)4* and Behind the Trail of Broken Treaties: An Indian Declaration of Independence (1974)45* are hard-hitting, polemical accounts of a radical Indian’s position. The best place to trace Indian reactions to current events is through Indian newspapers. Akwesasne Notes includes material from many different tribes, often reprinting articles from smaller newspapers, and is a lively forum, largely for the more radical Indian groups. Some of the other important publications, like Navajo Times, are discussed in Price’s Native Studies (1978).5*
Indians’ own views, not just political, are further expressed in Shirley M. Witt and Stan Steiner, eds., The Way (New York: Knopf, 1972); Diane Niatum, ed., Carriers of the Dream Wheel: Contemporary Native American Poetry (New York: Harper & Row, 1975); Robert K. Dodge and Joseph B. McCullough, Voices From Wah-Kon-tah: Contemporary Poetry of Native Americans (New York: International Publishers, 1974); and John A. Milton, ed., The American Indian Speaks (Vermillion: South Dakota UP, 1969). Some important and imaginative explorations of what it is to be a modern Indian come in the form of fiction by Indians, for instance, N. Scott Momaday, House Made of Dawn (New York: Harper & Row, 1968); James Welch, Winter In the Blood (New York: Harper & Row, 1974); Leslie Silko, Ceremony (New York: New American Library, 1978); and D’Arcy McNickle, Wind From An Enemy Sky (San Francisco: Harper & Row, 1978).
(* see Notes for full reference)
8. Notes
- See, for instance, Dee Brown’s best-selling Bury My Heart At Wounded Knee: An Indian History of the American West (New York: Holt, Rinehart and Winston, 1971), which ends its passionate indictment of white treatment of Indians in the 1890s, with the Indians demoralized and apparently about to die out ignominiously after Wounded Knee. I have chosen here to retain the term ‘Indian’ rather than ‘Native American,’ which is sometimes preferred now. ‘Indian’ is a misnomer, but it is more widely understood and less clumsy than ‘Native American’—which is itself misleading, since it more usually refers to anyone born in America.
- See Henry F. Dobyns, “Estimating Aboriginal American Population: An Appraisal of Techniques with a New Hemispheric Estimate,” Current Anthropology, 7 (1966), 395-416; and Francis Jennings, The Invasion of America: Indians, Colonialism and the Cant of Conquest (Chapel Hill: North Carolina UP, 1975), pp. 15-31, for discussion of the political implications of earlier low estimates.
- Population figures are complicated by the problem of defining an Indian. Prior to 1960 figures were based on identification by census enumerators, on the evidence, sometimes, of mere appearance, but since then they have been based on self-identification, which has led to an increase in the numbers—itself perhaps reflecting a growing awareness of, and pride in, being Indian. Even so, there is evidence to suggest that the figures may still be a substantial undercount: see U.S. Commission on Civil Rights, To Know or Not to Know: Collection and Use of Racial and Ethnic Data in Federal Assistance Programs (1973), p. 31. To qualify for federal services designed for Indians, the terms of definition are stricter, involving evidence of membership of a tribe or proof of a substantial degree of Indian blood.
- Vine Deloria, Jr., Custer Died For Your Sins: An Indian Manifesto (New York: Avon, 1970), p. 10.
- The basic differences are that Canadian Indians were originally less violently deprived of their land, they maintained better trade relationships with whites for longer, and they now make up a much larger percentage of the population than their U.S. counterparts. Legally, though, their status has been potentially weaker than U.S. Indians’, since they have had fewer guaranteed rights, not being granted citizenship until 1960. Fuller comparisons are made in John A. Price, Native Studies: American and Canadian Indians (New York and Toronto: McGraw-Hill Ryerson, 1978).
- Cf. Lee H. Bowker, “Red and Black in Contemporary American History Texts: A Content Analysis,” reprinted in Howard M. Bahr, et al., eds., Native Americans Today: Sociological Perspectives (New York: Harper & Row, 1971), and Virgil Vogel, “The Indian in American History Textbooks,” Integrated Education, 6 (1968), 16-32.
- The Cherokees, Choctaws, Chickasaws, Creeks and Seminoles.
- Jeanne Guillemin, Urban Renegades: The Cultural Strategy of American Indians (New York: Columbia UP, 1975), p.67.
- See Robert Berkhofer, The White Man’s Indian: Images of the American Indian from Columbus to the Present (New York: Knopf, 1978), pp. 23-31.
- See John Cawelti, The Six-Gun Mystique (Bowling Green, O.: Bowling Green UP, 1970), pp. 40, 53. The same is true of many white photographs of Indians; see M. Gidley, American Photography, pamphlet in this series.
- Price, p. 206.
- John Harrington, “Understanding Hollywood’s Indian Rhetoric,” Canadian Review of American Studies, 8 (1977), 77-88. See also G.M. Bataille and C.L.P. Silet, eds., The Pretend Indians: Images of Native Americans in the Movies (Ames: Iowa State UP, 1980).
- E.g., William Meyer, Native Americans: The New Indian Resistance (New York: International Publishers, 1971), p. 75.
- Quoted in Robert Burnette and John Koster, The Road to Wounded Knee (New York: Bantam, 1974), p. 125.
- Russel L. Barsh and James Y. Henderson, The Road: Indian Tribes and Political Liberty (Berkeley and Los Angeles: California UP, 1980), pp. 53, 92-93.
- Quoted in Virgil J. Vogel, This Country Was Ours: A Documentary History of the American Indian (New York: Harper & Row, 1972), p. 193.
- D.S. Otis, in Francis P. Prucha, ed., The Dawes Act and the Allotment of Indian Lands (Norman: Oklahoma UP, 1973), p. 17.
- The ceremony is quoted in full in Vine Deloria, ed., Of Utmost Good Faith (San Francisco: Straight Arrow, 1971), p. 142.
- C.H. Matthews gives a useful overview in “The Revolt Against Americanism: Cultural Pluralism and Cultural Relativism as an Ideology of Liberation,” Canadian Review of American Studies, 1 (1970), 4-31.
- D.H. Lawrence, Phoenix (1936; reprint ed., London: Heinemann, 1961), pp. 144-45.
- John Collier, From Every Zenith (Denver: Sage Books, 1963), p. 126.
- Ruth F. Benedict, Patterns of Culture (Boston and New York: Houghton Mifflin, 1934).
- Kenneth R. Philp, John Collier’s Crusade For Indian Reform (Tucson: Arizona UP, 1977), p. 242.
- Steve Talbot, “The Meaning of Wounded Knee, 1973: Indian Self-Government and the Role of Anthropology,” in Gerrit Huizer and Bruce Mannheim, The Politics of Anthropology (The Hague: Mouton, 1979), p. 243.
- Arthur V. Watkins, “Termination of Federal Supervision: The Removal of Restrictions Over Indian Property and Persons,” Annals of the American Academy of Political and Social Science, 311 (1957), 50, 51.
- Alan L. Sorkin, Indians and Federal Aid (Washington: Brookings Institution, 1971), p. 158.
- Public Papers of the Presidents, Richard Nixon, 1970 (Washington, DC: US Govt. Printing Office, 1971), pp. 565, 566-67.
- Cf. Philip Davies, The Metropolitan Mosaic: Problems of the Contemporary City (1980), the fourth pamphlet in this series.
- Alan L. Sorkin, The Urban American Indian (Lexington, Mass.: Lexington Books, 1978), pp. 14, 13.
- Ibid., p. 12.
- Guillemin, Urban Renegades, pp. 72-73, 183-184.
- Joseph G. Jorgensen, “Indians and the Metropolis,” in The American Indian in Urban Society, ed. Jack O. Waddell and O. Michael Watson (Boston: Little, Brown, 1971), p. 79. For a more general assessment of the idea of the “culture of poverty,” see Charles A. Valentine, Culture and Poverty (Chicago UP, 1968) and Daniel P. Moynihan, ed., On Understanding Poverty (New York: Basic Books, 1968).
- Murray L. Wax, Indian Americans: Unity and Diversity (Englewood Cliffs, NJ: Prentice-Hall, 1971), p. 194.
- Jorgensen, p. 85.
- See Raymond Williams, The Country and the City (London: Chatto & Windus, 1973), pp. 279-88.
- Jorgensen, p. 84.
- Edgar S. Cahn, ed., Our Brother’s Keeper: The Indian in White America (New York and Cleveland: New Community Press, 1969), p. 112.
- See Hazel W. Hertzberg, The Search For An American Indian Identity: Modern Pan-Indian Movements (Syracuse UP, 1971), an invaluable study, which deals primarily with movements prior to 1934.
- James Mooney, The Ghost Dance Religion and the Sioux Outbreak of 1890 (Chicago UP, 1970), p. 19. Originally published as Part II of 14th Annual Report of Bureau of Ethnology to Smithsonian Institution, 1892-93.
- See Dennis Tedlock, ed., Teachings From the American Earth: Indian Religion and Philosophy (New York: Liveright, 1979); Åke Hultkrantz, The Religions of the American Indians (Berkeley: California UP, 1979); and Ruth F. Benedict, The Concept of the Guardian Spirit in North America (repr. ed., New York: Kraus, 1974), originally published in Memoirs of the American Anthropological Association, 29 (1923).
- Bryan Wilson, Magic and the Millennium: A Sociological Study of Religious Movements of Protest Among Tribal and Third-World Peoples (London: Heinemann, 1973), p. 417. Peyote was not the only means of adjustment to modern conditions. Amongst the Ute and Shoshone, for instance, a form of the Sun Dance replaced the Ghost Dance, and it still retains more adherents than the Peyote religion. Joseph G. Jorgensen’s The Sun Dance Religion: Power For the Powerless (Chicago UP, 1972) demonstrates impressively the relation of such redemptive religions to the deprivation and neo-colonial status of modern Indians.
- Vittorio Lanternari, The Religions of the Oppressed (London: Macgibbon and Kee, 1963), pp. 91-92.
- David F. Aberle, The Peyote Religion Among the Navajo (New York: Viking, 1966).
- Quoted in Robert C. Day, “The Emergence of Activism as a Social Movement,” in Bahr, ed., Native Americans Today, p. 527; and in Peter Collier, “The Red Man’s Burden,” Ramparts, 8 (1970), 26-38.
- The proposal, with the government response, is reprinted in full in Trail of Broken Treaties: BIA, I’m Not Your Indian Any More (New York: Akwesasne Notes, 1974). Vine Deloria’s Behind the Trail of Broken Treaties: An Indian Declaration of Independence (New York: Delacorte, 1974) gives the background, while Burnette and Koster’s The Road to Wounded Knee includes a first-hand account of events.
- Robert Thomas, quoted in Talbot, “Meaning of Wounded Knee,” pp. 224, 242.
- Cahn, Our Brother’s Keeper, p.2.
- See Price, Native Studies, pp. 230-31. A full, if partisan and journalistic, account is given in Burnette and Koster, and Talbot sets it convincingly in a larger political framework. See also Akwesasne Notes, 6 (January 1975), 32-33.
- Quoted in Talbot, p. 237.
- See Daniel C. Noel, ed., Seeing Castaneda (New York: Putnam, 1976), a collection of early reactions, and Richard de Mille, Castaneda’s Journey (Santa Barbara: Capra Press, 1976).
- There seems to be something of a tradition of fake Indians, from Grey Owl, who created a sensation in England in the 1930s on inspirational speaking tours but turned out to be from a middle-class family in Hastings, to Chief Red Fox’s spurious memoirs and the show-biz “Indian princess” sent by Marlon Brando to receive his Oscar.
- See Leslie Fiedler’s The Return of the Vanishing American (London: Cape, 1968).
- See Earth House Hold (New York: New Directions, 1969) and Turtle Island (New York: New Directions, 1974).
- See The Shoshoneans: the People of the Basin Plateau (New York: W. Morrow, 1966) and Recollections of Gran Apacheria (San Francisco: Turtle Island Foundation, 1974).
- E.g., Donald M. Bahr, Pima and Papago Ritual Oratory: A Study of Three Texts (San Francisco: Indian Historian Press, 1975).
- Hertzberg, American Indian Identity, p. 319.
- Keith Basso’s Portraits of “The Whiteman” (Cambridge UP, 1979) is an interesting example, since it gives the view from the other side of the anthropological encounter. Long overdue attention is given to the anthropologists’ Indian informants, many of whom were important cultural and political figures within their own tribes, in Margot Liberty, ed., American Indian Intellectuals (St. Paul: West Publishing, 1978).
- Irving Hallowell, “Ojibwa Personality and Acculturation,” in Sol Tax, ed., Acculturation in the Americas: Selected Papers of the 29th International Congress of Americanists (1967), vol. 2, p. 110. See also A. Irving Hallowell, Culture and Experience (Philadelphia: Pennsylvania UP, 1974).
- “This Country Was a Lot Better Off When The Indians Were Running It,” New York Times Magazine, 8 March 1970, repr. in Bahr, ed., Native Americans Today, p. 504.
- New York Times, 7 Jan. 1978, pp. 1, 14.
- Quoted in Barsh and Henderson, The Road, p. 290.
- BIA Profile: The Bureau of Indian Affairs and American Indians (Washington, DC: U.S. Government Printing Office, 1981), p. 6.
- Felix Cohen, The Legal Conscience: Selected Papers, ed. Lucy K. Cohen (Hamden, Conn.: Shoe String, 1970), p. 305.
- George P. Castile, “Federal Indian Policy and the Sustained Enclave: An Anthropological Perspective,” Human Organization, 33 (1974), 219-28.
- Akwesasne Notes, 10 (Autumn 1978), p.6.
- Barsh and Henderson, The Road, p. 292. For an interesting recent analysis of the current situation, see M.A. Dorris, “The Grass Still Grows, the Rivers Still Flow: Contemporary Native Americans,” Daedalus, 110 (Spring 1981), 43-69.
James T. Patterson, The Welfare State in America, 1930-1980
BAAS Pamphlet No. 7 (First Published 1981)
ISBN: 0 9504601 7 6
- Welfare and Poverty in 1930
- The New Deal Establishes a Welfare State
- The Rediscovery of Poverty, 1960-1965
- The Revolution in Social Welfare, 1965-1975
- Floors as Well as Doors
- Guide to Further Reading
- Notes
British Association for American Studies All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: Welfare and Poverty in 1930
A central theme dominates the history of American poverty and public welfare: the poor received much abuse and little assistance. Their harsh treatment persisted into the 1960s, when a combination of social, economic, and political forces prompted a “rediscovery” of poverty, a vast expansion of social welfare programs, and a substantial reduction in the percentage of people officially defined as poor. Even then, however, most middle-class Americans continued to disparage the poor and to grope for ways to control the costs of public welfare. Their attitudes deeply affected political responses in the 1970s. By the early 1980s, therefore, liberal reformers conscious of the extent of poverty found their first priority was to prevent Congress from cutting back on the provision of welfare—when the real need, liberals felt, was to enact fundamental improvements in the jerry-built welfare system constructed in the 1930s.
Three cases suggest the cold way in which Americans treated poor people as late as 1933. The first, in New York State, involved a woman who had earlier established her right to poor relief in the city of Syracuse. She then married an immigrant who, after a journey to Canada, was denied re-entry to the United States. Penniless, the woman requested public assistance for her children and herself. The city refused, arguing that her marriage to an immigrant made her ineligible for relief. She received support only after extended legal efforts. The second, in Massachusetts, concerned the effort of the town of Plymouth to avoid aiding a young illegitimate girl. Because the girl’s mother had died, the town tried to force her grandfather to care for her. He refused, and a suit followed. After protracted maneuvering, the court held that neither the grandfather nor the town owed the girl help—under seventeenth-century state law, an illegitimate child was legally the responsibility of no one! In the third case, a temporarily unemployed Maine man applied for aid. The town not only denied his claim but decreed that he was henceforth a “pauper” and therefore ineligible to vote.[1]
That was in the early 1930s. By then Germany had had a social insurance system for fifty years, Great Britain and Sweden for almost twenty-five. Most West European nations paid family allowances designed in part to help the poor. Though statistics are unreliable, contemporary observers were certain that the United States lagged far behind Europe in the size and coverage of its welfare payments, and in the spirit in which aid was provided. It had no social security (i.e., old age pensions), no unemployment insurance, no family allowances, no health insurance. Given the resources of the United States—by all odds the wealthiest nation in the world—the national government was strikingly inactive in the field of welfare.[2]
The country’s federal tradition partially explained this situation. At that time even progressives like Governor Franklin D. Roosevelt of New York believed deeply in local experimentation and decentralized government. State and local governments, they thought, should supplement private charity in helping the needy. But states and towns, too, had done little in the field of welfare by 1933. No state had an unemployment insurance plan before 1932. Old age pension plans existed in only eleven states in 1929; these paid only $220,000 in pensions. Programs of aid to mothers and dependent children assisted 93,280 families in 1931, out of 3.8 million female-headed families in the United States. The average monthly grant for these assisted families ranged from a low of $4.33 in Arkansas to $69.31 in Massachusetts. That latter sum, coming to $832 per year, contrasted with the $2,000 that some economists then regarded as “sufficient to supply only basic necessities.”[3]
Other forms of aid, either “indoor relief” (in institutions) or “outdoor relief” (in the home), ordinarily came from the towns. Obviously, these local responses varied considerably. But most towns, like Syracuse and Plymouth, strove to minimize their costs. Relief officials at the local level were ordinarily elected or appointed by town officers. They rarely had any training in social work, or much sympathy for the poor. Many towns, moreover, kept on the books old poor laws that dated from the seventeenth century. These statutes resembled the Elizabethan poor laws of England. Applicants for relief ordinarily had to establish “settlement”—usually interpreted to mean continuous residence within the town for at least a year. The migrant poor were often “removed” forcibly to the towns from which they had come. Localities regularly denied aid even to needy people who had established settlement, dawdling on applications, humiliating and stigmatizing applicants, investigating the moral character of all who applied for help. Poor people who dared to complain rarely received hearings. Local officials were so anxious to limit their responsibilities that they resorted to suits against other cities that might be considered the places of settlement. The cost of litigation in such disputes in New York between 1928 and 1932 was $192,000—compared to the $215,000 that might have been spent to sustain the poor people involved during the time of legal action. During these years, 11,234 New Yorkers—more than 2,000 per year—were removed from one town to another as a result of such suits.[4]
Perhaps the best way to summarize the inadequacy of welfare at the time is to estimate the total dollars spent in 1929, a year of unprecedented prosperity in the United States. In that year, public spending on welfare—federal, state, and local—was around $500 million. This sum did not include aid to education (a local responsibility) or to veterans; it did include both indoor and outdoor relief. Private charity increased the grand total to around $750 million. Other statistics suggest that a maximum of 10 percent of the population of 122 million received help during any one year. Thus America in 1929 spent $6.25 per capita (or 0.73 percent of GNP) on helping some 12.2 million poor people, who received an average of $62.50 each.[5]
Given the comparative affluence of American society, it was of course arguable that there was no great need for welfare in the United States before the Great Depression. That was surely the view of the vast majority of Americans, and even of many economists and social scientists during the 1920s. The economic progress of that decade was astonishing and real. However one measured that progress—by the quality of diets, housing, availability of home appliances, spread of electricity and central heating—most Americans were much better off in 1929 than they had been in 1880 or 1900. In 1929 only 3.2 percent of the labor force was unemployed. By then, even working-class families could afford to buy cars. Most dramatic, perhaps, were improvements in health which were reflected in statistics on longevity. Life expectancy at birth was 47 years in 1900, 60 in 1930. There were 7 million Americans aged 55 or more in 1900, and 15 million in 1930. All these improvements prompted great optimism among contemporary observers, who tended to assume that economic growth would greatly enlarge the pie. It was therefore unnecessary to think about cutting it into more equal shares or to expand welfare. Herbert Hoover, reflecting this optimism, proclaimed in 1928, “We shall soon, with the help of God, be in sight of the day when poverty will be banished in the nation.”[6]
It was arguable also that poverty in the United States tended in the main to be short-term, related to life cycles in families. Most of America’s poor, it seemed, were members of working-class families that experienced privation only periodically—when the breadwinner was sick, when a recession threw him out of a job, when he had a household full of very young children to support, when he got old or disabled. Grim as their suffering was at these times, it was rarely chronic, and almost surely not culturally transmitted to the following generations. Moreover, by the 1920s poverty seemed to be fading away as the second and third generations of once poor immigrants moved up the occupational ladder and out of their urban ghettos. Observers who perceived poverty in this way were admittedly speculating, for they lacked reliable studies on the extent to which poverty was chronic or culturally transmitted. But subsequent historical research suggests that social mobility for working-class Americans at the time was indeed substantial. Probably at least 30 percent of workers raised in blue-collar households in various American cities attained middle-class occupational status between 1880 and 1930.[7]
Stereotypes of the poor also seemed to soften a little during the first third of the century. Prior to 1900, few writers on poverty had stressed that it was environmental. They had assumed that many—perhaps most—of the poor were “undeserving.” Some were thought to be members of the “dangerous classes,” others to be lazy, oversexed, and shiftless. Paupers (the undeserving who applied for relief) got almost no sympathy. Francis Walker, a professor who headed the United States Census, explained in 1897 that “pauperism is largely voluntary …. Those who are paupers are so far more from character than from condition. They have the pauper trait; they bear the pauper brand.”[8]
By 1929 such stereotyping was less common. During the Progressive era (1900-1917) and afterwards, writers such as Robert Hunter, Jacob Hollander, and Robert Kelso described poverty as an economic condition caused by low wages, underemployment, technological change, depressed trades (including agriculture), and the vicissitudes of the business cycle. They avoided racist talk about inferior immigrants or shiftless blacks. As Hollander explained in 1914, “neither race qualities nor national characteristics account for the presence of such poverty. It persists as an accompaniment of modern economic life, in widely removed countries among ethnically different people. It cannot be identified with alien elements in native race stocks.”[9]
It was clear, finally, that public welfare advanced a little in the years 1900-1930. During those years, states adopted workmen’s compensation plans and began their limited experiments with mothers’ and old-age pensions. Many states established departments of Social Welfare or Public Welfare. Social workers developed professional skills to help them with case work. And public spending, while low, nonetheless increased over time.
Still, these real gains between 1900 and 1930 could not disguise the persistence of widespread need in America even in the good year of 1929. In a slim volume put out by the Brookings Institution five years later, the authors estimated that 16 million families in the 1920s, about 60 percent of the total number, received less than $2,000 a year, an income “sufficient to supply only basic necessities.” That was at least 70 million people. Only about 25 percent of the nation’s non-farm families earned $3,000, the sum needed to pay for an “adequate diet at moderate cost.”[10]
Who were these poor? The Brookings authors and others emphasized the main site of America’s mass poverty: the farm. Farm family income, they reported, averaged only $1,240—$460 less than median family income in the nation. Moreover, the income of farmers was concentrated among a fairly small number of large commercial operators. The authors found that 54 percent of America’s 5.8 million farm families got less than $1,000 a year. That was about 17 million people, who long had been the poorest of the nation’s poor. Within fifteen years millions of them flocked to the cities, thereby markedly changing the nature of poverty in America. Not surprisingly, because American agriculture was depressed in those years, the South showed by far the lowest income levels. In twelve states, all in the South, the per capita income of the farm population was below $200. Such statistics gave little comfort to Jeffersonian advocates of life on the farm.
Poverty hit certain other groups particularly hard. Old people, last hired and first fired, were much more likely to be poor than people under 55 or 60. Members of female-headed families were almost universally poor. So were the disabled, and migrant workers. And blacks, as ever, clung to the lowest ledges of the income pyramid. George Schuyler, a prominent black intellectual, explained later, “the reason why the Depression didn’t have the impact on the Negro that it had on the whites was that the Negroes had been in the Depression all the time.”[11]
If such widespread need persisted in 1929, why was not the welfare system more responsive? Among the many answers offered by scholars are several that employ broad functional explanations. Some maintain that the middle classes, psychologically insecure, stereotyped the poor in order to sustain their own self-esteem. That was “blaming the victim.” Other scholars stress racial considerations. Keeping welfare low, they say, locked blacks in their subordinate place. Still other writers emphasize social and economic considerations. Stingy levels of relief, they argue, assured farm and factory owners abundant cheap labor. Putting down the poor served also as a means of social control.[12]
These functional interpretations are useful, as far as they go. Charity workers, in trying to instill the work ethic, deliberately tried to keep people off relief. In so doing they expanded the pool of cheap labor. Americans generally remained indifferent to the exploitation of various groups, especially blacks. Lobbies such as the Chamber of Commerce, the National Association of Manufacturers, and the Farm Bureau Federation drew their strength in the twentieth century from employers and large commercial farmers who profited from keeping poor people in their place, and who received psychological comfort from painting unflattering stereotypes of the needy. No explanation of the withered form of American welfare prior to 1930—and afterwards—can ignore the role played by purposeful interest groups that had good reasons for resisting generous welfare.
It is not very accurate, however, to imagine that these elites alone conspired to subvert the interests of the poor.[13] Reformers in the field, too, rarely showed much enthusiasm for extensive welfare. They preached, rather, the gospel of prevention: their goal was to prevent poverty in the first place, thereby making welfare virtually unnecessary. Professional social workers, for instance, labored hard to develop sophisticated methods that would help families and individuals to help themselves. Lobbyists for the American Association for Labor Legislation, founded in 1906, concentrated on prevention of industrial accidents and of unemployment. Their plan for unemployment compensation featured the concept of “merit ratings”—whereby employers with records of maintaining stable employment would receive relief from taxation. The Association thought that merit ratings would discourage layoffs, and thereby prevent large-scale unemployment.[14]
Behind this quest for prevention lay several powerful ideas. One was the faith that hard work led inevitably to economic advancement, indeed to spiritual peace. As Theodore Roosevelt put it, “nothing in this world is worth having unless it means effort, pain, difficulty.” It followed logically that those in need—save the few who were sick or tied down at home with children—were not working hard enough. Most of the poor were “undeserving.” Relief should be given grudgingly, if at all, in order to discourage such people from malingering and to sustain the morale of the populace. This faith in the work ethic remained a potent ideological force that affected not only conservative elites but also the middle classes.[15]
A second force militating against creation of a welfare state before 1930 was the lack of experience with, or confidence in, governmental answers to public problems. In this respect the American scene was unusual among the industrialized nations of the West, most of which had relatively strong, well-developed central governments. By contrast, the United States was divided ethnically, racially, and regionally. It cherished its voluntarist, federalist, and decentralist traditions. Until 1917, few Americans had much direct experience with governmental institutions beyond the local level—save when they mailed a letter at the post office.
Voluntarism in the United States meant more than qualified laissez-faire. For many, including labor leaders, it involved active distrust of government, which had traditionally sided with corporations. They knew from bitter experience that the government could take away with one hand what it had given with the other. Reformers, too, doubted the very capacity of government to cope with large-scale problems. They noted that when Germany and England adopted social insurance systems, those countries already had a trained and respected civil service. The United States, however, had a spoils-ridden, inefficient bureaucracy. One reformer complained, “The German success, such as it is, has been owing to a strictly competent and independent administration. With an administration like that which has controlled our army pensions, what would become of social insurance?”[16]
Lacking confidence in government, some experts in the inter-war years looked to the private sector to generate pension plans and other social programs. A few large and growing corporations responded, and “welfare capitalism” developed—but on only a very small scale, and benefiting not so much the poor as skilled workers who established seniority. The unemployed, the masses of poor farmers, women with dependent children, the aged, the disabled—all remained outside the scope of welfare capitalism. Neither then nor later did private efforts contribute greatly to the welfare of the poor in the United States.
Other reformers hoped that strong support for welfare might come from below, from the poor or working classes. But that was a forlorn hope prior to 1930. Labor unions, weary from contending with hostile opponents in government and in the private sector, were not a potent political force until the 1930s. The poor, meanwhile, rarely imagined that government would do much for them. They were divided ethnically and racially, virtually isolated in slums and backwaters. They had no organization, no role in politics; they knew from experience not to expect much. As one needy American observed in the early 1930s, “Always going to be more poor folks than them that ain’t poor …. I guess I always will be. I ain’t saying that’s the government’s fault. It’s just down right truth, that’s all.”[17]
Perhaps the most immovable obstacle in the way of more generous public welfare before 1930 was the optimism of the era. Once the frightening depression of the 1890s had subsided, middle-class Americans regained confidence. That confidence gripped Progressives, who imagined that they could do away with corruption and injustice. Conservatives, meanwhile, assumed that the play of the marketplace (and government assistance to corporations) would sustain economic progress. Like Hoover, they were certain that poverty would soon vanish from the land. Amid confidence such as this, advocates of expanded public welfare before 1930 seemed almost irrelevant.
2: The New Deal Establishes a Welfare State
Among the many developments that changed American public welfare after 1930, none was so important as the Great Depression. It hit harder and lasted longer in the United States than in any other Western nation. It vastly expanded the number who were poor. Shattering the complacency and optimism that had characterized the 1920s, it forced the national government to respond to suffering. The welfare state that emerged dwarfed all earlier American efforts. In part because it developed so suddenly, however, it resembled a crazy-quilt of programs. Reformers since that time have struggled to improve the system.
The extent of need created by the Depression was staggering. One authoritative report concluded that 18.3 million American families and single people received less than $1,000 per year in fiscal 1936. That was around 60 million people. At the time at least $1,200 was deemed necessary for an urban family of four. President Franklin D. Roosevelt’s statement in 1937 that “one-third of [the nation]” was “ill-housed, ill-clad, ill-nourished” was, by almost any contemporary definition of poverty, conservative. The percentage was probably closer to 50 per cent.[18]
Many of these poor people, of course, suffered from forces that had little to do with the cataclysm of the 1930s. These were the “old” pre-depression poor: small farmers, the aged and disabled, female-headed households, minority groups. Other structural changes operated to compound such problems. One was the aging of the population. The number of Americans over 65 increased from 4.9 million in 1920 to 9 million in 1940—from 4.6 per cent to 6.9 per cent of the population. Another scourge was the ruination of the soil, which culminated in the Dust Bowl that drove the “Okies” on the westward trek captured in John Steinbeck’s The Grapes of Wrath.
Contemporaries, however, tended to slight both the residual poor and the throngs of poorly paid workers. Instead, they focused before 1935 on the catastrophic level of unemployment. According to official government estimates, unemployment rose from 1.6 million in 1929 to a high of 12.8 million in 1933. That was 25 per cent of the labor force. It dipped to a low of 7.7 million in 1937, but rose to 9.5 million, or 17 per cent, in 1939. Other estimates placed the high at around 15 million in 1933, and concluded that the number of people directly affected comprised one-third of the population of 123 million.[19]
These statistics, of course, do not tell all. They need some comparative dimension. In some ways the poverty of the 1930s was perhaps easier to bear than that, say, of the 1960s. During the depression years around 50 per cent of the poor were able to supplement their diets through a little farming of their own. Thirty years later, about 85 per cent of poor people did not live on or near farms; for them, nothing was free. Americans in the 1930s, moreover, did not expect to own many expensive gadgets. Some 40 per cent of households in 1940 lacked bathtubs, 50 per cent central heating. By the mid-1960s, the vast majority of poor people in the United States had electricity and television; most owned cars and home appliances. Poor people in the 1930s, lacking television, mostly untravelled, were less aware of what the middle classes had. Their sense of relative deprivation was less acute.[20] Their very marginality, especially in the rural areas of the South and West, helped account for their essential invisibility to the public eye and for their neglect before 1930 by policy makers.
The estimates of poverty in the 1930s—including those that set it at 50 per cent or more of the population—also do not appear disastrously large in historical or international perspective. In 1900, similarly high percentages lived at or below subsistence, in the more stringent way that subsistence was defined in that era. Moreover, America was still a rich nation by world standards in the 1930s. When Russians viewed the film of The Grapes of Wrath, they marvelled that the Okies had cars. The humorist Will Rogers quipped that the United States was the only nation in history that went to the poorhouse in automobiles.
But that was of course the point: Americans in the 1930s did not care how Russian people lived. Like people in any country at any time, they measured their well-being by their own standards and expectations. Though these were much lower than they were to become thirty years later, they were higher than they had been in 1900, and they had been formed in an era, the 1920s, which had promised progress and prosperity. Americans in the 1930s were stunned especially by unemployment.
In reacting to the Great Depression, many middle-class Americans readily advanced old stereotypes about the poor. People on welfare were lazy. (“Why is a relief worker like King Solomon?” the joke went. “Because he takes his pick and goes to bed.”) Even Americans with more liberal attitudes, mesmerized by unemployment, tended to forget about low-wage workers, blacks, and immigrants. This absorption with unemployment, understandable in the circumstances, was unfortunate, for it did little to expose the suffering of the pre-depression poor, including the millions of regularly employed who suffered from low income.
Still, the extent of need proved so great that Americans were forced to jettison, even if slowly and reluctantly, old notions about poverty and welfare. Congressional responses between 1933 and 1938 show that the majority of people expected the federal government to help the unemployed. These needy, after all, were “deserving.” Leading spokesmen for Roosevelt’s New Deal, especially relief administrator Harry Hopkins, were eager to respond to this pressure and to institute bold new measures in federal relief and welfare. The result was a welfare state which, though rudimentary, marked a real break with the past. “During the ten years between 1929 and 1939,” an informed social worker wrote in 1940, “more progress was made in public welfare and relief than in the three hundred years after this country was first settled.”[21]
The first concern of the Roosevelt administration, which took office in March 1933, was to provide relief. To that end it quickly enacted a range of measures in the first “Hundred Days” of its existence. In the area of relief the most important were the Civilian Conservation Corps, which employed young men in forestry and conservation work, and especially the Federal Emergency Relief Administration (FERA). With an initial appropriation of $500 million, the FERA under Hopkins attempted to achieve its goal of providing “sufficient relief to prevent physical suffering and to maintain living standards.” Wasting no time, Hopkins spent more than $5 million in his first two hours in office. Most of the $500 million went as a cash dole, or outdoor relief; some recipients got work relief. States were expected to match FERA money at the ratio of $3 for every $1 from Washington. Later that year Roosevelt authorized the newly created Civil Works Administration (CWA) to tide people over the potentially disastrous winter of 1933-34. This was a work relief program available to needy people without a means test. It aided more than four million workers at its peak in January 1934, and paid wages averaging more than $15 a week—two and one half times average FERA benefits. No New Deal creation was more generous or more gratefully received than the CWA.[22]
Many liberal reformers were not wholly pleased with the FERA. States, they complained, could not or would not provide the matching money. Governors, they added, used the money to place political allies on the payroll, and denied aid to political enemies. Local administrators, still holding poor-law philosophies, persisted in humiliating applicants during means tests. One of Hopkins’s aides was enraged by witnessing the harsh dispensation of relief in Arizona: “When I see the lack of intelligence, not to say common, ordinary human sympathy which characterizes the handling of destitute families in some places, I am ashamed of what we are doing.”[23]
Hopkins, a strong and humane administrator, worked hard to overcome these limitations. But both he and Roosevelt became increasingly uneasy about continuing the dole for long. Relief of that kind, they thought, demoralized recipients. “I don’t think anybody can go on year after year, month after month,” Hopkins said, “accepting relief without affecting his character in some way unfavorably. It is probably going to undermine the independence of thousands of families.” Similar thinking prompted Roosevelt to declare in January 1935 that “continued dependence upon relief induces a spiritual and moral disintegration destructive to the national fibre. To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.” The federal government, he closed, “must and shall quit this business of relief.”[24]
To replace the FERA, the President called for a division of responsibilities between the federal government, which was henceforth to supply work relief (not the dole) to those able to work, and the states, which were expected to take care of “unemployables.” He also called on Congress to approve social insurance, including old age pensions and unemployment compensation. Once the depression was over, Roosevelt thought, social insurance would become the main bastion of defence against poverty; except for state or local aid to a small residue of unemployables, relief would become virtually unnecessary. These beliefs—decentralization of relief was good, the dole was demoralizing, heavy welfare spending was fiscally dangerous, social insurance could prevent destitution—exposed the President’s faith in a traditional set of attitudes. Like his contemporaries, he earnestly hoped that the depression would end and that with social insurance in force, poverty would “wither away.”
Congress, too, was ready to “quit this business of federal relief,” and did Roosevelt’s bidding in 1935. The American welfare state assumed its essential form at that time. It had four main parts:
- “general” assistance, funded by states and localities, for so-called unemployables;
- work relief, paid by the federal government, though this proved only a temporary expedient;
- “categorical” public assistance, for the needy blind and aged, and for dependent children;
- social insurance in the form of old age pensions and unemployment compensation.
Categorical assistance and social insurance were introduced via the Social Security Act of 1935.
From the beginning, “general” assistance was inadequate. In leaving this responsibility to states and localities, Congress, like Roosevelt, grossly underestimated the numbers who were to apply for help. These included not only people obviously “unemployable,” but many other able-bodied men and women who could not get work in the private sector or work relief from the federal government. Many were old or unskilled. Others, like the Okies, were displaced workers who roamed the land in search of employment. States and localities adhered to old settlement rules and regularly refused them aid. Some local officials ran the migrants out of town. A general assistance program was essential, both then and later. Indeed, it gave aid to perhaps four million Americans per year in the late 1930s; no other public program affected so many people. But millions more needed help. Then and always, general assistance remained an ill-supported stepchild within the complex federal system of welfare in America.[25]
Federal work relief promised to be more effective. The Works Progress Administration (WPA), which carried most of this relief load between 1935 and 1943, supported as many as three million workers a month in the mid-1930s. Most of them were manual laborers engaged in construction work. Some were employed on imaginative projects benefiting artists, actors, and writers. The National Youth Administration (NYA), an offshoot of the WPA, provided part-time jobs to more than 2 million students, and assisted 2.5 million more who were not in school. WPA and NYA workers built and improved hospitals, schools, municipal facilities, and playgrounds. By any pre-depression standard the work relief programs represented a striking advance in welfare.[26]
But work relief, too, disappointed many reformers. To begin with, the work was for the most part menial; it did little to enhance skills. It paid poorly, around $55 per month, or $660 per year. Shortages of equipment and of qualified supervisory personnel led to inefficient management, and to damaging criticism from conservatives. WPA, they sniped, stood for “We Piddle Around.” And shortages of funds underlay all these limitations. Congressional appropriations, though unprecedentedly large, scarcely coped with the magnitude of need. The WPA ordinarily supported only a quarter to a third of the 8 to 11 million unemployed workers during the late 1930s.[27]
The third part of the New Deal welfare state, categorical assistance for the needy blind, aged, and dependent children, ultimately became an important, though controversial, part of American social welfare. By these measures the federal government and the states tried to support some of the poorest of the obviously “deserving” poor. The programs provided aid in cash, and required the states and localities to modify some of their harsh practices dating from the colonial period. The aid to dependent children program (ADC), which gradually became the most expensive of the categorical assistance plans, reached 700,000 children in 1939, as opposed to the 300,000 who had benefited from state mothers’ aid laws at the start of the Depression.[28]
But categorical assistance, too, proved inadequate in the 1930s and thereafter. Reflecting the pervasive faith in decentralization and federalism, the programs operated on the matching grant principle: Washington provided funds only if states did. Some of the states—mainly the wealthier ones—had the money to take full advantage of the matching grants. But many of the poorer states could not or did not. The result was wide variations in payments. In 1939 ADC payments ranged from averages of $8.10 per family per month in Arkansas to $61.07 in Massachusetts. Recipients of old age assistance fared a little better, but nowhere received enough to attain a subsistence level of income.[29]
The decentralization of categorical assistance permitted states and localities a relatively free hand in administering the programs. Many, following traditional practices, imposed tough income or property restrictions. Money earned by recipients was promptly subtracted from benefits—a total disincentive to work. States also discriminated against minority groups, and employed many ruses to save money and retaliate against the “undeserving” poor—“welfare mothers”—getting ADC. A common ploy was to restrict aid to families who lived in a “suitable home”—a euphemism for households in which there were no illegitimate children. States and localities also developed “absent father” rules. These provided that aid be refused to dependent children in families where male breadwinners were suspected of living in or near the home. In order to enforce such regulations, officials resorted to snooping, including midnight raids to determine if there was a man in the house.
The fourth part of the early welfare state, social insurance, was strictly speaking not “welfare” at all. Money for the major programs in this area, unemployment compensation and old age pensions, came mainly from the private sector—ultimately from recipients. An employers’ payroll tax provided the funds for unemployment compensation. Congress authorized employers in states with federally approved plans to deduct 90 per cent of the taxes. Old age pensions—social security in American parlance—were financed by taxes on employers and employees. Contrary to the practice in most other industrialized nations, the government chipped in no money from general funds to supplement the program.
In the long run, these social insurance programs affected far more people than did public assistance or work relief. That, indeed, was the intent of the New Dealers who inaugurated them. They believed in social insurance, not handouts. But particularly in the early years, the programs left much to be desired. The unemployment compensation plan, like categorical assistance, permitted states wide latitude in establishing benefits and in administering rules. Big variations developed. The program also excluded employees in small firms, and agricultural and domestic workers. These people, of course, were among the neediest in the population. Those workers who were covered ordinarily received compensation for sixteen weeks, at about half their weekly pay. This was less generous than in Britain, where recipients tended to get three-fourths of weekly wages. Maximum payments in the United States were around $15 a week, or the same as earned by the better paid WPA workers. When coverage expired, the unemployed had no recourse save to join the throngs seeking work relief or general assistance. Many received no aid.[30]
Old age insurance also exempted large categories of the neediest people, notably domestics and agricultural laborers. When old age pension payments began—not until 1940—only 20 per cent of workers qualified. At that time benefits ranged from $10 to a maximum of $85 per month—well below subsistence. The law also included no payments for health insurance or disability. Until 1939, it did not assist widows or survivors. It was no wonder that the Social Security Act of 1935, which inaugurated these plans, left many liberals discouraged. One termed it a “measure to furnish such means of security as do not arouse serious opposition.”[31]
Distressed by the limitations of government social policy in the 1930s, some reformers tended to indict the whole four-part welfare system composed of general assistance, work relief, categorical aid, and social insurance. That system, they charged correctly, did not meet people’s needs; it sanctioned large variations in benefits; it catered to local pressures. Largely because of the political influence of the medical and real estate lobbies, it did not include health insurance or much public housing—important aspects of the welfare programs of other Western nations. In distinguishing sharply between welfare (given grudgingly, and mainly to the “deserving”) and social insurance, the American system tended to highlight the stigma attached to recipients of means-tested public assistance. In these and other ways, the American welfare state left much to be desired.
Many contemporary forces accounted for these limitations of the early American welfare state. Among the most obvious were racist views toward blacks, states-rights ideology, the power of organized pressure groups, and the traditionally hostile views toward welfare held by most middle-class citizens. These posed very real obstacles to liberal reformers, who secured considerable legislation under the circumstances. For instance, Congress and Roosevelt held back from trying to impose rigorous national standards because they feared that the ultra-conservative Supreme Court would rule their efforts unconstitutional. Roosevelt also supported Social Security’s conservative financing—by worker contributions—in the hope that such a system would guarantee long-term political support. Congressmen (and the President) moved cautiously also because they wished to control public expenditures—few Americans at the time supported deficit spending. Modest infusions of aid, they hoped, would alleviate distress in the short run, while recovery programs would provide the major element in the war against poverty: long-term economic progress. Like many Americans—then and later—congressmen wanted to believe that economic growth would soon develop, and that the need for heavy public welfare would “wither away.”
But the early welfare state of the 1930s nonetheless marked a large leap forward in the history of American treatment of the needy. During the 1930s, some 46 million people got public aid or social insurance at one time or another. That was 35 per cent of the population. Public funds for these programs scarcely existed in 1929; by 1939 they amounted to $5 billion. That was 27 per cent of governmental expenditures at all levels and 7 per cent of the Gross National Product. Most of this money went for work relief ($3.1 billion) and general and categorical aid ($1.1 billion). Though the long-range priority remained social insurance, not public assistance, the early welfare state provided public aid at levels that had been unthinkable in 1929.
3: The Rediscovery of Poverty, 1960-1965
Between 1940 and 1960, Congress retained, and sporadically extended, the welfare state created in the 1930s. It steadily increased federal funding for matching grants; it added “caretaker” grants for mothers of dependent children, thus changing ADC into Aid to Families with Dependent Children; it set up a new category, the disabled, who might get public assistance; and it approved formulas that resulted in more liberal categorical assistance in low income states. It also expanded social security. In 1939 it approved pensions for widows and survivors of covered (insured) workers; in 1956 it added disability insurance. Over the years it greatly increased coverage, so that by the late 1970s social security reached more than 95 per cent of Americans over 65. To this extent, Roosevelt’s political calculations proved wise. For all its limitations, the early welfare state set in motion programs that in time sharply challenged the older voluntaristic traditions of American social policy.[32]
But these years between 1940 and 1960 are best described as a period of “benign neglect” of welfare. Given the unprecedented affluence that followed in the wake of heavy defense spending after 1939, that was not surprising. Congress, indeed, scrapped the WPA in 1943, and it was slow to liberalize public assistance. It never supplied federal funding for general assistance, or appropriated much money for government-sponsored employment. It refused to approve the major goals of welfare reformers: guaranteed minimum subsistence payments for recipients of categorical aid, large-scale public housing, and national health insurance.
This benign neglect, moreover, had fateful consequences for poor people. Far from “withering away,” poverty remained a serious problem during these wartime and post-war years. Official estimates placed the number of poor people in 1960 at around 39 million, or 21 per cent of the population. Income remained badly distributed, and the sense of relative deprivation—whetted by the mass media and by the affluence of the upper middle classes—grew keener as time passed. Observers pointed in alarm at increases in juvenile delinquency, illegitimacy, and family breakup. They worried about the ghettos of Northern cities to which millions of poor people, mainly from the South, Puerto Rico, and Mexico, were flocking in the 1940s and 1950s. Beneath the apparent consensus of that deceptively quiet period lay tensions that were to explode in the turbulent 1960s.
Also beneath the surface lay broad forces that after 1965 were to prompt considerable improvements in social welfare in America. One of these was political: the impetus given during World War II to the growth of the federal government. In time, that growth built up the welfare bureaucracy, greatly expanded the fiscal resources of the State, and raised the expectations of pressure groups, who turned to Washington for aid. Utilizing a growing network of lobbies, these groups pressed for and to some extent succeeded in liberalizing social policy. Economic forces, too, led inexorably to long-range improvements in the American way of welfare. Clearly the most powerful of these was the remarkable prosperity set in motion during World War II. At that time, Americans began to regain the optimism that had characterized social scientists in the late 1920s. Poverty, they thought, could—and should—be abolished.
But these were long-range forces. Until the late 1950s, a sluggish time for the economy, few people worried much about poverty or thought seriously about reforming welfare. Leading social scientists, indeed, imagined that America had developed into a pluralistic but essentially consensual society marked by a virtual absence of sharp class divisions. “As far as the bulk of Western society is concerned,” one prominent sociologist intoned in 1959, “and especially in the United States, the conception of class is largely obsolete.”[33] The sources of such optimism were obvious and historically familiar: the remarkable prosperity and social stability of the era. Despite the existence of millions in poverty, the vast majority of the population was healthier and wealthier than ever before. It was easy to assume that economic growth, the cure-all, would ultimately wipe out the vestiges of suffering. In such a world, both poverty and welfare would soon wither away.
No one did more to damage that illusion than the activist Michael Harrington, whose powerful, passionate book, The Other America, appeared in 1962. Harrington conceded that economic growth had pulled some people out of poverty. But the disappearance of the “mass” poverty of the 1930s merely made the “class” poverty of the 1950s all the more outrageous. The “new” poor, he said, were the aged, the minorities, people in female-headed households, farm workers driven from the land. They lived in an “other America” that was a “ghetto, a modern poor farm for the rejects of society and of the economy.”[34]
Harrington was not the first to “rediscover” the poor. A congressional subcommittee headed by Senator John Sparkman of Alabama had issued deeply researched reports revealing the problems of “low income families” during the 1950s. Using a family poverty line of $2,000, it estimated that 20-25 per cent of Americans were poor.[35] The economist John Kenneth Galbraith published The Affluent Society in 1958, an attack on the notion that economic growth by itself would work wonders. He called for a vast expansion of public services to the poor. Other writers, notably Richard Cloward and Lloyd Ohlin, published books and articles in the late 1950s and 1960s warning readers of the structural obstacles to economic opportunity, especially in the ever more crowded slums of the North.[36] But Harrington wrote with a scarcely controlled rage that seemed to touch a nerve among reformers in the early 1960s. Between 1963 and 1965, the poor received more attention—in magazines, in books, in Congress—than at any time since the 1930s.
The reasons for this rediscovery were not wholly clear. Polls at the time suggested that the majority of Americans continued to hold unflattering stereotypes about the poor. Indeed, in some ways these stereotypes were more hostile than ever, for they depicted a growing class of welfare recipients: black “welfare mothers.”[37] Polls revealed, too, that Americans yearned to economize on welfare, which—alas—had not withered away. Aid to Families with Dependent Children (AFDC), originally conceived to assist “deserving” families (mostly headed by widows), had covered 625,000 families at a total cost of $565 million per year in 1950; by 1962, thanks to an increase in family breakups, it aided 943,000 families (most headed by divorced or separated women) at a cost of $1.4 billion. Nor did Americans become concerned about poverty because they were afraid of social disorder. On the surface at least, the early 1960s seemed stable. Civil rights leaders were preaching nonviolence; the ghettos appeared quiet; almost no one talked of “black power.” It is historically inaccurate to claim, as some scholars were shortly to do, that the rediscovery of poverty in the early 1960s masked a drive on the part of elites to avert social disorder.[38]
Rather, other forces prompted this resurgence of interest in the poor. One was the dawning realization among some observers that economic growth was slowing down. This had been Galbraith’s point. Moreover, economists at that time had developed enormous self-confidence in their ability to manage things. George J. Stigler, president of the American Economic Association, proclaimed in 1965 that “economics is finally at the threshold of its golden age—nay, we already have one foot through the door.”[39] Carefully considered policies, economists said, could eradicate poverty, at little cost to the middle classes. One important study concluded that the “elimination of poverty is well within the means of Federal, state, and local government.” It could be done at a cost of about $10 billion a year, less than 2 per cent of the GNP and less than a fifth of the cost of national defense.[40]
Predictions such as these revealed two fundamental assumptions that lay behind the rediscovery of poverty at the time. One was the faith that a wealthy nation like the United States could afford to abolish need. That faith had scarcely been imaginable in the 1930s, and had been slow to develop in the aftermath of the Depression. The relative prosperity of the early 1960s made it much easier to hold. The second assumption related to the first: poverty in such an affluent society was anomalous—and therefore intolerable. It was un-American. This was the central theme of Harrington, who was appalled at the co-existence of wealth and destitution. It moved Galbraith and many other liberal economists, including President Kennedy’s Council of Economic Advisers. The rediscovery of poverty, in short, did not reflect a softening of popular attitudes toward the poor, or a consensus that welfare ought to be extended as a basic right of citizenship, or a fear of social unrest. It stemmed rather from the optimism and confidence that gripped economists, policy makers, and (to some extent) the American public during the early Kennedy-Johnson years.
This rediscovery of poverty led in time to useful academic discussion about the nature of poverty. Within the next few years scholars debated hotly the meaning of terms such as “culture of poverty.” By 1970 they had conceded the obvious: persisting cultural patterns helped explain the behavior of various ethnic, racial, and income groups. But the scholars largely jettisoned historically persistent notions that poverty itself represented a “culture” passed from generation to generation. Rather, destitution stemmed from economic problems, not cultural deprivation. It inhered in demographic change that expanded the number of old people and of broken families, in racism that blocked the aspirations of minority groups, in the vicissitudes of a market economy. It followed from this economic, non-cultural interpretation of poverty that welfare must be expanded, jobs provided, civil rights protected.[41]
In the short run, the rediscovery of poverty had more visible political repercussions. President Kennedy, an activist, secured from Congress in 1962 legislation to assist so-called “depressed areas” of high unemployment, and a Manpower Development and Training Act to help able-bodied people to improve their skills. Congress also passed amendments to the welfare system. The most important of these extended the AFDC program by making aid available to intact families with unemployed parents (UP). Though optional (half the states refused to supply the matching money for it), AFDC-UP was an advance that promised to hold poor families together by offering them a form of unemployment relief. Under Kennedy’s prodding, Congress also approved the much-touted Public Welfare Amendments of 1962. These augmented federal funding for the training of social workers and authorized federal payment of 75 per cent of the cost to states of rehabilitative or preventive services to the needy. Kennedy hailed these amendments as “the most far-reaching revision of our public welfare program since it was enacted in 1935.”[42]
Aside from AFDC-UP, these ventures did not in fact do much for the poor. Most of the able-bodied needed decent employment, not training for jobs that scarcely existed. Most of the unemployables did not need services so much as they needed money, which neither Congress nor Kennedy was anxious to provide in large amounts. Indeed, none of these programs was generously funded. Like most Americans, Kennedy and his liberal aides still believed in prevention, not income maintenance. They hoped services would cause poverty to wither away. They were ready to give a hand up, but not a handout.
By far the most visible outcome of the rediscovery of poverty was the “war on poverty”. Led by Walter Heller, Kennedy’s chairman of the Council of Economic Advisers, high-level White House aides in 1963 developed careful studies of need and income distribution. These revealed the slowing down of economic growth in the late 1950s, and the necessity of government action to help the poor. President Johnson, inheriting the staff work, persuaded his then tractable Democratic Congress to approve a “war on poverty” in the summer of 1964, and named Sargent Shriver to head the Office of Economic Opportunity (OEO) that was to administer the program.[43]
Few government programs in modern American history enjoyed so much initial ballyhoo as the war on poverty. Reflecting a series of congressional compromises, it featured a range of activities, including loans for poor farmers and small businessmen, aid for needy college students, and the Volunteers in Service to America (VISTA), a domestic “peace corps” that sent idealistic young people into deprived areas. Its main programs, however, were the Job Corps, which set up training to develop skills among the poor, and the Community Action Programs. These CAPs, more than a thousand in all, provided federal money for a variety of community-based programs, among them legal services for the poor and Head Start, an effort to enrich the lives of pre-school children. In helping to design these programs, the poor were to have “maximum feasible participation.” That emphasis reflected a disenchantment that had already set in concerning the social-work bureaucracy and the “services” strategy of the 1962 Public Welfare Amendments, as well as the belief that existing federal agencies, ever battling each other, could not be entrusted with the money.
Phrases like “war on poverty” vastly exaggerated what was in fact a very modest skirmish. Like earlier government efforts, the OEO shied away from a WPA-style program of government jobs, and from income maintenance. These cost money, and were resisted in Congress. It stressed instead the old goal of prevention, by increasing “opportunity.” Shriver made that emphasis clear. “I’m not at all interested in running a hand-out program,” he said, “or a ‘something for nothing’ program.” Johnson emphasized that the war was to cut back on welfare, not to increase it. “We are not content to accept the endless growth of relief rolls or welfare rolls,” he said. “We want to offer the forgotten fifth of our people opportunity, not doles.” For these reasons, the OEO never gave poor people what they most needed: jobs and income.[44]
Community action also proved to be a dubious way to attack destitution. Most poor people, after all, suffered from broad economic maladjustments that transcended community limits. Even well-designed, innovative programs at the community level could make scant headway against such wider forces. Moreover, controversy soon developed over what was meant by “maximum feasible participation” of the poor. As federal money started to trickle into communities, local militants—many of whom were engaged in civil rights activity (and after 1965 in “black power” movements)—battled with established agencies, especially the social-work bureaucracy and state and local governments. When OEO sided with some of these militants, mayors rose in hot protest, and Congress responded by limiting the program planning powers of the poor. It also earmarked funds for “safe” programs, such as Head Start. By 1966, the counterattack by established political powers had largely curbed the militants.
By that time Johnson had tired of the program he had launched with such fanfare in 1964. Distressed by the political controversy surrounding community action, he was also absorbed in the Vietnam War. So he did not exert much effort on behalf of funding the programs. Lack of money, indeed, beset the Office of Economic Opportunity from the beginning. From 1965 to 1970, peak years of the agency, it received an average of around $1.7 billion per year. These amounts never comprised more than 1.5 per cent of the federal budget or one-third of one per cent of the Gross National Product. During those years the number of poor ranged from 25 to 33 million. If all the OEO money had gone directly to the poor—and it did not—each poor person in America would have received around $50 to $70 per year. Assessing the program, one expert concluded in 1970, “the war on poverty has barely scratched the surface. Most poor people have had no contact with it, except perhaps to hear the promises of a better life to come.”[45]
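The arithmetic behind that per-head figure is worth making explicit. A minimal check, using only the numbers cited above (an average OEO appropriation of about $1.7 billion a year and a poor population ranging from 25 to 33 million), runs:

\[
\frac{\$1.7\ \text{billion}}{33\ \text{million poor}} \approx \$52\ \text{per head per year}, \qquad
\frac{\$1.7\ \text{billion}}{25\ \text{million poor}} \approx \$68\ \text{per head per year},
\]

which brackets the $50 to $70 range given above, and even then only on the counterfactual assumption that every dollar reached the poor directly.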
That expert judgment was perhaps too harsh. The “war” brought the rediscovery of poverty on to the agenda for political action. It provided funding for programs, such as Neighborhood Legal Services, that soon challenged successfully the harsh practices of state and local welfare administrators. It helped locate and train upwardly mobile community organizers who sustained the pressure on city governments to do something about the ghettos. The “war” also prompted experts to study new and better ways of coping with need. As a result, by 1970 a rough consensus had developed among liberals behind income maintenance.
Still, the excessive rhetoric of the war on poverty was unfortunate. The gulf between promises and delivery quickly alienated many of the poor. Their expectations raised but not satisfied, they grew distrustful of established authorities. One planner noted sadly, “there was the assumption of regularly increased funding …. Promises were made that way …. the result was a trail of broken promises. No wonder everybody got mad and rioted.” While America’s urban riots of the late 1960s had deeper causes, notably racial discrimination, it was true that the war on poverty did as much to frustrate as to assist the poor.[46]
The rhetoric disillusioned planners as well. Some people who had initially supported the social welfare programs of the Johnson years came to regard the war on poverty as the classic example of what could go wrong with governmental efforts. Dismissing the modest gains that had been made, they concluded that Johnson tried to do too much too fast. They questioned thereafter the capacity of social scientists to plan, and of government to deliver, ambitious programs of social betterment. These doubts, which were aimed at American liberalism itself, persist into the 1980s. They have helped to stymie subsequent efforts at comprehensive welfare reform.
4: The Revolution in Social Welfare, 1965-1975
Despite the reaction against welfare programs, three dramatic developments changed the face of American poverty and social welfare between the mid-1960s and mid-1970s. First, a precipitous drop in the number of poor; second, a stunning enlargement of social welfare programs, especially social security; and, third, an explosion in the welfare rolls leading to what worried contemporaries thought was a “welfare crisis.”
The first development owed little to new welfare policies, but much to the spread of old ones and especially to the real economic growth of the 1960s. Thanks to that growth, the number of Americans defined as living below the government’s official poverty line decreased from 39 million in 1959 (or 21 per cent of the population) to 32 million in 1965 (17 per cent) to 23 million (11 per cent) in 1973. The vast majority of people who climbed above the line in those years were previously poor workers and their families. Those left behind were disproportionately members of minority groups, small farmers and farm laborers, old people, the disabled, and people in female-headed families.[47]
Some contemporary observers played down this improvement. The distribution of income, they insisted, remained largely unchanged in the postwar era, during which time the lowest fifth of income earners got only around one-twentieth of the national income. Liberals added that the official poverty line employed by the government was low. In 1969, for instance, it stood at $3,700 for an urban family of four, compared to the $6,960 that the Bureau of Labor Statistics thought should have been applied. If that higher line had been applied, 33 per cent—not the 12 to 14 per cent estimated by the government—would have been defined as “poor.” Liberals pointed also to the heightened sense of deprivation that lower-class citizens felt in an age of mass communications and rapidly expanding expectations.
But even reformers had to admit that the gains were real. For some historically poor groups, notably the aged, the improvement was particularly dramatic. In 1959, perhaps 40 per cent of Americans over 65 lived below the official poverty line; by 1974, the figure was only 16 per cent—not much higher than the national average. Studies at the time also suggested that a fairly small minority of the poor, around 20 per cent (or 5 to 6 million people in 1973), lived in chronic destitution. The rest escaped poverty from time to time. Optimists pointed out also that the official poverty line, though lower than others that might have been used, was nonetheless set at levels that allowed for considerably better living standards than had the rough poverty lines used in the 1930s and 1940s. In most material ways, poor Americans in the 1960s lived more comfortably than had their counterparts in earlier generations.[48]
America’s poor were also staggeringly well off by world standards. The per capita income of Harlem, the black ghetto of New York City, ranked in 1960 with that of the top five nations in the world. Blacks in Mississippi—among the poorest groups in the nation—had a median income in 1959 of $944, compared to a median for Puerto Rico of $819. Puerto Rico then ranked in the top quarter of the world’s nations in per capita income. In Harlan County, Kentucky, one of the country’s poorest, two-thirds of the homes in 1960 were considered “substandard” by the government. Yet 67 per cent had television, 42 per cent telephones, 59 per cent an automobile.[49]
The second major change of these years, in social welfare programs, was almost as remarkable. Public spending for social welfare (including public assistance and social insurance but excluding education and veterans’ benefits) increased at an annual rate of 7.2 per cent in constant dollars between 1965 and 1976. In 1960, such spending was 7.7 per cent of the GNP, in 1965 10.5 per cent, in 1974 16 per cent. These figures showed that the expensive war in Vietnam, however draining, did not prevent unprecedented increases in spending for domestic purposes. The government was providing both guns and butter.[50]
Liberal critics, of course, were quick to point out that most of these increases went for social insurance, and not for public assistance. The people getting the most help were not the neediest—minorities, female-headed families, the “undeserving”—but working people who had paid for their own social security benefits. Reformers complained also that social welfare programs in the United States still lagged behind those of other Western industrial nations—in coverage, in size of benefits, and in the spirit in which the aid was given. And critics lamented especially the continuing flaws in public assistance programs. As ever, the federal government gave states and localities no funds for general assistance, which continued to be woefully inadequate. The categorical assistance programs, including aid to families with dependent children, still featured wide state-by-state variations in coverage and in the size of benefits. In no state did such assistance alone come close to bringing poor individuals or families up to the level of the official poverty lines. One expert concluded in 1970, “the current Public Assistance system of the United States … deserves to go down in history with the British poor laws of the early Industrial Revolution.”[51]
While such critics correctly identified historic flaws in the system, they could not deny that the American social welfare programs, broadly defined to include social insurance, expanded dramatically in these years. That was especially true of social security, which distributed $16.6 billion to 20.8 million retired people in 1965, and $54 billion to 29.9 million nine years later. Moreover, thanks to Johnson’s efforts, Congress in 1965 added Medicare—health insurance for the aged—to social security. That became a vital—and expensive—part of the country’s social welfare system. Equally vital were other new and growing commitments—“in-kind” programs such as food stamps, which offered poor people relief in purchasing groceries, and Medicaid, which extended medical help mainly to people on categorical assistance. Food stamps, a new program that no one expected much from in the early 1960s, cost $36 million and went to 633,000 people in 1965. By 1975, food stamps assisted 17.1 million people. Medicaid, a federal-state program, was inaugurated in 1965; by 1975 it provided an estimated $9 billion in benefits to 23 million recipients. These in-kind benefits were means-tested, and went only to the needy. They greatly supplemented the cash benefits from programs like AFDC.
Experts who attempted to assess the impact of such spending on poverty concluded that it was considerable. Cash transfer payments removed about 38 per cent of the poor, or 5 million households, from poverty in 1965. By 1972, the percentage was 44 per cent, the number of households, 7.7 million. The development of in-kind benefits further alleviated poverty by the mid-1970s. Estimates placed the percentage of households pulled out of poverty by cash transfers and in-kind payments at around 60 per cent—nearly 15 million households—by the mid-1970s.[52]
It was not difficult to spot the chief reasons for these improvements. One was demographic. By far the most impressive gains in social spending were for the aged under social security and Medicare. Benefits under these programs were regularly increased—by 20 per cent in the election year of 1972 alone. In 1974 they were “indexed” so as to keep pace with inflation. Congress approved such policies in part because it still adhered to its faith in social insurance as an alternative to public assistance. It did so primarily, however, because old people were by then numerous, well organized, and politically powerful. Congress could not ignore them.[53]
The very maturing of America’s social welfare system further abetted the increase in social welfare of the 1960s and 1970s. Before then the United States, a latecomer to the field, had been struggling to expand coverage and benefits. Many old people still lacked the required amount of covered time under social security, and did not receive pensions. By the mid-1960s, however, America had been operating its social security system for thirty years, long enough to catch up with other nations in the field and to involve most workers in its system.
Political forces also contributed to the rise in social welfare spending at the time. Money for food stamps, for instance, escalated between 1968 and 1972 in part because President Richard Nixon felt obliged to respond to Democratic exposés of malnutrition in America, and because well-organized food retailers and producers applied pressure for the programs. Medicaid was approved in 1965 in part because state officials demanded federal help for medical aid programs they were then struggling to administer. Indeed, states-rights ideology—a potent force in the 1930s—was weak by the 1970s. However much states and localities cherished the ideals of decentralization, they found it increasingly difficult to cope with the manifold responsibilities thrust upon them, and they applied political pressure on the federal government for help. The growing force of such pressure groups accounted for the gradual nationalization of welfare.
The jump in social welfare spending depended especially on the country’s ability to pay. The economic growth of the 1960s and early 1970s was impressive. So was the all-important belief held at the time that America could afford increased social spending, that poverty could be eliminated without causing any deprivation to the middle classes. When Sargent Shriver said in 1966 that the United States “virtually could eliminate” poverty, he exposed a faith that had gathered strength during the previous decades of prosperity: the widespread conviction that the age of Malthusian scarcity had vanished forever. In appropriating unheard-of sums for social welfare, Congress reflected this surge in confidence.
The third major change of this period, the rise in the welfare rolls, was perhaps the most dramatic development of all. The number of Americans on categorical public assistance grew from 7.1 million in 1960 to 7.8 million in 1965 to 11.1 million in 1969 to 14.4 million in 1974. All of this growth came in the numbers on AFDC, which increased from 3.1 million in 1960 to 10.8 million in 1974. The percentage of poor people on AFDC was 13 per cent in 1965, 43 per cent in 1974.[54]
That surge in the rolls did not come because Americans suddenly jettisoned old and harsh stereotypes about the welfare poor, thus gladly opening up the rolls to people once excluded. On the contrary, polls taken in the 1960s and 1970s continued to show substantial majorities of Americans holding unflattering views of the poor and hostile attitudes toward welfare, which they yearned to cut back. As the welfare rolls swelled, people did not applaud the greater comprehensiveness of the system. Instead, they cried in alarm that the nation was suffering from a “welfare crisis” that if unchecked would bankrupt the country and rip into the moral fiber of the population.[55]
In fact, a host of social forces helped account for the increases in the rolls. Some of these were demographic. The baby boom of the 1940s and 1950s expanded considerably the numbers of children potentially eligible for AFDC. The mass migrations of poor people to the more liberal Northern and Western states also led to long-range increases in the rolls. Around three-fourths of the rise in case loads in the 1960s took place in nine Northern urban states. California and New York alone accounted for more than 40 per cent of it.[56]
Five other developments, especially in these liberal Northern states, were important in enlarging the welfare load. One was the approval in these states of AFDC-UP; this aid to unemployed parents accounted for perhaps 10 per cent of the increases. A second was the tendency of these relatively wealthy states to raise the income levels at which people became eligible for aid under AFDC. They did so because of changes in the matching grant formula, which after 1966 guaranteed that the federal government would pay at least half of whatever total sum each state paid to AFDC families, provided the state also offered Medicaid. That liberalized formula enticed states to raise income levels for eligibility, so as to maximize their access to federal dollars.
Changes in the law represented a third reason. Thanks in part to the role of OEO legal services and in part to the egalitarian temper of the times, lawyers at last mounted serious challenges to the hoary practices of state and local welfare administrators. These challenges reached the Supreme Court, which between 1968 and 1971 struck down the “absent father” rules and residency requirements, as well as regulations that had denied aid to families with supposedly “employable” mothers. These and other decisions improved the quality of treatment accorded the poor and diminished somewhat the great stigma that historically had attached to applying for aid.
The other two developments, most important of all, accentuated the significance of this rise in the pool of eligibles. One was a big jump in the percentage of eligible families that applied for aid. This increase reflected the much heightened awareness that potential clients had of their rights. The second development was a rise in the percentage of eligible applicants who were in fact granted aid. These two forces resulted in a fantastic jump in the participation of eligible families in AFDC, from perhaps one in three in the early 1960s to more than 90 per cent in 1971. For the first time in American history, the largest category of people eligible for assistance, AFDC families, was taking virtually full advantage of its opportunities.
What prompted this historic development? The most obvious source of it was changing attitudes of poor people themselves. Despite the hostility of the middle classes to increases in welfare, poor Americans at last refused to be cowed from applying for aid. Welfare, they were coming to believe, was a right. So was health care under Medicaid, the availability of which under AFDC clearly quickened the desire of poor families to secure assistance. Compared to the past, when poor people—harassed and stigmatized by public authorities—were slow to claim their rights, this was a fundamental change. The Depression-era dirge, “always going to be more poor folks than them that ain’t poor,” was as dated as the Model A Ford.
But why did the poor now become more assertive? One important reason was the civil rights movement that gathered momentum in the 1960s. It helped to arouse some of the ghettos, to train activists, and generally to heighten the sense of inequality and relative deprivation that gripped all low-income people, whether black or white. The civil rights movement, in turn, owed much of its strength to the broad reach of egalitarian thinking that affected most Western nations in the postwar age of escalating expectations, in which the poor—at last—were full and eager participants.
If the increase in the welfare rolls depended heavily on pressure from below, it also received support from the top. Thanks to growing pressures for aid from state and local officials, and from the poor themselves, federal bureaucrats looked aggressively for ways to expand their coverage and to manipulate the wording of formulas so as to maximize aid. As one leading official proclaimed in 1969, “you hatch it, we match it.” This attitude contrasted sharply with those of the conservative state and local officials who had played dominant roles in welfare in the 1930s. And fortunately for the poor, federal officials now had greater opportunity than before to direct the welfare state. This partial nationalization of welfare interacted dynamically with the pressures from below to make the granting of aid more humane. In this sense it was inaccurate to claim, as did radicals and conservatives alike, that liberalism was dead, that the “welfare bureaucracy” was the enemy, or that the federal government mismanaged all that it touched.[57]
5: Floors As Well As Doors
For most of the time prior to the 1960s, the gospel of prevention had possessed anti-poverty reformers. That faith had assumed many forms, including the war on poverty. After the 1960s, the faith persisted. But for many experts who explored ways of ending the scourge of poverty, it no longer commanded first place. Preventive efforts, they thought, had simply not succeeded. Given the structural problems inherent in modern industrial society, such efforts could never work wonders; there would always be poor people in need of welfare. So these experts changed their tack. Henceforth they endorsed income maintenance—floors under income, as well as doors to self-help. They were ready, willing, and eager to give a hand up. But if that did not work, they wanted to be there to give a handout as well.
This is not to say that the experts agreed on the means. Some favored children’s allowances, others public employment programs, others wage supplements. But by the mid-1960s a very rough consensus began to develop among many economists behind straightforward income maintenance which would give people cash assistance up to certain minimum levels. Liberal advocates, such as the British economist Robert Theobald, favored income maintenance as a basic right of citizenship. Conservatives, such as Milton Friedman, hailed the idea as an alternative to what they considered to be the wasteful system of welfare that then existed. In calling in 1962 for a “negative income tax” which would guarantee all families of four at least $1,500, Friedman gladly anticipated the abolition of the massive welfare bureaucracy in Washington.[58]
By the late 1960s support by experts for negative income taxes or floors under income had become widespread. In 1968, 1,300 economists at almost 150 institutions signed a petition urging Congress to adopt a “national system of income guarantee supplements.” Liberal social workers, though anxious to preserve most of the existing welfare programs, agreed with the idea, by then virtually an axiom among reformers, that all people had a right to a minimum income. Other reformers welcomed the chance offered by talk of floors to campaign for the further nationalization of public assistance and the elimination of state-by-state variations in coverage and benefits. In 1969 the so-called Heineman Commission, which had been established by President Johnson to investigate the feasibility of income maintenance plans, endorsed a “universal income supplement program … making cash payments to all members of the population with income needs.”[59]
Doubts about the workability of such plans of course persisted. Many observers refused to believe that the Internal Revenue Service, or any other Washington agency, could effectively manage such a colossal task. Social workers worried that advocates like Friedman were bent on putting them out of business, and on depriving poor families of the expert case work that social work provided. Some liberals feared that income maintenance plans would institutionalize a low income standard to which already inadequate wage rates would tumble. And many observers wondered how to preserve work incentives under income maintenance schemes. Why should low-income workers continue to try to get ahead, they asked, if nonworkers could get as much (or almost as much) from a government dole? Friedman and others answered by recommending sliding benefits which would decrease as earned income increased, but always leave the worker ahead of the nonworker. Doubts, however, were not easily dispelled, and the question of incentives continued to dominate debate over all such proposals.[60]
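The logic of those sliding benefits can be put as a worked equation (a schematic sketch: the $1,500 guarantee echoes Friedman’s figure quoted above, but the 50 per cent benefit-reduction rate is assumed purely for illustration):

\[
B = \max(0,\ G - rE), \qquad Y = E + B
\]

where \(E\) is earned income, \(B\) the benefit, and \(Y\) total income. With guarantee \(G = \$1{,}500\) and reduction rate \(r = 0.5\), a family earning nothing receives $1,500, while a family earning $1,000 keeps its wages plus a benefit of $1,000, for $2,000 in all. Because each dollar earned reduces the benefit by only fifty cents, total income always rises with earnings, and the worker stays ahead of the nonworker.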
Despite these problems, income maintenance in some form seemed by 1970 an idea whose time had come. Experts conceded that details remained to be worked out, but insisted, in keeping with the egalitarian temper of the time, that all people were entitled to minimum incomes. Where reformers in the 1930s had concerned themselves with equity—hence the insurance principles that underlay social security—the experts of the late 1960s spoke about “entitlements” and “rights.” They sought not only equality of opportunity but also greater equality of result. In this way, as in many others, the decade of the 1960s represented a break with the past.
Nothing made this development so clear as the willingness of the supposedly conservative Nixon administration to endorse a version of income maintenance in 1969. Nixon called for a Family Assistance Plan (FAP) that promised to guarantee all families with children a minimum of $500 per adult and $300 per child, or $1,600 per year for two-parent families of four. The plan also included provisions to preserve work incentives. Though it stopped short of universalizing income guarantees—money went only to poor families with children—it seemed a great step forward at the time.
Supporters of the plan noted that it promised especially to aid the poorest of the poor—people in the Southern states where welfare benefits were lower than $1,600 per year for families of four. The program also extended assistance to the working poor with children, whether female-headed or not. Impressed, The Economist asserted that FAP “may rank in importance with President Roosevelt’s first proposal for a social security system in the mid-1930s.”[61]
The plan secured approval in the House but then stalled in the Senate. In 1972, after three years of infighting, it failed of passage. Many forces contributed to its downfall. Northern advocates of welfare reform, including representatives of the ghetto poor, complained that the floor was much too low—lower, in fact, than AFDC benefits already being extended in non-Southern states. Other reformers lamented that FAP excluded poor people without children, perhaps 20 per cent of the poverty population. Critics deplored especially the “workfare” provisions that Nixon had included in the bill. These required adult recipients (save the aged, disabled, and mothers with preschool children) to accept “suitable” training or work, or forfeit their benefits. Proponents of the plan contended that the “workfare” provisions would not be used to force people into low-paying employment, indeed that workfare was largely rhetoric aimed at appeasing conservatives. Unpersuaded, liberal reformers were cool to the plan.
Many conservatives, too, opposed FAP. The Chamber of Commerce took out full-page newspaper advertisements that proclaimed, “FAP would triple our welfare rolls. Double our welfare costs.” They grumbled that it would deprive the economy of cheap labor and sap the work ethic. In devastating examinations of the plan, they showed that it could severely harm work incentives. They emphasized that any attempt to place a floor under income clashed with durable values and beliefs: in self-help, in state and local administration, in personalized management of social services. In dismissing FAP, the conservative opponents held to the familiar in their lives.[62]
At the last moment, advocates of welfare reform managed to secure passage of a measure that received much less attention than the controversial FAP. That was the Supplemental Security Income program. SSI, as it was called, was approved with little fanfare in 1972. It established an income floor under benefits paid to the less controversial categories of public assistance—the aged, blind, and disabled. These previously separate programs were henceforth administered as one, and funded entirely by the federal government. Most people who qualified got benefits considerably more generous than before 1972. Equally gratifying to reformers, SSI meant a uniform, national program, not the patchwork of varied state and local plans that had obtained until that time. To that considerable extent the advocates of floors achieved a long-sought goal.
Why did Congress approve SSI and scrap FAP? The reasons were instructive—and not very reassuring to advocates of floors. It did so first because SSI offered states some fiscal relief, and second because the aged, disabled, and blind were “deserving” in ways that “welfare mothers” were not. Third, SSI received little publicity before passage and did not stir up potential opponents. As with food stamps, Medicaid, and increases in social security, passage of SSI suggested that the best way of getting social welfare measures through Congress was to sell them quietly as modest, incremental improvements. Grandiloquent talk about “welfare reform” or “floors for all” alerted opponents, who mobilized effectively.
Advocates of floors under income for all (or at least for families with dependent children) therefore were not much gladdened by passage of SSI. Clearly, there still existed no popular consensus behind the principle of income maintenance, or even of national administration of AFDC. General assistance for “unemployables” remained wholly in the hands of states and towns. And the millions of working poor who did not qualify for categorical assistance in most cases received no help at all. Remedying these durable defects in the American social welfare system continued to be high on the liberal agenda in the late 1970s and early 1980s.[63]
But the 1970s and early 1980s were not conducive to major social reforms. President Jimmy Carter appealed for passage of a welfare reform measure similar to FAP, but got nowhere. Advocates of national health insurance or large-scale programs of public employment fared no better. On the contrary, they had to confront optimists who proclaimed with some justification that the economic progress of the 1960s had greatly diminished poverty, that social welfare spending (especially for SSI and social security) continued to increase as a percentage of GNP and of total government expenditures, and that welfare programs needed only incremental reform. The optimists added that most poor people knew of their enhanced rights, applied for aid, and received it—that the experiences that had cowed and stigmatized poor people in 1930 no longer took place.
At the same time, the combination of inflation and recession in the later 1970s gave ammunition to those who argued that the nation could not afford a heavy bill for welfare, and that economic recovery required government retrenchment and tax cuts to stimulate investment. Aggressive, self-confident conservatives further asserted that welfare was destroying the work ethic, even eroding the nation’s moral fiber. They clamored for cutbacks in food stamps, AFDC, and Medicaid and for tighter administration of relief. Advisers to President Ronald Reagan, who swept into the White House in 1981, even pushed to reduce spending for certain social security benefits—which since 1935 had been almost sacrosanct politically. Congress proved surprisingly sympathetic to such proposals, and supported cutbacks in the provision of welfare. Certainly, in the conservative mood of the early 1980s, middle-class attitudes traditionally hostile to public welfare seemed as strong as ever, and no new rediscovery of poverty—or liberal reform of welfare—appeared in sight.
6: Guide to Further Reading
Students interested in historical works dealing with poverty in America will do well to begin with Robert H. Bremner, From the Depths: The Discovery of Poverty in the United States (New York UP, 1956), a balanced, readable account of poverty from the 1830s to the 1920s. Another broad study is Paul Boyer, Urban Masses and Moral Order in America, 1820-1920 (Cambridge, Mass.: Harvard UP, 1978). Walter I. Trattner’s From Poor Laws to Welfare State: A History of Social Welfare in America (New York: Free Press, 1974) is a useful brief survey, while James T. Patterson, The Struggle Against Poverty in America, 1930-1980 (Cambridge, Mass.: Harvard UP, 1981) provides the most recent full treatment of the subject.
Books that deal with poverty and social welfare in the early twentieth century include Roy Lubove, The Struggle for Social Security, 1900-1935 (Cambridge, Mass.: Harvard UP, 1968), as well as his The Professional Altruist, 1880-1930 (1965);[12] Allen Davis, Spearheads for Reform: The Social Settlements and the Progressive Movement, 1890-1914 (New York: Oxford UP, 1967); and Daniel Nelson, Unemployment Insurance: The American Experience, 1915-1935 (1969).[14] The most important contemporary sources are Jacob Hollander, Abolition of Poverty (1914);[9] Robert Kelso, Poverty (New York: Longman’s, Green, 1929); and especially Robert Hunter, Poverty (New York: Macmillan, 1904), a classic account of poverty at the turn of the century.
The literature dealing with poverty and welfare in the 1930s grows more voluminous. Useful contemporary accounts include Edith Abbott, Public Assistance (1940);[1] and Josephine Brown, Public Relief, 1929-1939 (1940).[3] Studies of the poor include several thoughtful works by E. Wight Bakke, notably his The Unemployed Worker (1940).[27] See also such collections of case studies as Clinch Calkins, Some Folks Won’t Work (New York: Harcourt, Brace, 1930); Federal Writers’ Project, These Are Our Lives (1939);[17] Martha Gellhorn, The Trouble I’ve Seen (New York: Morrow, 1938); and Tom E. Terrill and Jerrold Hirsch, Such As Us: Southern Voices of the Thirties (Chapel Hill: North Carolina UP, 1978).
For federal policy in the 1930s, see Harry Hopkins, Spending to Save: The Complete Story of Relief (New York: Harper & Row, 1936); Donald Howard, The WPA and Federal Relief Policy (1943);[26] and National Resources Planning Board, Security, Work, and Relief Policies (1942). An intelligent monograph on the subject is Barbara Blumberg, The New Deal and the Unemployed: The View from New York City (Lewisburg, Pa.: Bucknell UP, 1979). Major sources on the early years of social security include Arthur J. Altmeyer, The Formative Years of Social Security (Madison: Wisconsin UP, 1966); and Edwin E. Witte, The Development of the Social Security Act (Madison: Wisconsin UP, 1963). For important general works on the 1930s, consult William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal, 1932-1940 (1963);[22] and William Stott, Documentary Expression and Thirties America (New York: Oxford UP, 1973). J. Wayne Flynt, Dixie’s Forgotten People: The South’s Poor Whites (Bloomington: Indiana UP, 1979) covers that subject competently.
A large body of works now exists on nonwhite poverty in America. Among the most accessible are Kenneth Clark, Dark Ghetto: Dilemmas of Social Power (New York: Harper & Row, 1965); and Lee Rainwater, Behind Ghetto Walls: Black Families in a Federal Slum (Chicago: Aldine Publishing Co., 1970). Elliot Liebow’s Tally’s Corner: A Study of Negro Streetcorner Men (Boston: Little, Brown, 1967), is an especially cogent work. For black life prior to World War II, the starting place remains the massive study headed by Gunnar Myrdal, An American Dilemma: The Negro Problem and American Democracy (New York, London: Harper, 1944), but see also Gilbert Osofsky, Harlem: The Making of a Ghetto (1966).[11]
Sources covering postwar poverty proliferated in the aftermath of the “rediscovery” of the poor in the 1960s. Students new to the subject might begin with Michael Harrington’s The Other America (1962),[34] which exposed the neglect of the poor, and with three excellent collections of essays by social scientists, Daniel P. Moynihan, ed., On Understanding Poverty (1968);[41] Louis Ferman, et al., Poverty in America (Ann Arbor: Michigan UP, 1965); and Jeremy Larner and Irving Howe, eds., Poverty: Views from the Left (New York: Morrow, 1969). For the “culture of poverty” debate, see Oscar Lewis, La Vida: A Puerto Rican Family in the Culture of Poverty—San Juan and New York (New York: Random House, 1966), and Charles A. Valentine, Culture and Poverty: Critique and Counter-Proposals (Chicago UP, 1968).
Of the many sources that focus on policy-making in the field of welfare and social security, 1940-1970, among the most readable and authoritative are those by James Sundquist, Gilbert Steiner, and Martha Derthick. See especially Sundquist, Politics and Policy: The Eisenhower, Kennedy, and Johnson Administrations (Washington, D.C.: Brookings, 1968), and his edited collection, On Fighting Poverty: Perspectives from Experience (New York: Basic Books, 1969). Steiner’s major works include Social Insecurity: The Politics of Welfare (Chicago: Rand McNally, 1966) and The State of Welfare (1971).[61] For Derthick, see Uncontrollable Spending for Social Service Grants (1975),[57] and Policymaking for Social Security (Washington, D.C.: Brookings, 1979). See also Winifred Bell, Aid to Dependent Children (N.Y.: Columbia UP, 1965); and especially Kirsten Grønbjerg, Mass Society and the Extension of Welfare, 1960-1970 (1977).[52] Piven and Cloward’s Regulating the Poor (1971)[12] is an unbalanced leftist critique of policy.
The literature on the “war on poverty” is larger than that ill-fated program merited. Students wishing to cut through the voluminous writing should begin with John Donovan, The Politics of Poverty (1973),[43] a solid account which establishes the political background, and with Robert Levine’s The Poor Ye Need Not Have With You (1970)[46] for an evaluation. They may then turn to Daniel P. Moynihan’s engaging but overstated critique of the war on poverty—and of contemporary social science: Maximum Feasible Misunderstanding: Community Action in the War Against Poverty (New York: Free Press, 1969).
Some sources carry the story into the 1970s. In addition to Derthick’s works noted above, consult the excellent overview by Henry Aaron, Politics and the Professors: The Great Society in Perspective (Washington, D.C.: Brookings, 1978); also Robert Haveman, ed., A Decade of Federal Antipoverty Programs: Achievements, Failures, and Lessons (Madison: Wisconsin UP, 1977); and Robert D. Plotnick and Felicity Skidmore, Progress Against Poverty, 1964-1974 (1975).[47] For the story of President Nixon’s Family Assistance Plan, see Vincent and Vee Burke, Nixon’s Good Deed: Welfare Reform (1974).[62]
Comparative studies offer some perspective on the American experience. These include Otto Eckstein, ed., Studies in the Economics of Income Maintenance (1967);[32] Heidenheimer, Heclo, and Adams, Comparative Public Policy (1975);[30] Hugh Heclo, Modern Social Politics in Britain and Sweden: From Relief to Income Maintenance (New Haven, Conn.: Yale UP, 1974); and Gaston Rimlinger, Welfare Policy and Industrialization in Europe, America, and Russia (1971).[16]
7: Notes
- Edith Abbott, Public Assistance, 2 vols. (Chicago UP, 1940), vol. 1, pp. 35, 125-36, 174-75, 220-23.
- G.V. Rimlinger, “American Social Security in a European Perspective,” in William G. Bowen, et al., eds., The American System of Social Insurance (N.Y.: McGraw-Hill, 1968).
- Roy Lubove, “Economic Security and Social Conflict in America,” Journal of Social History, 1 (1967-68), pp. 61-87 and 325-50; Josephine C. Brown, Public Relief, 1929-1939 (N.Y.: Henry Holt, 1940), p. 380.
- Abbott, vol. 1, pp. 190, 220-23; Sophonisba Breckenridge, Public Welfare Administration in the United States (Chicago UP, 1927), pp. 708-09.
- Howard Odum, “Public Welfare Activities,” in President’s Research Committee on Social Trends, Recent Social Trends in the United States (N.Y.: McGraw-Hill, 1933), pp. 1224-73; and U.S. Bureau of the Census, Historical Statistics of the United States, Colonial Times to 1957 (Washington, D.C.: U.S. Govt. Printing Office, 1961), p. 193.
- See Wesley Mitchell’s introduction, Committee on Recent Economic Changes of the President’s Conference on Unemployment, Recent Economic Changes in the United States (N.Y.: McGraw-Hill, 1929), p. xx.
- Stephan Thernstrom, The Other Bostonians: Poverty and Progress in the American Metropolis, 1880-1970 (Cambridge, Mass.: Harvard UP, 1973), pp. 243-44.
- F. Walker, “The Causes of Poverty,” Century, 55 (1897-98), pp. 210-16.
- Jacob Hollander, Abolition of Poverty (Cambridge, Mass.: Harvard UP, 1914), pp. 5, 16. For the Progressives, see J.A. Thompson, Progressivism (1979), the second pamphlet in this series.
- Maurice Leven, et al., America’s Capacity to Consume (Washington, D.C.: Brookings Institution, 1934), passim.
- Quoted in Gilbert Osofsky, Harlem: The Making of a Ghetto: Negro New York, 1890-1920 (N.Y.: Harper & Row, 1966), p. 149.
- Broad interpretations include William Ryan, Blaming the Victim (N.Y.: Vintage, 1976); Frances Fox Piven and Richard A. Cloward, Regulating the Poor: The Functions of Public Welfare (N.Y.: Vintage, 1971); and Roy Lubove, The Professional Altruist: The Emergence of Social Work as a Career, 1880-1930 (Cambridge, Mass.: Harvard UP, 1965).
- See Eugene Durman, “Have the Poor Been Regulated? Toward a Multivariate Understanding of Welfare Growth,” Social Service Review, 47 (1973), pp. 339-50; and Gerald Grob, “Reflections on the History of Social Policy in America,” Reviews in American History, 7 (1979), pp. 293-308.
- Daniel Nelson, Unemployment Insurance: The American Experience, 1915-1935 (Madison: Wisconsin UP, 1969), pp. 11-46.
- Daniel T. Rodgers, The Work Ethic in Industrial America, 1850-1920 (Chicago UP, 1978).
- Gaston V. Rimlinger, Welfare Policy and Industrialization in Europe, America, and Russia (N.Y.: Wiley, 1971).
- Federal Writers’ Project, These Are Our Lives (Chapel Hill: North Carolina UP, 1939), p. 366.
- National Resources Planning Board, Security, Work, and Relief Policies (Washington, D.C.: U.S. Govt. Printing Office, 1942), pp. 24, 130-33, 445-49.
- Unemployment Census File, Official File 2948, Franklin D. Roosevelt Library, Hyde Park, New York; Bernard Sternsher, “Counting the Unemployed and Recent Economic Developments,” in Sternsher, ed., The New Deal (N.Y.: Forum Press, 1979), pp. 101-03.
- Herman Miller, “The Dimensions of Poverty,” in Ben B. Seligman, ed., Poverty as a Public Issue (N.Y.: Free Press, 1965), pp. 20-51.
- Brown, Public Relief, p. ix.
- William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal, 1932-1940 (N.Y.: Harper & Row, 1963), esp. Chapter 6.
- Pierce Williams to Hopkins, 31 Aug. 1933, Hopkins papers, Roosevelt Library. See also James T. Patterson, The New Deal and the States: Federalism in Transition (Princeton, N.J.: Princeton UP, 1969).
- Hopkins’ speech, 14 March 1936, Hopkins papers; Samuel Rosenman, ed., Public Papers and Addresses of Franklin D. Roosevelt (N.Y.: Random House, 1938-50), vol. 5, pp. 19-21.
- Edwin E. Witte, The Development of the Social Security Act (Madison: Wisconsin UP, 1963).
- Donald S. Howard, The WPA and Federal Relief Policy (N.Y.: Russell Sage Foundation, 1943).
- See W.W. Bremer, “Along the ‘American Way’: The New Deal’s Work Relief Programs for the Unemployed,” Journal of American History, 62 (1975), pp. 636-52; and E. Wight Bakke, The Unemployed Worker: A Study of the Task of Making a Living Without a Job (New Haven, Conn.: Yale UP, 1940).
- Jules Berman, “Public Assistance under the Social Security Act,” Industrial and Labor Relations Review, 14 (1960), pp. 83-93; and Winifred Bell, Aid to Dependent Children (N.Y.: Columbia UP, 1965).
- Eveline M. Burns, “Where Welfare Falls Short,” Public Interest, 1 (Fall 1965), pp. 82-95.
- Arnold J. Heidenheimer, Hugh Heclo, and Carolyn Teich Adams, Comparative Public Policy: The Politics of Social Choice in Europe and America (N.Y.: St. Martin’s Press, 1975), pp. 195-96.
- Frank J. Bruno, Trends in Social Work, 1874-1956 (N.Y.: Columbia UP, 1957), pp. 309-10.
- Evaluations include H.J. Aaron, “Social Security: International Comparisons,” in Otto Eckstein, ed., Studies in the Economics of Income Maintenance (Washington, D.C.: Brookings, 1967), pp. 13-68. For a fuller treatment of these years, see J.T. Patterson, “Poverty and Welfare in Postwar America,” in Gary W. Reichard and Robert H. Bremner, eds., Reshaping America: Society and Institutions, 1945-1960 (Columbus: Ohio State UP, forthcoming in 1982).
- R. Nisbet, “The Decline and Fall of Social Class,” Pacific Sociological Review, 2 (1959), pp. 11-17. For a critique of views like these, see John Pease, et al., “Ideological Currents in American Stratification Literature,” American Sociologist, 5 (1970), pp. 127-37.
- Michael Harrington, The Other America: Poverty in the United States (N.Y.: Macmillan, 1962), p. 10.
- See Low Income Families and Economic Stability: Materials on the Problem of Low Income Families, Sen. Doc. 231, 81st Cong., 2d Sess., 1950; and Characteristics of the Low Income Population and Related Federal Programs, Joint Committee on the Economic Report, Subcommittee on Low Income Families, 84th Cong., 1st Sess., 1955.
- E.g., Richard Cloward and Lloyd Ohlin, Delinquency and Opportunity: A Theory of Delinquent Gangs (N.Y.: Free Press, 1960).
- D.J. Kallen and Dorothy Miller, “Public Attitudes Toward Welfare,” Social Work, 16 (1971), pp. 83-90; and J.R. Feagin, “America’s Welfare Stereotypes,” Social Science Quarterly, 52 (1972), pp. 921-33.
- E.g. Piven and Cloward, Regulating the Poor (1971).
- G.J. Stigler, “The Economist and the State,” American Economic Review, 55 (1965), pp. 130-33.
- James N. Morgan, Martin H. David, Wilbur J. Cohen, and Harvey Z. Brazer, Income and Welfare in the United States (N.Y.: McGraw-Hill, 1962), pp. 3-7.
- Daniel Moynihan, ed., On Understanding Poverty: Perspectives from the Social Sciences (N.Y.: Basic Books, 1968); and President’s Commission on Income Maintenance Programs, Poverty Amid Plenty: The American Paradox (Washington, D.C.: U.S. Govt. Printing Office, 1969).
- C.E. Gilbert, “Policy-Making in Public Welfare: The 1962 Amendments,” Political Science Quarterly, 81 (1966), pp. 196-224.
- John C. Donovan, The Politics of Poverty (Indianapolis: Bobbs-Merrill, 2d ed., 1973).
- “Poverty, U.S.A.,” Newsweek, 63 (17 Feb. 1964), p. 38.
- Joseph A. Kershaw, Government Against Poverty (Washington, D.C.: Brookings, 1970), pp. 161-69.
- Adam Yarmolinsky, in “Poverty and Urban Policy,” Conference Transcript of 1973 Group Discussion of the Kennedy Administration’s Urban Programs and Policies, Kennedy Library (Dorchester, Mass.), p. 302. A balanced evaluation is Robert Levine, The Poor Ye Need Not Have With You: Lessons from the War on Poverty (Cambridge, Mass.: Harvard UP, 1970).
- Robert D. Plotnick and Felicity Skidmore, Progress Against Poverty: A Review of the 1964-1974 Decade (N.Y.: Academic Press, 1975), pp. 82-83; R.J. Lampman, “Growth, Prosperity, and Inequality Since 1947,” Wilson Quarterly, 1 (1977), pp. 143-55.
- J.B. Williamson and K.M. Hyer, “The Measurement and Meaning of Poverty,” Social Problems, 22 (1975), pp. 652-62; Mollie Orshansky, “How Poverty is Measured,” Monthly Labor Review, 92 (1969), pp. 37-41.
- Herman Miller, “Changes in the Number and Composition of the Poor,” in Margaret S. Gordon, ed., Poverty in America (Berkeley: California UP, 1965), pp. 81-101.
- L.E. Lynn, Jr., “Policy Developments in the Income Maintenance System,” in Robert Haveman, ed., A Decade of Federal Antipoverty Programs: Achievements, Failures, and Lessons (Madison: Wisconsin UP, 1977), pp. 55-117.
- Levine, Poor Ye Need Not Have, pp. 185-86.
- Kirsten Grønbjerg, Mass Society and the Extension of Welfare, 1960-1970 (Chicago UP, 1977), ch. 3; R.D. Plotnick, “Social Welfare Expenditures: How Much Help for the Poor?” Policy Analysis, 5 (1979), pp. 271-89; M. MacDonald, “Food Stamps: An Analytical History,” Social Service Review, 51 (1977), pp. 642-58.
- Andrew Achenbaum, Old Age in the New Land: The American Experience Since 1790 (Baltimore: Johns Hopkins UP, 1978), p. 144.
- Lynn, “Policy Developments”; Frederick Doolittle, Frank Levy, and Michael Wiseman, “The Mirage of Welfare Reform,” Public Interest, 47 (Spring 1977), pp. 62-87.
- Joe R. Feagin, Subordinating the Poor: Welfare and American Beliefs (Englewood Cliffs, N.J.: Prentice-Hall, 1975), ch. 5.
- The analysis in these paragraphs relies on Grønbjerg, Mass Society, pp. 51-55, 167-68; R.A. Levine and D.W. Lyon, “Studies in Public Welfare: A Review Article,” Journal of Human Resources, 10 (1975), pp. 445-66; and Heather L. Ross and Isabel V. Sawhill, Time of Transition: The Growth of Families Headed by Women (Washington, D.C.: Urban Institute, 1975), pp. 17-18, 101-23.
- Martha Derthick, Uncontrollable Spending for Social Service Grants (Washington, D.C.: Brookings, 1975), pp. 20-22, 35-36, 71-72.
- Milton Friedman, Capitalism and Freedom (Chicago UP, 1962), pp. 190-95.
- Martin Anderson, Welfare: The Political Economy of Welfare Reform in the United States (Stanford, Cal.: Stanford UP, 1978), pp. 72-73; President’s Commission on Income Maintenance, Poverty Amid Plenty, pp. 7, 52-53.
- See M.N. Ozawa, “Issues in Welfare Reform,” Social Service Review, 52 (1978), pp. 37-55.
- Quoted in G.Y. Steiner, The State of Welfare (Washington, D.C.: Brookings, 1971), pp. 76-77.
- Vincent and Vee Burke, Nixon’s Good Deed: Welfare Reform (N.Y.: Columbia UP, 1974), esp. pp. 130-38, 155, 161-64.
- For an up-to-date account of persisting poverty and associated problems, see Philip Davies, The Metropolitan Mosaic: Problems of the Contemporary City (1980), the fourth pamphlet in this series, esp. pp. 10-17.
Robert H. Fossum & John K. Roth, The American Dream
BAAS Pamphlet No. 6 (First Published 1981)
ISBN: 0 9504601 6 3
- The American Dream: One and Many
- New Beginnings
i. Towards a New Youth
ii. We the People
- The Old World Yet
- A Shining Thing in the Mind
- Rugged Individualism
- Unalienable Rights
- “Where To? What Next?”
i. Reaffirmation or Negation?
ii. A Rebirth of Wonder?
- Guide To Further Reading
- Notes
British Association for American Studies All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: The American Dream: One and Many
Few terms are defined in so many different ways or bandied about more loosely than “the American Dream.” To some people, the term is a joke, an object of satire, derision, or contempt, a made-in-America label for a congeries of chauvinistic cliches mouthed by jingoists like the orator in e.e. cummings’ poem, “next to of course God America i.” To others, it merely signifies self-determined success, wealth, the “good life” of modish clothes, sports cars, and hot tubs—in a word, the latest thing touted by Madison Avenue. And to still others, less scornful or frivolous, it denotes a unique set of social and moral ideals. The United States may not be in many respects quite so exceptional as it believes itself to be. Nevertheless, the fact remains, as Lionel Trilling once remarked, that it is the “only nation that prides itself upon a dream and gives its name to one.”[1] Thus it is no accident that one recent bestseller challenged Americans to restore the Dream, while another asked them to reconsider whether its best aspirations have been fulfilled.[2] Best aspirations? What are they? In short, what is the American Dream?
The framers of the Declaration of Independence stated many of the Dream’s basic assumptions in 1776 when they asserted that “all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” In Letters from an American Farmer (1782), Hector St. John de Crèvecoeur enlarged this vision by describing the new republic as a place where a man could abandon “all his ancient prejudices and manners,” act “upon new principles,” and be “rewarded by ample subsistence.” Indeed, to Crèvecoeur the hopes that Philip Freneau and Hugh Henry Brackenridge had poetically expressed in “The Rising Glory of America” (1771), that the nation-to-be would constitute a new Paradise, were already virtually fulfilled: America was “the most perfect society now existing in the world.”[3]
According to Charles L. Sanford, however, “perhaps the most impressive early statement of the American Dream” was made in 1823 by Hugh Swinton Legare. Delivering a Fourth of July speech in Charleston, South Carolina, Legare declared that America’s goal was nothing less than to establish a democratic utopia of liberty, prosperity, and public virtue. Others, the most expansive of whom was Walt Whitman, aspired to similar goals in the decades following. Yet the term “American Dream” itself is of relatively recent coinage. One of the first to popularize it was the historian James Truslow Adams. In The Epic of America (1931), he referred to “that dream of a land in which life should be better and richer and fuller for every man, with opportunity for each according to his ability or achievement.” Adams believed that this dream might be “the greatest contribution” that the United States has “made to the thought and welfare of the world.” His very phrasing also suggested, probably inadvertently, what others had suspected: that America is not just the home of the Dream but the Dream itself.[4]
Although the foregoing descriptions capture some of the ideals (equality, liberty, prosperity, opportunity, public virtue) commonly associated with the American Dream, they omit others which have been equally influential. Furthermore, their very reliance on abstractions reduces complexity to a misleading simplicity. For the Dream is, and always has been, composed of many dreams; no single vision has ever totally dominated the American imagination. As even Ralph Ellison’s novel, Invisible Man (1952), puts it, “America is woven of many strands; I would recognize them and let it so remain …. Our fate is to become one, and yet many—This is not prophecy, but description.” Echoing the national motto—E Pluribus Unum—Ellison’s narrator here not only expresses both the unity and the diversity of American society; he implicitly acknowledges the complex, sometimes paradoxical nature of the Dream which that society has reflected and perpetuated. Whatever unity it possesses is delicate, its diversity undeniable.
Within the limited scope of this pamphlet, we could not begin to mention, let alone analyze, every feature of the Dream. Nor could we expect to point out all the ways in which they are interwoven; the warp and woof of the Dream are much too intricate for that. We can only hold the fabric up to the light, identify its major strands, and show as best we can how their recurrent interweavings have given it a subtle unity.
None of those strands is more persistent than a belief in new beginnings. This belief exemplifies better than any other the optimism—some would call it the naivety—of Americans and the fundamental reason why rhetoric about the Dream caught on in the United States. From their inception, American self-images reflected the idea that the past did not bind one irrevocably. Fresh starts could be made, tomorrow promised to be better than today, and progress seemed always to be possible. To speak of the American Dream, therefore, became one appealing way to epitomize the many principles and experiences that reinforced those hopes in American life. Individuals and groups have, of course, disagreed markedly as to the contents and priorities emphasized in their particular versions of the Dream, thus sparking controversies that make the American Dream not only an ideal to rally around but also a convenient target for criticism. The net result is that debates about the Dream infuse American identity so much that they are unlikely to cease.
Historically the motif of new beginnings is tied to the image of America as the New World, a potential New Eden in the West, as well as to common American attitudes toward history, opportunity, success and failure. The effect of the frontier, both as fact and as metaphor, must therefore be taken into account if the texture of the Dream is to be perceived. So too must American concepts of human nature, which in turn affect a people’s view of the purpose of government, of a citizen’s rights and responsibilities, of the proper relationship between the freedom of Whitman’s “simple separate person” and the welfare of his “En-Masse.”[5] These are all important patterns in the fabric, as we will show by examining (sometimes, out of necessity, only briefly mentioning) what representative novelists, philosophers, poets, theologians, politicians, and occasionally the voices of popular culture have had to say about them. Our approach will be primarily thematic, with due attention paid to historical development, because we believe that basic elements of the Dream persist from colonial times to the present, however much the forms of their expression change.
If America is indeed a “culture of contradictions,” as Richard Chase argues,[6] then one will scarcely be surprised to find these representative men and women not only disagreeing among themselves but demonstrating ambivalence toward certain aspects of the Dream. For although Americans—whether famous or obscure, extraordinary or ordinary—cherish and want to believe in their dreams, they are also frequently skeptical, even cynical, about realizing them. This has been true all along, because no actuality could ever exactly correspond to the lofty ideals America has set for itself. Faith and skepticism, conviction and uncertainty, wax and wane throughout American history. Our impression is, however, that the late twentieth century has seen a progressively intensified ambivalence at best, a sense of downright betrayal at worst, about a Dream which threatens to end in nightmare.
Ronald Reagan’s election released a flurry of words about a new American renaissance. In tones reminiscent of John F. Kennedy’s “New Frontier,” Lyndon B. Johnson’s “Great Society,” and similar slogans from others who have occupied the White House, President Reagan used his Inaugural Address on January 20, 1981, to assure Americans that they “have every right to dream heroic dreams.” To the rhetorical question “Can we solve the problems confronting us?”, his answer was “an unequivocal and emphatic yes …. We are Americans.” But will American experience in the 1980s produce “an era of national renewal,” or is it more plausible, as Archibald MacLeish suggested in Land of the Free (1938), that the West and all it stood for is “behind us now,” that “the dreaming is finished”? Like MacLeish, “We can’t say/We aren’t sure.”[7]
2: New Beginnings
At one point in Ellison’s Invisible Man, the protagonist thinks to himself: “You could actually make yourself anew …. All boundaries down, freedom was not only the recognition of necessity, it was the recognition of possibility.” Coming at a moment of optimism in a novel about the frustrations and injustices suffered by blacks in mid-twentieth-century American society, this euphoric statement derives its force from the reader’s recognition that it is so typically American. It reiterates one of those beliefs, intoned repeatedly over the course of the nation’s history, which persuade us that the American Dream possesses unity as well as diversity.
Towards a New Youth
Like so many visions stamped “American,” the dream of a new start did not originate in America. But in a sense it was bound up with a figurative America even before that continent was discovered. To be sure, Christopher Columbus sought to find a passage to India, and his backers were essentially interested in the riches of the East; still, Columbus imagined that a new kingdom of God, a terrestrial paradise, might be established in the land to which his navigational mistake had led him. In his imaginings, he thus resembled the other European dreamers who saw their fountain of youth, their New Atlantis, their El Dorado as existing somewhere in that Golden West which turned out to be America. The Atlantic was their frontier, somewhere beyond which lay a brave new world where men could start again. These hopes, however, were frequently dashed on reefs of conflict. Visions of personal gain clashed with those of communal association; and even where those tensions were lacking, the threats of disease and starvation in a supposed land of plenty were never far behind. Such gaps between dreams and realities persist. If America promises opportunity, that promise is not without mocking ironies.
Many of these early voyagers were Spanish and French, and their Catholicism left its mark on some aspects of American life. Even more decisive was the Protestantism of the Puritans. When they set foot on North American soil, what they found was hardly an earthly paradise. As William Bradford and Michael Wigglesworth both noted (and as Coronado’s western reports could have confirmed), it was a waste and howling wilderness. Nevertheless, here they could practice their religion as they saw fit. Unimpeded by pressures to tolerate or conform to alien ways, they would restore to Christianity the health which, in the Puritans’ opinion, it had lost in the Old World. Here, the Puritans were convinced, the human race had a divinely granted second chance at redemption. Thus, as late as 1742, Jonathan Edwards still found it probable that the “glorious work of God, so often foretold in scripture, which, in the progress and issue of it, shall renew the world of mankind . . . will begin in America.”[8]
Although Edwards was echoing prophecies uttered by earlier Puritans such as John Cotton, John Eliot, and Cotton Mather, Puritanism was not a single, fixed ideology. It was instead a far-reaching reform movement with diverse and even conflicting tendencies. The Separatists who established Plymouth Plantation did not see eye-to-eye with their cousins at Massachusetts Bay, who sometimes took as short a way with dissenters—Anne Hutchinson and Quaker “enthusiasts” come to mind—as the hated Archbishop Laud. Another dissenter, Roger Williams, moved on to found Rhode Island, a colony which eventually produced, according to Vernon L. Parrington, “a theory of the commonwealth that must be reckoned the richest contribution of Puritanism to American political thought.”[9] Providing for a separation of church and state, it placed an even higher premium on the integrity of conscience than did the other Puritan colonies.
In all of these colonies, however, the sense of covenant expressed in the Mayflower Compact was given substantial weight. As the Puritans’ conscience usually understood it, their charge was to “combine our selves togeather into a civill body politick” that would be an example for the rest of the world—a charge that America has taken seriously ever since. In John Winthrop’s view, their settlement ought to be “a model of Christian charity,” a “city upon a hill” influencing human destiny. Progress toward this goal entailed “a due form of government both civil and ecclesiastical,” the details of such a government, Winthrop believed, having been entrusted to Puritan men by God. To these idealistic principles, however, Winthrop’s colleague, John Cotton, felt compelled to add a realistic admonition: “Let all the world learn to give mortal men no greater power than they are content they shall use—for use it they will.”[10]
Winthrop and Cotton tried to sustain a consensus rooted in biblical principles. Their plan failed not only because everyone did not read God’s word exactly alike, but also because many of the early arrivals did not read God’s word at all. For only a minority of the colonial settlers dreamed of spiritual renewal, let alone Christian charity. Most of them, especially in the regions south of New England, simply wanted the economic success which had previously eluded their grasp. Others had fled English prisons or the threat of them. Still others, like the followers of the notorious Thomas Morton, saw the New England forest as a natural garden of sensual delights, a view which could not be tolerated by the Puritans. No matter what their particular aims, however, the early settlers shared one hope in common: that America would provide them with a fresh start.
Despite recurrent attempts to maintain or, later, to reinstate the Christian vision through Great Awakenings of one sort or another, even the Puritans’ dream of new beginnings was eventually transformed into more secular hopes of social, political, economic, psychological, even sexual rebirth. Benjamin Franklin’s thinking is a representative early example of the process. While paying lip service to God and virtue, Franklin clearly had his eye on material success: nothing is so likely to make a man’s fortune as virtue. Virtue is a means, worldly fortune the end. Furthermore, a man’s ability to change from Poor Richard to Rich Richard was contingent not on divine grace but on his determination to help himself. Tom Paine was equally disinclined to turn responsibility for the creation of a new order over to God. Illustrating the evolution of religious Protestantism into the political rebelliousness which culminated in the American Revolution, Paine asserted in 1776 that “we have it in our power to begin the world over again.”[11]
Whether or not this was “Common Sense,” the notion that America had re-created the world and engendered a unique species carried forward into the nineteenth century. In 1839, an editorial in the Democratic Review proclaimed that “our national birth was the beginning of a new history,” while in the same decade Ralph Waldo Emerson insisted that Americans had the power to establish an “original relation to the universe.” Henry David Thoreau, declaring that “every child begins the world again,” went to the woods to recover that original relation and youthful sense of wonder.[12]
Nature was the place to do it, or so Americans generally felt. Given that popular Romantic assumption and the extent of virgin land confronting them, the widespread belief that the American could be a New Adam, the nation a New Eden, is understandable. Poet of universal democracy and celebrant of the procreative urge, Walt Whitman was only the most rhapsodic of those who insisted that the American is an Adam “to the garden [of] the world anew ascending.”[13] Henry James, an ambivalent Romantic at best, slyly named his hero of The American (1877) Christopher Newman. Mark Twain’s best-loved books, The Adventures of Tom Sawyer (1876) and The Adventures of Huckleberry Finn (1884), for all their differences, both include nostalgic paeans to youth’s resilience. Twain’s Hank Morgan, in A Connecticut Yankee in King Arthur’s Court (1889), for a time is actually convinced that he can change the past. Transported in a dream to medieval England, Morgan tries to turn the latter into a counterpart of nineteenth-century industrial America.
Perhaps it is this very faith of Americans, like Emily Dickinson’s, that they “dwell in possibility,”[14] which accounts for the emphasis on children who change and develop, as narrators and principal characters, in so much American fiction—from Huckleberry Finn and Henry James’s What Maisie Knew (1897) to Carson McCullers’ The Heart Is a Lonely Hunter (1940) and J.D. Salinger’s The Catcher in the Rye (1951). Certainly the great majority of immigrants, early and late, have come here with that faith. Certainly the American love of mobility and of maintaining a youthful appearance is based on a similar faith. D.H. Lawrence was right: the “true myth of America” is a “sloughing of the old skin, towards a new youth.”[15]
We the People: A More Perfect Union
The American Dream is also indelibly stamped by the new beginnings Americans made in the Revolution, when they won both independent nationhood and opportunity to establish their own system of government. Tom Paine welcomed that Revolution by announcing that “the birthday of a new world is at hand,” but Alexander Hamilton wondered if Americans could indeed slough the old skin. As he said, they now had to “decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”[16] Although not all of its provisions were to his liking, Hamilton supported the Constitution drafted at Philadelphia in 1787, which represented the culmination of the Founding Fathers’ thinking on the problem and which still enshrines common attitudes toward government. Ratified a year later, that document joined the Declaration of Independence to form the keystone that sustains the unity of the American Dream. That unity, however, is like a precariously balanced arch because both the Constitution and its legacy make clear that the Dream is often markedly diverse as well.
The Constitution recognizes that Americans are not innately tolerant, that they are rather a coalition of minorities, each trying to escape the others’ bigotry. Thus, if that document is famous for anything, its emphasis on “checks and balances” may top the list. Knowing that even Americans are corruptible, aware that any group may become tyrannical after gaining power, the Founding Fathers separated power into legislative, executive, and judicial branches in the belief that no single faction would be able to gain control of all three branches at once. Hence the Constitution aimed at the best form for politics. Written constitutional restraints were the best way to preserve liberty against the tyranny of factions, which were acknowledged as inevitable.
In 1803 Chief Justice John Marshall could hold that America’s is “a government of laws, and not of men.” But already the Constitution’s provision for popular election of leading officials, whether directly or indirectly, was allowing a mass electorate to determine the character of Congress, of the President, and in the long run even of the Supreme Court. Thus the constitutional system became more democratic than had been originally intended and came to be identified with popular rule. Inevitably a real tension has existed, and still exists, between the fundamental principles set forth in the Constitution and its Bill of Rights and the will at any one moment of the majority of the people as expressed by their elected representatives. American government is very much a government of men as well as laws.[17]
“Justice,” says Madison, “is the end of government.” To which Hamilton adds: “the vigor of government is essential to the security of liberty.”[18] Justice and liberty, the collective good and individual freedom—how does American government resolve these tensions? On the surface, the balance may pose few serious problems. Americans agree that liberty should not permit one person to trample on another and that justice requires limits on freedom. To reject such principles is to return life not to an Edenic but to a savage state, a moral wilderness where isolation breeds both fear and egocentricity, and where insecurity and jungle law foster a ruthless violence. Yet precisely how are liberty and order to be reconciled, how the line between them drawn? What is the proper balance between individual liberty and collective welfare? These represent dilemmas within the Dream, and Americans have not always agreed on how to untangle them.
Nothing illustrates that reality more emphatically than the fact that repeated compromises failed to disprove Abraham Lincoln’s belief that the United States could not endure “half slave and half free.” Less than “four score and seven years” after the Declaration of Independence, a bloody Civil War was necessary to keep the Union intact and to certify, as the Thirteenth Amendment did after the War, that “neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.” Today the Civil War battlefields at Chancellorsville, Gettysburg, and Chickamauga are quiet memorials to Confederates and Yankees who fought to the death because they disagreed over the future of the Union and the place of slavery within it. Yet the calm of those places is deceiving. Regional differences remain. The South and the North, the East and the West, are not culturally identical, and discord still jars consensus about federal power. Some Americans adhere to Theodore Roosevelt’s conviction: “the betterment which we seek must be accomplished … mainly through the national Government.” Others agree with Paine, Jefferson, and Reagan that the best government is the one which governs least, for they suppose that the ‘natural laws’ of economics and social development operate beneficently. Strong suspicions of governmental power survive, and Americans rarely trust even elected officials completely. In fact, it sometimes seems that many Americans share George Bernard Shaw’s contention that “politics is the last refuge of a scoundrel,” an attitude confirmed by such political novels as Mark Twain’s and Charles Dudley Warner’s The Gilded Age (1873), Henry Adams’ Democracy (1880), and Joseph Heller’s Good as Gold (1979). This feeling has been heightened by the growth of an army of appointed bureaucrats who govern by administrative decision, and by continued concern that, as expressed by Dwight D. Eisenhower, an undue “influence, whether sought or unsought, by the military-industrial complex” also undermines democracy.[19]
Yet Americans still trust the structure of their government as it is constitutionally ordained, and they have not lost faith entirely in their leaders. If they have, how does one explain the outrage when their apparent distrust is proved valid? In the wake of Watergate, Gerald R. Ford announced: “Our Constitution works …. Here, the people rule.” If the latter is no self-evident truth, acceptance of Ford’s first appraisal holds good nonetheless. Americans continue to hope that leaders can be found to cure national ills, cling to their faith that the American form of government is worthy, and believe, as Carl Sandburg’s poem has it, that “the people, yes, the people” will ultimately dismiss those who betray the public trust.[20] Despite errors committed and disillusionments produced by the actual government of laws and persons, the Constitution still provides an indispensable preamble for the Dream of new beginnings by encouraging “We the People of the United States … to form a more perfect Union, establish justice, insure domestic Tranquillity, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity.”
3: The Old World Yet
Diversity within the American Dream is often typified by ambivalence, even within voices largely affirmative, about the extent to which new beginnings are possible. Although Mather and Edwards were hopeful that the latter-day glory would begin in America, they were still Calvinist ministers obliged to remind their parishioners that, God’s chosen people and inhabitants of a new land though they might be, they were nonetheless offspring of a fallen Adam, their redemption dependent on God’s grace. The Founding Fathers, children of the Enlightenment, emphasized human rationality. Within reason’s boundaries, man could go far to make what he wanted of himself, his surroundings, his institutions, for the God in whom Thomas Jefferson and his comrades trusted was a benevolent being. Still, they were not all as thoroughly convinced as Paine that man’s rational faculties, given free rein, would bring about universal regeneration.
Madison acknowledged that “the reason of man remains fallible,” that his “self-love” and his “unfriendly passions” may lead to “violent conflicts,” and that those passions must consequently be controlled. Hamilton saw a great deal of difference between the human nature of the rabble and the human nature of the propertied gentry, and hence anticipated Alexis de Tocqueville’s fears that majority rule would be tyrannical. George Washington, in his “Farewell Address” (1796), deplored the “baneful effects of the spirit of party,” but found that spirit “inseparable from our nature, having its root in the strongest passions of the human mind.” In short, the Founders were neither foolish nor without fears that their new-born experiment might founder.[21]
If some early Americans came to regard themselves as different from, or even better than, their European counterparts, others criticized that exceptionalist hypothesis. In the nineteenth century, for instance, no American writer was more ambivalent about the possibility of a new start than Nathaniel Hawthorne. A contemporary of the Transcendentalists, Hawthorne wanted to share their belief that humanity was naturally good, and he did believe, with Emerson, that every person contained all history. Humanity was a magnetic chain stretching from the past into, and no doubt beyond, the present; but to Hawthorne, as to the Puritans, the links were forged of evil as well as good. The Transcendentalists dreamed of a benign spiritual democracy in which the conflict between the freedom of the individual and the welfare of the collective was reconciled in an Oversoul which included everyone. Hawthorne found this dream implausible. He agreed with the Transcendentalists that man was capable of rational reflection. That very reflection, however, could lead to a violation of the human heart, as the scientific experiments in “The Birthmark” (1843) and “Rappaccini’s Daughter” (1844) demonstrate. Furthermore, the heart itself was like a cave or a forest: there was bright sunshine at its entrance, and if one penetrated deeply enough, one would reach sunshine again. But the heart also contained a tangle of evil passions inherited from the Old Adam. So Young Goodman Brown and other Hawthorne characters discover to their sorrow. Until “that foul cavern” is purified, the world “will be the old world yet.”[22]
Hawthorne also sympathized with those persons who believed that man could renew himself and that America provided the best conditions for doing it. Thus in The Scarlet Letter (1850) Hester Prynne exhorts the guilt-ridden Arthur Dimmesdale to discard the past and “begin all anew” in the Western forests. Ultimately she learns that this is more easily said than done. Thus too in The House of the Seven Gables (1851) Clifford Pynchon declares that time moves in an “ascending spiral curve,” that the past is only a “coarse and sensual prophecy” of an “etherealized, refined, and perfected” future, and that merely by destroying old houses society can purify itself. Although Hawthorne undoubtedly wished that Clifford were right, the novel as a whole demonstrates that Hawthorne’s expressed desire to write a book with some “cheering light” was engaged in a losing battle with his deepest convictions. A similar conflict is evident in The Blithedale Romance (1852), Hawthorne’s fictional account of his experiences at the Transcendentalist-inspired Brook Farm, one of the more than two hundred nineteenth-century utopian communities based on the notion of human perfectibility. The Blithedalers think it time to take up the Pilgrims’ “high enterprise” and create a paradise in the New England countryside. What they actually create is only a “cold Arcadia,” a “lifeless copy of the world in marble,” and even this is soon shattered by passions their rationality cannot control. In attempting to cut themselves off from the past, from a society they consider corrupt (just as, Hawthorne implies, America has tried to cut itself off from the fallen Old World), the Blithedalers also suffer from an isolation which distorts their moral perceptions.
Hawthorne’s last novel, The Marble Faun (1860), is set in Italy, where his naive American hero and heroine confront ancient evil for the first time. In the end, they return home, sadder and presumably wiser for the experience. Hawthorne ironically suggests, however, that they have not entirely abandoned their illusion (and that of the republic for which they stand) that evil cannot flourish in America as it can in Europe. That lingering hope is to Hawthorne as ingenuous as the fictional John Hancock’s assertion in “Old Esther Dudley” (1839) that Americans are entirely a people of the present. It is comparable to Robin Molineux’s assumption in “My Kinsman, Major Molineux” (1832) that he and his colonial compatriots can psychologically commit parricide and not suffer the consequences. For while Hawthorne admitted that the Revolution was both right and necessary, he wanted Americans to remember that their nation was planted in “soil . . . fertilized with British blood.”[23] Even if a new start were possible, he believed, it was bound to be a costly affair, since it fractured the temporal continuity upon which identity, personal or national, is so heavily dependent. Hawthorne’s own American dream was of a nation which fully recognized the mixture of good and evil, rationality and irrationality, in human nature; which accepted both the burdens and blessings of history; and which, without believing that the past entirely determined all subsequent events, acknowledged that there is no such thing as a sovereign present or a virgin future.
The other great mid-nineteenth-century American novelist, Herman Melville, shared Hawthorne’s skepticism. That he doubted nature’s moral structure, the natural man’s inherent goodness, civilized man’s ability to return to a primitive simplicity, is evident even in his first book, Typee (1846). Though in Redburn (1849) he could confidently say that America “shall see the estranged children of Adam restored as to the old hearthstone in Eden,” no such faith is discernible in his subsequent writings. In Moby Dick (1851) the fate of Captain Ahab (that most self-reliant, ambitious, and pioneering of all pioneers) is not to make himself and the world anew but to perish in a morally indifferent and unregenerated wilderness of ocean. The initially innocent Pierre (1852) dies disillusioned with himself and with a hopelessly ambiguous world. Melville’s last full-length novel, The Confidence-Man (1857), is a devastating attack on the blind trust of Americans given to self-deception. And in the posthumously published Billy Budd, Foretopman (1924), Melville concludes that in a man-of-war world the destiny of the New Adam is, as always, crucifixion.
Emerson is typically cited as contrasting with the skeptical Hawthorne and Melville. Yet his journals reveal that his concept of human nature was not always quite so sunny as his public pronouncements would indicate. Nor, it would seem, was he unshakeably convinced that Americans had already started the world over again; had he been, his call in “The American Scholar” (1837) for an intellectual declaration of independence from Europe need not have been so impassioned. Emerson had his ambivalences. So did Thoreau: witness his attacks on capitalistic exploitation in “Life Without Principle” (1863) and on government in “Civil Disobedience” (1849).[24] So did Whitman by the time of Drum-Taps (1865) and Democratic Vistas (1871); Twain by the time of A Connecticut Yankee, where Morgan’s final dream is of an escape from technological America back to Arthurian days; and Henry Adams, who, in The Education of Henry Adams (1907), doubted that the industrial Dynamo could adequately replace the Virgin as a symbol of spiritual energy and unity.
4: A Shining Thing in the Mind
Julian West, the hero of Edward Bellamy’s Looking Backward: 2000-1887 (1888), found himself transported into an American utopia only three generations removed from the squalor and strife of late nineteenth-century Boston. A bestseller in its day, Bellamy’s novel suggests that if doubt dampened American convictions, it could not stop the search for new beginnings. The Calvinist belief in human depravity and in the uncultivated forest as its counterpart was weaker than the Enlightenment faith that man’s reason could govern his passions and the Romantic assumption that nature offered spiritual and moral renewal.
Indeed, to Frederick Jackson Turner, who affirmed that “America has been another name for opportunity,” the presence of untrammelled nature in the West had an enormous effect on American dreams. In his influential essay, “The Significance of the Frontier in American History” (1893), Turner argued that “American social development has been continually beginning over again on the frontier. This perennial rebirth, this fluidity of American life, this expansion westward with its new opportunities, its continuous touch with the simplicity of primitive society, furnish the forces dominating American character.” Noting that America had a series of successive westward-moving frontiers from the first, Turner said that each one furnished an “escape from the bondage of the past; and freshness, and confidence, and scorn of older society.” In this seminal essay and others, Turner developed his thesis that frontier life fostered individualism, self-reliance and self-determination, democracy, faith in man, a penchant for discovery, and the courage to break new ground. Following Crevecoeur, Turner also stressed that on the frontier “immigrants were Americanized, liberated, and fused into a mixed race.”[25] In short, the frontier dream was the American Dream.
Turner’s thesis has been criticized for oversimplification, for idealizing the frontier and its settlers, for confusing reality with the metaphor of the Garden, and for inconsistently lamenting the passing of the frontier while praising it as a step toward a higher stage of civilization. Granted, other forces helped form the ideals which Turner mentions. Granted too, the frontier was sometimes less a garden than a desert, frontiersmen a disparate lot that no stereotype accurately describes. But to deny his frontier thesis altogether is to dismiss its considerable insight about the American Dream.
One need not take Turner’s word for it that the West meant an opportunity to leave one’s ancestry behind, that it encouraged a belief in self-reliance and self-determination. The frontier that inspired Romantic historians such as Francis Parkman and George Bancroft has been the home of America’s most indigenous heroes—Daniel Boone, Davy Crockett, Andrew Jackson, James Fenimore Cooper’s Natty Bumppo and his innumerable fictional descendants—and subsequently it spawned that most indigenous of American popular arts, the Western film. Buffalo Bill may be defunct, as e.e. cummings says, but John Wayne’s mystique lives on.
As for Turner’s inconsistency, it is not peculiar to him. Several decades before, Cooper struggled in his Leatherstocking tales with a comparable tension between the claims of nature and the claims of civilization, between a dream of the Garden and a dream of social evolution.[26] Later on, in O Pioneers! (1913) and My Antonia (1918), Willa Cather vacillated between admiring the simplicity and vigor of her immigrant pioneers and their agrarian way of life and lamenting the cultural deprivations of frontier existence. What about Western movies and Western heroes, for that matter? Aren’t they typified by a conflict between range and settlement, the lone cowboy and the schoolmarm, and their analogues within the individual and collective psyche? If America’s archetypal hero was the idealized frontiersman, his Doppelganger was the gunslinger, his heir the gangster, personifying that violence endemic to American life whose irruptions have disturbingly increased in our time. As scientific hypothesis, Turner’s argument may be faulty. As a metonymy of American dreams and their ambivalence, it rings true.[27] For if the closing of the frontier signified the onward march of civilization, Turner also saw that in the future the United States would face all the problems of the Old World.
Since the West has always had temporal as well as spatial connotations, however, its settlement did not put an end to the hopes invested in it. True, the idea of a new start in the West implied an escape from the corrupted past; but it also implied a return into a mythical prelapsarian past. And if, as Archibald MacLeish has said, “America is West . . . . A shining thing in the mind,”[28] in the American mind that place has frequently been some dream-like state in which an idealized past merges with an idealized future, a country in which anything is possible.
Probably the most dramatic rendering of America’s millennial dream of recovering, in the future, a time-transcendent moment from the past is F. Scott Fitzgerald’s The Great Gatsby (1925). The novel’s narrator, Nick Carraway, says that it is “a story of the West, after all,” and in many ways Jay Gatsby is a typical Western hero. A protege of Dan Cody, frontiersman, Gatsby starts out poor and then amasses an enormous fortune through his own self-reliant (if unscrupulous) shrewdness. In a sense he personifies America and the American Dream as Fitzgerald perceived them. Springing from a “Platonic conception of himself,” Gatsby has repudiated all but one part of his past in favor of a past, and consequently an identity, which he has invented for himself. The one part of his actual past which he has not repudiated is a golden moment when he fell in love with Daisy Fay Buchanan, who incarnates all of the youth and loveliness and wealth in the world to Gatsby. He lost her then, and everything he has done since has been aimed at recovering her.
Nick says at one point: “[Gatsby] talked a lot about the past, and I gathered that he wanted to recover something, some idea of himself perhaps, that had gone into loving Daisy. His life had been confused and disordered since then, but if he could once return to a certain starting place and go over it all slowly, he could find out what that thing was . . . .” The recovery of Daisy is supposed to restore the continuity of Gatsby’s life, put the broken pieces of himself together again, and, fully realizing at that future time his original idea of himself, arrest time at that moment. When Nick tells him that he can’t repeat the past, Gatsby replies incredulously, “Can’t repeat the past? . . . Why of course you can!” So it is with America, in Fitzgerald’s view, which continues to suffer from the illusion that sometime in the future it can fully realize an Ideal America envisioned in the past.[29]
Fitzgerald himself, though he brilliantly analyzed the illusion and came to know from his own emotional bankruptcy[30] that the Dream’s material goals were disappointing even if achieved, was nonetheless fascinated by their glittering superficial beauty. Like Gatsby, as a young man Fitzgerald fled the Midwest—in the novel a symbol of a more innocent America—to pursue his dream of success in the East. Although not necessarily for the same reasons, so did many other aspiring writers in the years immediately preceding and following World War I. Sinclair Lewis, Sherwood Anderson, Glenway Wescott, Carl Van Vechten (to mention only a few) left their Midwestern homes, which to them no longer seemed frontiers of opportunity but places of cultural stasis and moral hypocrisy. The new shining West of their minds became the East Coast where, as often as not, they wrote about the region they had left behind.
For some of them, however, the Eastern seaboard also proved unsatisfactory. The United States as a whole, they felt, had been pre-empted by the businessmen and industrialists. Their fabulous wealth not only had become the American Dream for young men such as Clyde Griffiths in Theodore Dreiser’s An American Tragedy (1925). Their ruthless power had also crushed the forces of progressivism and many of the dreamers as well. There was only one thing left to do: light out for another territory, in this case, paradoxically, the old world of Europe, which many of them had naively tried to “save for democracy.”[31]
Not everyone openly critical of an America of Hardings and Coolidges and Babbitts expatriated himself, of course, nor was every expatriate motivated by spiritual or aesthetic longings. The point is that a myriad of young Americans, however disillusioned with their native land, nevertheless retained their belief that in a world new to them but old in time they could make a new start.
5: Rugged Individualism
Part of the Dream has been the presumption that Americans can solve any problem. A “can do” confidence, a disposition to “get on with it,” has characterized their psychological tone. From time to time, such mettle has been bolstered by deterministic visions of the future. The Puritans saw the world as governed and predestined by Providence. In the more secularized 1840s a reassuring doctrine of “Manifest Destiny” spurred westward expansion. Later in the nineteenth century, the sociologist William Graham Sumner used his evolutionary vision of the inevitable motion of natural selection to develop a gospel of open entrepreneurial competition. Although writers like Jack London and Theodore Dreiser, along with Henry George in Progress and Poverty (1879) and Thorstein Veblen in The Theory of the Leisure Class (1899), painted less happy pictures of a society in which only the “fittest” survive, a conviction that Americans are elected to redeem the world also flourished in this era, dispelled (but only temporarily) in the aftermath of World War I.
Mixed with the sentiment that Americans are a chosen people, however, has always been a strong sense that much does depend on what individuals resolve or on what group decisions mandate. If Americans sometimes try to have things both ways, seeking personal responsibility and deterministic assurance of a favorable destiny at once, the prevailing mood during the past century (at least until quite recently) has tipped to the side of freedom of choice. No philosopher more effectively expressed that mentality than William James.
Writing at the turn of the century, James did not demur from Franklin’s Poor Richard—“God helps them that help themselves”—but neither did he believe that a world of genuine freedom could be so automatically pro-American as some vociferous nationalists supposed. No fortunes, James argued, can be told in advance. Choices yet to be made and actions still to be carried out decide what will happen. By no means, however, are fulfillment and salvation impossible, and if Americans will do their best, James believed, progress toward both can be made. Such an analysis was convincing. If American consciousness felt threatened by James’s dismissal of a guaranteed future, that anxiety was muted by confidence that Americans not only could but would do what was needed.
Americans have had more than a fair share of hope and optimism, feeling from the beginning that the greatest success is always yet to come. As James understood those qualities, he found them best embodied in a “strenuous mood,” which involves a deep desire to find lasting meaning, a passionate concern to relieve suffering and to humanize existence, and an urgent duty to develop and use one’s talents to the utmost. Compare that prescription with the recent Los Angeles Times description of a prominent Californian: “The Owner of a World Championship Team Lives an American Dream: He Parties at Hef’s Place, Dines at Great Restaurants and Dates Beautiful Young Women”. William James was no enemy of “the good life,” but this strenuous mood is not quite the one he had in mind. James would have found the Californian’s American dream a perversion of Cotton Mather’s conviction that material success is a sign of God’s favor, of Franklin’s belief that frugality is a virtue, even of Horatio Alger’s faith that a rags-to-riches prosperity leads to spiritual fulfillment.
In 1907 James was worried that “the picture-papers of the European continent are already drawing Uncle Sam with the hog instead of the eagle for his heraldic emblem.”[32] It was also James who said that success of a certain kind was a bitch-goddess. Freedom means opportunity; but the ways in which freedom can be exercised, the opportunities one chooses to grasp, the results one achieves—all of these are infinitely varied. Especially the results. As Dreiser dramatized in Sister Carrie (1900), and Fitzgerald and Lewis made clear in their novels, sometimes nothing fails to bring a sense of well-being so much as material affluence and what Veblen called “conspicuous consumption.” Too often, as Fitzgerald’s motto to The Beautiful and Damned (1922) has it, “The victor belongs to the spoils.”
John Dewey explored still other ways in which freedom can be employed. Writing at the onset of the Great Depression, he noted that “the United States has steadily moved from an earlier pioneer individualism to a condition of dominant corporateness.” Ironically, this circumstance occurred because the old “rugged individualism”—as Herbert Hoover called it—based on the Franklinean and Emersonian images of self-reliant, self-made pioneers, was once a dynamic dream: it spurred people to build huge businesses and industrial plants. Individuals obviously remain, yet the success of American capitalism has meant that many people lead quietly desperate lives as cogs in wheels that turn out products collectively. Dewey agreed: Americans are incorporated.
No return to a pre-industrial, pre-corporate stage is possible. Even if it were, Dewey would have rejected it, but the trade-off has not been inexpensive: “The problem of constructing a new individuality consonant with the objective conditions under which we live is the deepest problem of our times.” Dewey reiterated a familiar American plea when he called for “a new psychological and moral type,” while his plans sound as contemporary as those of a B.F. Skinner. Progress toward the goal, he enjoined, “can be achieved only through the controlled use of all the resources of the science and technology that have mastered the physical forces of nature.”[33] If machines have been in control, Dewey proclaimed, the opportunity exists to make them man’s servants. Everything depends on whether Americans will choose wisely in using the power of scientific methods.
Despite repeated reassurances that the American economy was fundamentally sound and prosperity right around the corner, “objective conditions” at the time Dewey wrote the above shook a number of American dreams with the force of an earthquake. Not since the Civil War had they been shaken so badly. Once again, there were “two nations,” as John Dos Passos put it in his trilogy, U.S.A. (1938): not North and South this time, but the haves and the have-nots. And once again “machines” of one kind or another had a great deal to do with it.
To John Steinbeck, whose novel The Grapes of Wrath (1939) is the most moving fictional document of the period, the ultimate machine was American capitalism itself. In taking monopolistic control of land and the means of production, capitalism had rendered American individualism obsolete, made a mockery of opportunity, and reduced an entire class to a condition of impotent slavery. The only answer, as Steinbeck saw it, was for working men to organize themselves into a body as cohesive and powerful as that of corporate ownership itself; in short, to strengthen the labor unions. Before that class consciousness could develop, however, they had to recognize as illusions those American dreams which told them that any man willing to work could find work; that ambition, industriousness, and competence inevitably brought success; and that in a land of plenty, especially in the lush Canaan of California, no one could possibly go hungry. Steinbeck’s dispossessed Dust Bowl farmers eventually abandon their illusions and extend their concept of family to include everyone who shares their plight. Yet they, and obviously Steinbeck himself, retain other long-standing American beliefs: a belief in the essential goodness and rationality of the common man; a belief in his ability to govern himself, either to correct his institutions or to overturn them; and a belief that those who live close to nature are spiritually nourished by it. In The Grapes of Wrath, the old dream of a more perfect union, so to speak, still lives.
Dos Passos was less sanguine. Equally devoted to the ideal of Whitman’s “storybook democracy,” he steadily lost faith that it could be achieved. U.S.A., written in the Thirties but a fictional chronicle of the nation from 1900, shows the conditions prefiguring the Depression: a change in the historical mood from one of progressivism and reform to one of development by reckless spending; war-profiteering; individuals exploited by corporations; protesting workers set upon by police and hired strike-breakers; two ignorant immigrants tried and executed for murder on flimsy evidence, their only proven “crime” being their professed anarchism. But Dos Passos lacked Steinbeck’s faith in the power of the common people to endure and prevail. By the end of the trilogy, his workers and reformers have all been either killed or broken or turned into vagrants or, perhaps worst of all, themselves been corrupted by dreams of power or the “big money.” Dos Passos eventually became skeptical about organizing them, too. By the late 1940s, in The Grand Design (1949) and subsequent books, the one-time leftist sympathizer had concluded that, be it the Communist Party or a large union or a government agency, any big organization demands that its members sacrifice individuality to the will of its leaders, who by virtue of the power invested in them are eminently corruptible. It is a conclusion typical of an American mentality.
Critical of American capitalism as they were, neither Steinbeck nor Dos Passos ever joined the Communist Party. Many American intellectuals did, however, during what Leo Gurko called “the angry decade.” And like Dos Passos, a great many others were at least lured for a time by the Marxist vision, its European origins notwithstanding, having been persuaded that Communism might be the best way of realizing American dreams of opportunity, liberty, justice, and equality. To them it was, after all, the ultimate version of the Protestant Ethic, the supernatural trappings (but not the fervent dedication) removed: to labor is to be saved, not in heaven but in an earthly paradise. That “dream of the golden mountains,” as Malcolm Cowley labels it, was blasted for most of its adherents by the Moscow trials and the 1939 Nazi-Soviet pact. But for a while, helped along by fears that the Fascist specter haunting Europe threatened America as well, Communism was the manifest content of some latent American visions.[34]
A combination of Franklin D. Roosevelt’s New Deal and the economic consequences of World War II finally ended the Depression, ushered in a new prosperity, and restored America’s confidence in opportunity. Rather than reviving individualism, however, it appeared to strengthen the corporate nature of American life and to encourage conformity. Sociological studies such as David Riesman’s The Lonely Crowd (1950) and Individualism Reconsidered (1954), Peter Viereck’s The Unadjusted Man (1956), and William H. Whyte’s The Organization Man (1956) all attested to, and protested, this demise of individuality.
A century earlier, in his famous essay “Self-Reliance” (1841), Emerson had said: “Whoso would be a man, must be a nonconformist.” Many Americans now apparently felt that manhood was not worth it. The price might well be a place on Senator Joseph McCarthy’s list of “un-American traitors.” If the national heroes of an earlier time had been men alone—Cooper’s Leatherstocking, Melville’s Ahab, Hemingway’s Frederick Henry, or the real-life Charles Lindbergh, “The Lone Eagle”—the new hero was not heroic at all but a cautious company employee, a team player, a man in a gray flannel suit, whose aim was security rather than individual freedom. Fighting for freedom was a wartime activity; for that matter, the war had predictably convinced most servicemen that the military had even less room for individualism than American civilian society. Novels such as Norman Mailer’s The Naked and the Dead (1948) and James Jones’s From Here to Eternity (1951) made that abundantly clear.
Yet the very fact that Viereck, Mailer, Jones, those who refused to bend the knee to McCarthyism, all lamented the demise of individualism signified that its spirit was not entirely dead. Far from it. Americans continue to prize security; they always have. But they also vehemently resist any “invasion of privacy,” continue to fear the encroachment of “big government,” and consider excessive conformity to be (in one of Mailer’s favorite metaphors) akin to cancer.
Protests against those forces, which Saul Bellow’s Herzog (1964) feels have “devalued the person . . . owing to the multiplied power of numbers which made the self negligible,” have taken many shapes. Generally subdued during the Eisenhower years, they flared flamboyantly during the 1960s: draft resistance, anti-war demonstrations, middle-class drop-outs, distrust of the “establishment” in general, increased ethnic pride and assertiveness among minorities, especially Blacks. The forms of protest were in some cases self-defeating, sometimes not. The crucial point is that individual freedom of choice is evidently still part of the Dream. Perhaps it is a part no longer realizable for many. Yet, whether unrealistically or not, most Americans would doubtless assume that it is among the things that make life worth living. Take away their freedom of choice? With James Purdy’s Malcolm (1959), they might well respond by saying “Keep your hands off my soul!”
Whether freedom to choose does indeed make life worth living is one of the questions raised by William Styron’s recent novel, Sophie’s Choice (1979). Styron had raised the same existential issue in an earlier novel, Set This House on Fire (1960), and answered it affirmatively. His answer in Sophie’s Choice is more typical of recent America in one sense: it is more ambiguous. For in telling the story of Sophie Zawistowska, resident of Brooklyn, Styron shows that sometimes freedom to choose can make life unbearable.
A Polish Catholic prisoner at Auschwitz in 1943, Sophie was forced by an SS official to choose which of her two children, Jan or Eva, should be sent to the gas chambers. “ ‘Ich kann nicht wählen!’ she screamed.” Sophie could not choose. Then, so as not to lose them both, she let Eva go. Limited though it was, Sophie’s choice was real. So was her sense of guilt. Set free in 1945, she found her way to the United States, but liberation left her imprisoned. Sophie found inescapable the conclusion that her own life, even in America where she hoped to start anew, was not worth living. In 1947 she let it go, also by choice.
Stingo, the white boy from a Presbyterian South who narrates the novel, cannot prevent her suicide. But Stingo endures, having learned much about himself, about American racial guilt, about his own American Dream. Three fragments from a journal he kept in 1947 form the novel’s conclusion. “Someday I will understand Auschwitz”—that vow, Stingo reflects years later, is (like so many American dreams) “innocently absurd.” “Let your love flow out on all living things”—that one is worth saving “as a reminder of some fragile yet perdurable hope.” Finally, some poetry: “’Neath cold sand I dreamed of death / but woke at dawn to see / in glory, the bright, the morning star.” Faced with a choice between hope and despair, Stingo the American chooses hope. If freedom to choose destroyed Sophie, Stingo will resist a similar fate only by using choice against itself in a struggle to make life more worth living, not less.[35] Articulated more somberly, the Dream that individual freedom to choose makes life worth living still resounds even after Auschwitz. Dreams die hard in America.
6: Unalienable Rights
Had Sophie Zawistowska been a Jew, she would have had no choice. For Hitler’s racism and the power of his Nazi state destined all Jews for extermination. Such facts led Richard L. Rubenstein, in The Cunning of History: The Holocaust and the American Future (1978), to question the “truths” that Thomas Jefferson taught Americans to hold “self-evident.” None of those truths is more crucial to the American Dream than the claim that men are endowed “with certain unalienable Rights.” Those rights, Jefferson believed, are not merely legal privileges that people grant or deny to each other as they please. Rather, such rights are “natural.” As part and parcel of what is meant by human existence, they belong equally to all men and presumably cannot be violated with impunity. Nonetheless, the sense in which rights are unalienable is an elusive part of Jefferson’s Declaration, which states that “to secure these rights, Governments are instituted among Men.” Apparently unalienable rights are not invulnerable; but if they are not invulnerable, then in what way are they unalienable?
One answer could be that to speak of unalienable rights is to speak of conditions of existence so basic that they ought never to be abrogated. But what ought to be and what is are clearly very different things, for rights to life, liberty, and the pursuit of happiness are qualified repeatedly, even by governments that seek to secure them. More importantly, the functional status of unalienable rights is profoundly questioned by realities like Auschwitz. In Rubenstein’s words, the Holocaust and related instances of state-sponsored population elimination suggest that “there are absolutely no limits to the degradation and assault the managers and technicians of violence can inflict upon men and women who lack the power of effective resistance.” True, some people believe that certain rights must not be usurped. Still, if those rights are violated completely, and all too often with impunity (and they are), how can they be called unalienable? Is that not one more American illusion, an instance of rhetoric obscuring reality? A much more credible proposition, Rubenstein contends, is that “rights do not belong to men by nature. To the extent that men have rights, they have them only as members of the polis . . . . Outside of the polis there are no inborn restraints on the human exercise of destructive power.”[36]
If Rubenstein’s view recalls that of Henry James, Sr., in Society the Redeemed Form of Man (1879), it also contrasts with the idealism of Theodore Parker, the nineteenth-century Boston preacher who advocated abolition, women’s liberation, and human rights in general. Like his fellow Transcendentalists, Parker had a vision of humanity and, more particularly, of America which was promising, to say the least. In a speech entitled “The Political Destination of America and the Signs of the Times” (1848), he declared: “The most marked characteristic of the American nation is Love of Freedom; of man’s natural rights . . . . We have a genius for liberty: the American idea is freedom, natural rights. Accordingly, the work providentially laid out for us to do seems this—to organize the rights of man.”[37]
Parker’s vision reflects assumptions that have been at the heart of the American Dream since the nation’s birth. They include beliefs that the most basic human rights are a gift of God and that nature and reason testify to a universal moral structure which underwrites them. But what if there is no God? What if nature is amoral? What if reason insists that the most self-evident truth of all is that history is a slaughter-bench, a place where unalienable rights are not worth the paper they are written on—unless political might protects them?
Such questions have crossed American minds in the past, but in a post-Holocaust age they test American optimism more severely than before. For it is no longer clear that anything but human power secures a person’s rights, and if rights depend on human power alone, then they are natural and unalienable in name only. In such circumstances, to call rights unalienable may still be a legitimate rhetorical device to muster consensus that certain privileges and prerogatives must not be taken away. No doubt the idea of unalienable rights functions precisely in that way as an ingredient of the American Dream. But idea is not fact, dreams do not always correspond to waking life, and Americans seem increasingly aware that rights are functionally unalienable (which is all that may count in the long and short of it) only within a state that can successfully defend and honor them as such.
“We all declare for liberty,” Abraham Lincoln said of Americans in 1864, “but in using the same word we do not all mean the same thing.”[38] His proposition holds for ‘rights’ as well, and not only because Americans have diverse philosophical assumptions, implicit or explicit, about whether rights are unalienable. Disputes also arise because the American Dream means many things to many people when unalienable rights are translated into legal or civil rights.
If life, liberty, and the pursuit of happiness are at the top of America’s list of rights, that very agreement paradoxically keeps Americans at odds because one person’s liberty can mean another’s exploitation, and one individual’s pursuit of happiness may rob another of opportunity or even of life itself. In theory every American’s rights are equal; therefore, the respect owed to every citizen ought to create a balance in which rights can be freely exercised without doing violence to each other. Laws that establish the parameters of rights within the state are intended to put that theory into practice. The Bill of Rights of 1791, the first ten amendments to the Constitution, is a key example. So is the Fourteenth Amendment, which provides that no state shall “deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.” Still, some Americans’ rights and their protection remain much more equal than others’.
Tensions within the Dream are exacerbated in another way as American expectations transmute rights guaranteeing opportunity into entitlements guaranteeing results. For example, in his speech on the “Four Freedoms” (1941), Franklin D. Roosevelt invited Americans to look forward to a nation, indeed to a world, in which there would be not only freedom of speech and worship but also freedom from fear and want. His vision, he argued, was not of “a distant millennium,” but of a “world attainable in our own time and generation.”[39] Roosevelt’s dream did not come true. It failed largely because, depending on self-interest as well as on moral principles, freedom from want can be defined in as many different ways as ‘liberty’ or ‘rights.’ What constitutes ‘want’? And to what extent is freedom from want, however defined, anyone’s ‘right’? One man’s want is another man’s luxury. Repeated attempts are made to define ‘true need’ in quantifiable terms; legislative and judicial decisions seek to establish who is entitled to what; but diverse opinions about the boundaries between rights and privileges, opportunities and entitlements, continue to keep Americans contentious.
In 1963 John F. Kennedy asserted that “every American ought to have the right to be treated as he would wish to be treated, as one would wish his children to be treated.”[40] Ironically, all Americans can affirm Kennedy’s principle and in so doing actually intensify their disagreements. Kennedy offered his principle in defense of civil rights for Blacks, but it is a two-edged sword: it can cut in the opposite direction, depending on how Americans view each other and how they wish themselves and their children to be treated. On that point they often disagree profoundly. Witness the current issue of school busing. Witness, too, the current issue of abortion. Americans are divided between those who believe that abortion is murder (the violation of an unalienable right to life) and should therefore be banned by constitutional amendment, and those who believe that to make abortion illegal is to deny a woman her right to liberty and the pursuit of happiness.
The American Dream originated in a struggle over human rights, and true to its heritage, the Dream keeps that struggle in the forefront. Once, the doctrine of “Separate but Equal” was thought sufficient to guarantee the rights of Blacks. It proved insufficient, nor have subsequent legal changes done the trick. Once, women did not have the right to vote; today, they do. But it is highly doubtful that even the passage of the Equal Rights Amendment would do more to guarantee their rights than similar measures have done for Blacks. Laws do not automatically change attitudes or alter social and economic realities. The voices protesting the “dream deferred,” in Langston Hughes’s words,[41] attest to that. Paradoxically, though, the very persistence of such voices suggests that the American Dream of human rights is very much alive. For in spite of impediments, the protesters continue to hope with Martin Luther King, Jr., that it is still possible to “speed up that day when all of God’s children, black men and white men, Jews and Gentiles, Protestants and Catholics, will be able to join hands and sing in the words of that old Negro spiritual, ‘Free at last! Free at last! Thank God almighty, we are free at last!’”[42] To them, as Thomas Wolfe expressed it in You Can’t Go Home Again (1940), the “true discovery of America is before us.”
7: “Where To? What Next?”
So asked Carl Sandburg in The People, Yes (1936). Four decades later, another poet, Adrienne Rich, offered an answer by saying in “From an Old House in America” (1975) that “we are in the open, on our way.”[43] Although Rich, an ardent feminist, is referring specifically to women, her line also articulates a typical, more inclusive American faith in the future. Yet have such confident cries become less frequent, more tentative and muted, in the decades since the end of World War II? As usual, one can marshal evidence both for and against, sometimes within the same piece of writing; but the nearer one comes to the present, the more exceptional Rich’s statement becomes.
Reaffirmation or Negation?
Compared to the turbulent 1960s, the 1950s may now seem to have been a complacent decade. But some representative literary works of the period show that not all Americans, at least among the nation’s imaginative writers, contemplated the future with unqualified optimism. True, Jack Kerouac’s account of his journey across the continent, On the Road (1957), echoed Whitman’s celebration of America’s body and soul alike, yet there is an underlying sadness in the book, an almost hysterical tone to the celebration. Kerouac’s fellow “beatnik,” Allen Ginsberg, is more blatantly hysterical and surely less ambiguous in his poem “Howl” (1956), which denounces America for turning Whitman’s dream into a “Nightmare of Moloch” and destroying “the best minds of my generation.”[44] The hero of Saul Bellow’s The Adventures of Augie March (1953), a modern urban combination of Whitman’s pioneer and Twain’s Huck Finn, is an apparently irrepressible optimist who hopefully lights out for one territory after another to maintain his autonomous identity. Ambiguously enough, however, he ends up in Europe, and the jubilant tone of his final words seems barely to disguise a desperation verging on despair: “Look at me, going everywhere! Why, I am a sort of Columbus of those near-at-hand.” At the end of the decade, Philip Roth also invoked the discoverer of America in Goodbye, Columbus (1959). But the new world which Neil Klugman, Roth’s protagonist, discovers and ultimately bids farewell to is hardly the terrestrial paradise which Columbus envisioned—unless paradise is an upper middle-class suburb where sensual gratification and affluence demand the sacrifice of personal and ethnic identity.
By 1961 Roth was so pessimistic about his country that he doubted whether it could any longer be the subject of the writer’s art: “it stupefies, it sickens, it infuriates and finally it is even a kind of embarrassment to one’s own meager imagination.” Roth has, of course, continued to write about it, as have others. Still, it is significant that much American fiction of the past twenty years, continuing a trend that began in the 1950s, has concentrated on inwardness, as if to say that objective American reality is either too incomprehensible or detestable, its future too bleak or terrifying, to contemplate. Indeed, such novelists as Thomas Pynchon, John Barth, and Donald Barthelme imply that America’s social reality is now so absurd as to have exhausted the possibilities of mimetic representation. As an alternative to that reality, they offer verbal constructs in which words are almost entirely reflexive, pointing only to the context in which they appear, or in which the absence of conventional plot, character, and logical relationships constitutes both a parody and a wry repudiation of an America where dream has become nightmare.[45]
Other contemporary writers, some of them still committed to relatively traditional ways of recording social conditions, do not on the surface appear much more optimistic. Black authors such as Chester Himes, Amiri Imamu Baraka, and James Baldwin have picked up where Richard Wright in Native Son (1940) and, earlier, W.E.B. DuBois in The Souls of Black Folk (1903) left off, reminding us that whereas the American dream of equality of opportunity has tickled Blacks with hope, it continues to deny them equality of result. As DuBois said, “America is not another word for Opportunity to all her sons.”[46] White writers perceive other ways in which the Dream seems to have failed. The title character of Edward Albee’s The American Dream (1961) is physically an Adonis, mentally and spiritually a hollow man. Ken Kesey’s One Flew Over the Cuckoo’s Nest (1962) takes place in a regimented mental hospital (presided over by a true bitch-goddess called Big Nurse) symbolizing contemporary American society. Nonconformity is here punished by lobotomy. Norman Mailer’s An American Dream (1965) depicts an America ruled by a satanic totalitarian combine of corporate and criminal power. Bellow’s Mr. Sammler’s Planet (1970) presents an image of urban life dominated by gratuitous violence, mindless sensualism, and contempt for intellectuality, while a violence born of the inability to control one’s own fate is the recurrent theme of Joyce Carol Oates’s novels.
Just as a shadow of anxiety hangs over such ostensibly affirmative novels as Bellow’s The Adventures of Augie March and Kerouac’s On the Road, however, so too a small beam of light is discernible in some of the predominantly negative works mentioned above. Kesey’s novel ends with its previously passive narrator, an American Indian, finally breaking out of the mental hospital and running free. Mailer’s existential protagonist, Stephen Rojack, does manage to slough the skin of conventional American success (he has been a war hero, Congressman, university professor, television personality, husband of a beautiful wealthy woman), call up the forces of the primal self to resist the powers of evil, and set out for a more open future. Jules Wendall, in Oates’s them (1969), is a “true American” who, believing he is “fated to nothing” and “could change himself to fit into anything,” is unquenchably optimistic. Loretta Wendall is equally confident that she can always make a new start. Mr. Sammler, like all Bellow’s heroes, refuses to despair or relinquish his faith in man’s capacity for dignity and compassion. The beam of light is brighter in Robert M. Pirsig’s Zen and the Art of Motorcycle Maintenance (1974). Like Hart Crane’s The Bridge (1930), it attempts via symbol to reconcile the Machine with the Garden; it assumes the familiar American form of a journey; and it concludes quite positively indeed. As the narrator and his son approach San Francisco, the former says: “It’s going to get better now. You can sort of tell these things.”
Confidence that things will get better was more heartily asserted in two popular works of non-fiction published in the 1970s. Charles A. Reich’s The Greening of America (1970) prophesied a new American paradise based on a transformation of consciousness. An updated version of the myth of America as the New Eden, it naively predicted that Americans would be reborn into a new openness, a new honesty, a new awareness of individual worth which is not competitive but cooperative. Ben Wattenberg’s The Real America: A Surprising Examination of the State of the Union (1974) offers statistics purporting to prove that progress is being made in such areas as education, civil rights, and occupational satisfaction.
By and large, however, the Pirsigs, Wattenbergs, and Reichs (not to mention the Reaganauts of the early 1980s, who cheerfully predict that America will be in the future what it was in the past) seem to be whistling in the dark. For many Americans, not just writers, are no longer as confident of the future as they once were. They are no longer so sure, for example, that the United States can be, or should even try to be, a model of charity to the rest of the world. Once upon a time, their conviction that they were a chosen people led them to blame any difficulty on alien forces: the English crown or the Roman church, the devilish Indians, Satan abroad in Salem, the Yellow Peril, international Jewish bankers, “godless communism,” and “outside agitators.” My Lai, Cambodia, and Kent State have shaken that conviction. So have Watergate, the revelations of FBI and CIA misconduct, along with doubts whether the blue-chip commissions which investigated the Kennedy and King assassinations did a competent job. The youthful idealists of the Sixties thought they could change the world; they are now middle-aged, disenchanted, and, like their younger brothers and sisters, concerned with economic survival.
Are Americans, then, still on their way? Does America still believe that, like Henry James’s Milly Theale and the nation she personifies in The Wings of the Dove (1902), it is the “potential heiress of all the ages”? Or has the nation come to the end of the road, the myth of progress discarded like empty baggage? Perhaps neither is entirely true. Perhaps Americans are, instead, conducting a kind of vigil, still hopeful but more wary than ever before, in which, like Samuel Beckett’s two Chaplinesque tramps, they wait for their version of Godot.
A Rebirth of Wonder?
Scott Fitzgerald’s Jay Gatsby, that great American dreamer of the Twenties, also conducted a vigil. But he was sustained by a sense of wonder which Fitzgerald compares to that of the Dutch sailors first contemplating the “fresh, green breast of the new world.” Gatsby never fully lost it. He may have been lucky; for many Americans today have had their sense of wonder dissipated by the frustration of their dreams, their hopes for the future sustained, if at all, only by a well-worn political rhetoric which serves as the current opiate of the people. Others, though fully alert to the disparity between dream and fact, nevertheless believe that a rebirth of wonder would be preferable to either cynicism or blind faith.
Some twenty-five years ago Lawrence Ferlinghetti expressed both his disappointments and his hopes in a poem called “I Am Waiting” (1955). Ferlinghetti was waiting, he said, for his “case to come up,” for a “rebirth of wonder,” for “someone to really discover America/ and wail.”[47] A few years later, Janice and Harry Angstrom, the principal characters in John Updike’s Rabbit, Run (1960) and Rabbit Redux (1971), are in a similar condition, though less able to articulate it. Like the nation they inhabit, they need to be born again. They need something to revitalize their youthful sense of wonder, but they cannot quite find it.
In the earlier novel, Harry suffers from a “closed-in feeling.” A former high-school basketball star, the Rabbit has since experienced only mediocrity. And Harry cannot abide mediocrity. Consequently, like earlier Americans, he runs. The trouble is, he doesn’t know where to run to. The institutions (domestic, social, religious) he has been brought up to believe in fail to satisfy him; yet he can find no adequate replacements, no new frontiers where he can fan the “little flame” inside him.
When we meet Harry ten years later in Rabbit Redux, he has stopped running. He is no longer, like Gatsby, trying to recover in the future what he once had in the past. He has decided, in fact, that “rebirth means death.” And so, bored by America’s new technological pioneers, the astronauts, whose explored space merely reminds him of the “lunar landscapes” of his Pennsylvania hometown, now a wasteland of urban blight and suburban housing tracts, Harry lapses into a somnolent indifference. His inner space is as empty as the craters of the moon, as cold as the air-conditioned, ironically named Phoenix Bar where he drinks a frozen daiquiri every night after finishing his linotyping job.
Desperate to believe in something, Harry substitutes America for the “face of God,” facetiously refers to himself as “the fucking Statue of Liberty,” and stubbornly defends the Vietnam War against its critics. “America,” he thinks, “is beyond power, it acts as in a dream . . . . Wherever America is, there is freedom . . . . Beneath her patient bombers, paradise is possible.” The critics include his “liberated” wife, Janice; her lover, Stavros, who fears emotional commitment to anyone; a young Black militant, Skeeter, who has fought in Vietnam; and Jill, an upper middle-class dropout hooked on drugs. The America Harry truly believes in, however, is an older America which, as he nostalgically remembers it, was epitomized by family solidarity, green peaceful summer evenings and the smell of burning autumn leaves, Sunday morning church and Sunday afternoon baseball games, the Lone Ranger and Tonto. It is not the America he lives in. When some neighborhood bigots burn Harry’s home because of Jill’s and Skeeter’s presence there, Harry is again left hanging in empty space.
At the end of Rabbit Redux, Harry and Janice are tenuously re-united. They are not sure this will make things any better, not even for themselves, let alone the country: “If it was better,” Harry says, “I’d have to be better.” But they are trying, they are conducting a vigil, they are waiting to see. “How do you think it’s going?” Harry asks Janice. She replies, “Fair.”[48]
Janice’s reply still holds in Rabbit Is Rich (1981), Updike’s third novel about the Angstroms. Her appraisal also fits the American Dream today. If its condition is “fair” at best, it is yet not to be entirely discounted as America’s case comes up. Earlier, Ferlinghetti’s attitude toward the Dream was similarly ambivalent. Not unlike Harry, though more sardonically, the poet was “waiting/for a religious revival/to sweep thru the state of Arizona/and . . . for the Grapes of Wrath to be stored.” He was waiting to “see God on television” so that His American identity could be corroborated, His place on America’s side confirmed. But God does not appear on the channels of the American Broadcasting Company—unless, of course, their reports of violence in the streets, of corruption in high places and low, of teen-age alcoholism and drug addiction and suicide are where the Grapes of Wrath are stored, waiting to be pressed into a new vintage Exodus or Resurrection—or Apocalypse—American-style. Americans may indeed be waiting, with Ferlinghetti, for “the living end,” whether in the positive or negative sense of that ambiguous slang phrase.
Waiting to set sail for happiness in a reconstructed Mayflower was another of Ferlinghetti’s dreams, but the course for the 1980s, as charted by economist Alfred Kahn in a recent issue of the Los Angeles Times, does not point toward greater material prosperity. “The American people,” Kahn believes, “have no choice but to accept a temporary decline in living standards.” A decline in living standards? For Americans, manna-fed even in the howling wilderness? Is not that forecast a prelude in a minor key for anyone who still subscribes to the myth of progress? True, as another economist, Robert J. Samuelson, has editorialized in the same newspaper, “the United States has combined a fabulous endowment of natural resources with some native ingenuity and hard work to create an enormously productive economy that probably still gives its people the highest standard of living in the world.” Nevertheless, he added, “people’s fears for the future are real; they wonder whether the prosperity that they assumed would endure forever will now vanish like a morning mist.”[49]
Americans who continue to wait with Ferlinghetti “for the day that maketh all things clear” should not be surprised if that day is preceded by the return of nightmares old or forgotten. In Updike’s Rabbit Redux, Skeeter, whose race has lived a nightmare and who refers to his country as “these Benighted States,” tries to teach Harry that lesson. The White Rabbit is not receptive. “Trouble with your line,” Harry tells Skeeter, “it’s pure self-pity. The real question is, Where do you go from here? . . . This is the freest country around, make it if you can, if you can’t, die gracefully. But Jesus, stop begging for a free ride.” Skeeter counters that Harry is “white but wrong,” that Blacks are “technology’s nightmare.” Left out of the industrial revolution, they are the “next revolution.” Skeeter might be the Eldridge Cleaver of Soul on Ice (1968), or Rap Brown, or Angela Davis. Nor are his words just remnants of the Sixties. By changing a few words, he might be speaking for other groups, such as Indians, Hispanics, and women, who have in one way or another been left out. Left out or not, their Dream is not fundamentally different from that of other Americans.
That Dream hangs on, albeit tenuously, its chief strands including a trust in the Constitution; a conviction that opportunities still remain for the individual to achieve material prosperity and his own version of happiness; a confidence that the United States can successfully meet any challenge; and a faith that the nation is sincerely dedicated to human equality, human rights, and freedom of choice. Until very recently, at least, the most fundamental and persistent strand has been the belief in new beginnings. The strand which ties all the others together, it may now also be the most illusory. For to what extent is a national new beginning—which is to say, a new youth—desirable, even if possible?
Ferlinghetti was hopefully waiting for a rebirth of wonder. But he was also waiting “for Tom Swift to grow up,” for a nation of All-American boys to mature, for the Harry Angstroms to doff their letterman jackets. Once, the West was fabled but as yet unexplored; America was a young country, with such a vast expanse of fertile open space and time before it that an infinite number of new starts was conceivable. America is no longer young and never will be again; its open space is mostly taken, its vaunted natural plenitude clearly finite, its reputation as the land of opportunity suspect, both at home and abroad. Once, America prided itself on the freedom of its individuals and on the nation’s freedom from foreign entanglements; after all, Old Father Europe was an ocean away and America had done with him. Now things are different: the individual’s freedom to be a “simple separate person” seems increasingly limited by a bureaucratically regulated “En-Masse,” the nation as a whole entangled with every other nation in the world. Once, America was innocent enough to dream that it could not only control its own destiny but that the rest of the world would emulate its brand of democracy. Today, that destiny is far from manifest.
So where to? What next? To be an American is to dream: for good or for ill, that is the American’s heritage. If Americans are no longer so certain that their multi-faceted Dream is realizable, or that the future itself is limitless, that uncertainty may be a sign of their maturity. For maturity entails the recognition that what they are and have been, as well as what they dream of becoming, are the truths they must live by.
8. Guide to Further Reading
“All I know,” quipped Will Rogers, the American humorist, “is what I read in the papers.” Much may be learned about the American Dream by watching the press, for nobody does more than the Fourth Estate to keep the term in print. For the most part, however, the reading suggestions that follow are restricted to scholarly books, many of them containing bibliographies of their own. Most serious studies of American civilization deal with the Dream, but some do so more explicitly than others. Supplementing the works cited in the text and notes, and further illuminating the themes that we have emphasized, the contributions mentioned here are ones that we have found especially helpful.
Of the many comprehensive accounts of American history, perhaps the sanest and most gracefully written is Samuel Eliot Morison’s The Oxford History of the American People (New York: Oxford UP, 1965). Specifically where the Dream is concerned, Carl N. Degler’s Out of Our Past, rev. edn. (New York: Harper & Row, 1970), Sydney E. Ahlstrom’s A Religious History of the American People (New Haven: Yale UP, 1972), and Mary P. Ryan’s Womanhood in America: From Colonial Times to the Present, 2nd edn. (New York: Franklin Watts, 1979) are also significant. Although its liberal bias is pronounced, the most ambitious overview of American intellectual history continues to be Vernon L. Parrington’s Main Currents in American Thought (New York: Harcourt, Brace, 1927-30).
As for more specific times and motifs, a number of books are now classics in their fields. One thinks, for example, of Perry Miller’s work on the Puritans, especially Errand into the Wilderness (Cambridge, Mass.: Harvard UP, 1956). Carl Becker’s The Heavenly City of the Eighteenth-Century Philosophers (New Haven: Yale UP, 1932) and Bernard Bailyn’s The Ideological Origins of the American Revolution (Cambridge, Mass.: Harvard UP, 1967) remain salient for the era of Enlightenment and Revolution.
Six other works are perhaps even more decisive for exploration of the claims we have made. Commenting on “the great emptiness of America . . . where men and even houses are easily moved about, and no one, almost, lives where he was born or believes what he has been taught,” George Santayana’s Character and Opinion in the United States (New York: Charles Scribner’s Sons, 1920) is still full of nuggets worth mining. Henry Nash Smith’s Virgin Land: The American West as Myth and Symbol (Cambridge, Mass.: Harvard UP, 1950), which both drew upon and criticized Turner’s frontier thesis, virtually established the myth-and-symbol approach to studies of the American Dream. Edwin Fussell’s Frontier: American Literature and the American West (Princeton: Princeton UP, 1965) reveals the pervasive presence and significance of frontier metaphors in major writers. Charles L. Sanford’s The Quest for Paradise: Europe and the American Moral Imagination (Urbana: Illinois UP, 1961) examines the European sources of the Edenic dream, arguing that it was not uniquely American. He nevertheless believes it to be the “most powerful and comprehensive organizing force in America” through the nineteenth century. R.W.B. Lewis’ The American Adam: Innocence, Tragedy, and Tradition in the Nineteenth Century (Chicago: Chicago UP, 1955) targets the mythic hero and the “cultural dialogue” between influential proponents and critics of the myth. Subsequently, the blasting of America’s pastoral ideal by industrialism, the consequent transformation of American social theory, and modern America’s nostalgia for the lost natural world have been profoundly assessed by Leo Marx, The Machine in the Garden: Technology and the Pastoral Ideal in America (New York: Oxford UP, 1964).
Henry Steele Commager has written many books that are pertinent to the American Dream, but none excels The American Mind: An Interpretation of American Thought and Character Since the 1880’s (New Haven: Yale UP, 1950). Contributions to the Dream made by American philosophers and religious thinkers are also effectively set forth by Sacvan Bercovitch, The Puritan Origins of the American Self (New Haven: Yale UP, 1975); John K. Roth, American Dreams: Meditations on Life in the United States (San Francisco: Chandler and Sharp, 1976); Paul F. Boller, Jr., Freedom and Fate in American Thought: From Edwards to Dewey (Dallas: Southern Methodist UP, 1978); and Merle Curti, Human Nature in American Thought (Madison: Wisconsin UP, 1980). A contemporary analysis of American attitudes toward human rights and the impact of those beliefs on foreign policy can be found in American Dream, Global Nightmare (New York: Norton, 1980) by Sandy Vogelgesang. Still in a philosophical vein, John W. Gardner seeks to nurture individual responsibility and social regeneration with Morale (New York: Norton, 1978).
Two earlier books, Stewart H. Holbrook, Dreamers of the American Dream (Garden City, N.Y.: Doubleday, 1957) and Vernon Louis Parrington, Jr., American Dreams: A Study of American Utopias (New York: Russell & Russell, 1964), map “perfectionist” and reformist aspects of the Dream, which play a vital role in the tension between American aspirations and realities that has fascinated so many observers. For instance, A.N. Kaul’s The American Vision: Actual and Ideal Society in Nineteenth-Century Fiction (New Haven: Yale UP, 1963) discusses Cooper, Hawthorne, Melville, and Twain as representatives of a recurrent dialectic in which “the actual and the ideal function in mutual critique.” Marius Bewley’s The Eccentric Design: Form in the Classic American Novel (New York: Columbia UP, 1959) and Tony Tanner’s The Reign of Wonder: Naivety and Reality in American Literature (Cambridge and New York: Cambridge UP, 1965) also underscore the contrarieties in classic American literature, Tanner holding that a romantic “sense of wonder” is still dominant in modern American writing. Related views, portrayed this time in American painting, are discussed by Barbara Novak, Nature and Culture: American Landscape and Painting, 1825–1875 (New York: Oxford UP, 1980) and Joy S. Kasson, “Images of the American Dream,” in Jane L. Scheiber and Robert C. Elliott, eds., In Search of the American Dream (New York: New American Library, 1974), pp. 186-96.
The Twenties and Thirties were critical decades for the Dream. In The American Dream in the Great Depression (Westport, Conn., and London: Greenwood, 1977), Charles R. Hearn defines the success myth as the “very essence of what we conceive America to be” and analyzes its permutations during that period. Similar developments are documented by John O. Tipple, ed., Crisis of the American Dream: A History of American Social Thought, 1920–1940 (New York: Pegasus, 1968) and Ellis W. Hawley, The Great War and the Search for a Modern Order, 1917–1933 (New York: St. Martin’s, 1979), the latter concentrating on “the rise and collapse of the world’s first mass-consumption economy, and the continued search for a modern managerial order geared to the realization of liberal ideals.” For more personal glimpses of the Dream at this time, see Kenneth S. Davis, The Hero: Charles A. Lindbergh and the American Dream (Garden City, N.Y.: Doubleday, 1959) and William R. Brown, Imagemaker: Will Rogers and the American Dream (Columbia: Missouri UP, 1970).
The Sixties are also proving to be a watershed. William O’Neill’s Coming Apart: An Informal History of the 1960’s (New York: Quadrangle, 1971) paints a dark picture, asserting that the shattering of an American cultural consensus during the decade has left the Dream in shards. In Nixon Agonistes: The Crisis of the Self-Made Man (New York: New American Library, 1971), Garry Wills discusses the erratic fortunes of Richard Nixon’s career as symptomatic of a more general disturbance within the Dream. Although it deals principally with imaginative literature, David Madden, ed., American Dreams, American Nightmares (Carbondale: Southern Illinois UP, 1970) may be the best single book about the state of the Dream in this era. Madden’s introduction puts the recent scene in historical perspective, Robert B. Heilman’s essay defines “The American Metaphor,” and the other articles, by critics such as Leslie Fiedler, Maxwell Geismar, and Ihab Hassan, offer close analysis of how various authors view the Dream in our time.
Many other books deserve comment, among them Frederick I. Carpenter’s American Literature and the Dream (New York: Philosophical Library, 1955); Kenneth Lynn’s The Dream of Success: A Study of the Modern American Imagination (Boston: Little, Brown, 1955); Daniel Bell’s The End of Ideology, rev. edn. (Glencoe, Ill.: The Free Press, 1962); Daniel J. Boorstin’s The Image, or What Happened to the American Dream (New York: Atheneum, 1961); Oscar Handlin’s The Uprooted, 2nd edn. (Boston: Little, Brown, 1973); and James Oliver Robertson’s American Myth, American Reality (New York: Hill & Wang, 1980). It would also be remiss not to note that research on the Dream finds fertile soil in more widely disseminated cultural expressions.
One such example is Martha Raetz’s “The Voice of America: Imagery and Metaphor in the Inaugural Addresses of the American Presidents,” in Sy M. Kahn and Martha Raetz, eds., Interculture (Vienna: Wilhelm Braumüller, 1975), pp. 58-82. Much more exotic but also insightful are: Rex L. Jones, “Poker and the American Dream,” in W. Arens and Susan P. Montague, eds., The American Dimension: Cultural Myths and Social Realities (Port Washington, N.Y.: Alfred, 1976), pp. 170-80, which argues that “poker is a pure expression of the American Dream”; and Alan Gowans, “Popeye and the American Dream,” in Jack Salzman, ed., Prospects: An Annual of American Cultural Studies, vol. 4 (New York: Burt Franklin, 1979), pp. 549-57, which explores the thesis that “in the popular arts, the American Dream is a stock joke.” Leverett T. Smith also scores by turning attention to baseball and football in The American Dream and the National Game (Bowling Green, O.: Bowling Green University Popular Press, 1975). Finally, the editors of popular magazines—Time, Life, Fortune, Sports Illustrated, Money, People, and Discover—have compiled an important series about American prospects in the 1980s. These reports are published under a characteristically optimistic title, American Renewal (Chicago: Time Inc., 1981). Keeping pace, in the spring of 1981 the American Broadcasting Company made “The American Dream” into a weekly television drama focused on a fictional family in Chicago.
9. Notes
- e.e. cummings, Poems, 1923–1954 (New York: Harcourt, Brace, 1954), p. 193; Lionel Trilling, The Liberal Imagination (New York: Viking, 1950), p. 251.
- Robert J. Ringer, Restoring the American Dream (New York: Harper & Row, 1979) and Studs Terkel, American Dreams: Lost and Found (New York: Pantheon, 1980).
- Crèvecoeur, Letters from an American Farmer (1782; reprint edn., Garden City, N.Y.: Doubleday, n.d.), pp. 49, 50, 47.
- Charles L. Sanford, ed., The Quest for America, 1810–1824 (New York: Doubleday, 1964), p. ix; Legaré’s speech is reprinted in full, ibid., pp. 3-20. James T. Adams, The Epic of America (Boston: Little, Brown, 1931), pp. 415, viii.
- “One’s-Self I Sing,” in James E. Miller, Jr., ed., Complete Poetry and Selected Prose by Walt Whitman (Boston: Houghton Mifflin, 1959), p. 5.
- Richard Chase, The American Novel and Its Tradition (Baltimore: Johns Hopkins UP, 1980), p. 1.
- Archibald MacLeish, Land of the Free (New York: Harcourt, Brace, 1938), pp. 83-84.
- Edwards, in Conrad Cherry, ed., God’s New Israel: Religious Interpretations of American Destiny (Englewood Cliffs, N.J.: Prentice-Hall, 1971), p. 55.
- Vernon L. Parrington, Main Currents in American Thought, 3 vols. (New York: Harcourt, Brace, 1927), vol. 1, p. 66.
- “The Mayflower Compact,” reprinted in Daniel J. Boorstin, ed., An American Primer (New York: New American Library, 1968), p. 21; for Winthrop and Cotton, see Perry Miller, ed., The American Puritans (Garden City, N.Y.: Doubleday, 1956), pp. 82-83, 85.
- Thomas Paine, “Common Sense,” in Nelson F. Adkins, ed., Common Sense and Other Political Writings (Indianapolis: Bobbs-Merrill, 1953), p. 51. Italics ours.
- Quotations from R.W.B. Lewis, The American Adam: Innocence, Tragedy, and Tradition in the Nineteenth Century (Chicago: Chicago UP, 1955), p. 5; Emerson’s essay on “Nature” (1836), and Thoreau’s Walden (1854), as reprinted in The Complete Essays and Other Writings of Ralph Waldo Emerson, p. 3, and Walden and Other Writings, p. 25, both edited by Brooks Atkinson (New York: Modern Library, 1950).
- J.E. Miller, Jr., ed., Whitman, p. 69.
- Thomas H. Johnson, ed., The Complete Poems of Emily Dickinson (Boston: Little, Brown, 1960), p. 657.
- D.H. Lawrence, Studies in Classic American Literature (1923; Garden City, N.Y.: Doubleday, 1953), p. 64.
- The classic commentary on the Constitution is still The Federalist Papers (1787-88) by Alexander Hamilton, James Madison, and John Jay; the quotation is from No. 1. For Paine, see Adkins, ed., p. 51.
- Some aspects of this tension are studied in John D. Lees, The President and the Supreme Court: New Deal to Watergate (1980), the third pamphlet in this series. For Marshall, see Richard D. Heffner, ed., A Documentary History of the United States, 3rd edn. (New York: New American Library, 1976), p. 80.
- Federalist Papers, Nos. 51 and 1.
- Roosevelt’s speech, “The New Nationalism” (1910), and Eisenhower’s “Farewell Address” (1961), both in Heffner, ed., pp. 230, 314.
- Ford, “Inaugural Address” (1974), ibid., p. 351; Carl Sandburg, The People, Yes (New York: Harcourt, Brace, 1936).
- Federalist Papers, esp. No. 10; for Washington, see Heffner, ed., p. 66.
- Nathaniel Hawthorne, Mosses from an Old Manse (1854; Boston and New York: Houghton Mifflin, 1882); see esp. “Earth’s Holocaust,” ibid., p. 455.
- See “The Old Manse,” ibid., p. 17.
- These essays may be found in various collections, e.g., those cited in n. 12 above.
- Turner, as reprinted in Ray A. Billington, ed., Frontier and Section: Selected Essays of Frederick Jackson Turner (Englewood Cliffs, N.J.: Prentice-Hall, 1961), pp. 61, 38, 62, 51.
- See esp. Cooper’s The Pioneers (1823) and The Prairie (1827).
- Critical reactions to Turner’s work are summarized by Ray A. Billington, The American Frontier Thesis: Attack and Defense (Washington, D.C.: American Historical Association, 1971).
- “American Letter,” in Archibald MacLeish, Collected Poems, 1917–1952 (Boston: Houghton Mifflin, 1952), p. 63.
- F. Scott Fitzgerald, The Great Gatsby (New York: Scribner’s, 1925).
- See Fitzgerald’s personal essay, in Edmund Wilson, ed., The Crack-Up (Norfolk, Conn.: New Directions, 1945), pp. 69-84.
- See, e.g., Frederick J. Hoffman, The Twenties (New York: Viking, 1949) and Malcolm Cowley, Exile’s Return: A Literary Odyssey of the 1920’s, rev. edn. (New York: Viking, 1951).
- William James, in John K. Roth, ed., The Moral Equivalent of War and Other Essays (New York: Harper & Row, 1971), p. 20.
- John Dewey, Individualism Old and New (New York: Minton, Balch, 1930), pp. 36, 32, 83, 93. See also B.F. Skinner, Walden Two (1948) and Beyond Freedom and Dignity (1971).
- See Leo Gurko, The Angry Decade (New York: Harper & Row, 1947) and Malcolm Cowley, The Dream of the Golden Mountains (New York: Viking, 1980).
- William Styron, Sophie’s Choice (New York: Random House, 1979).
- Richard L. Rubenstein, The Cunning of History (New York: Harper & Row, 1978), pp. 89-90.
- Perry Miller, ed., The American Transcendentalists (Garden City, N.Y.: Doubleday, 1957), esp. p. 350.
- Richard N. Current, ed., The Political Thought of Abraham Lincoln (Indianapolis: Bobbs-Merrill, 1967), p. 329.
- Heffner, ed., Documentary History, esp. pp. 296-97. For American attitudes to the relief of ‘want,’ see James T. Patterson, The Welfare State in America, 1930–1980 (1981), the seventh pamphlet in this series.
- Kennedy’s “Civil Rights Speech,” in Heffner, ed., p. 330.
- Langston Hughes, Montage of a Dream Deferred (New York: Henry Holt, 1951); see esp. “Harlem,” p. 71.
- King’s “I Have a Dream” speech (1963), reprinted in C. Eric Lincoln, ed., Is Anybody Listening to Black America? (New York: Seabury, 1968), esp. p. 66.
- Barbara Charlesworth Gelpi and Albert Gelpi, eds., Adrienne Rich’s Poetry (New York: Norton, 1975), p. 83.
- Allen Ginsberg, Howl and Other Poems (San Francisco: City Lights, 1956), pp. 9, 17.
- Philip Roth, “Writing American Fiction,” Commentary, 31 (March 1961), 224. See also Tony Tanner, City of Words: American Fiction, 1950–1970 (New York and London: Oxford UP, 1971) and Stan Smith, A Sadly Contracted Hero: The Comic Self in Post-War American Fiction (1981), the fifth pamphlet in this series.
- W.E.B. DuBois, The Souls of Black Folk (Chicago: A.C. McClurg, 1903), p. 143. See also A. Robert Lee, Black American Fiction Since Richard Wright, a pamphlet forthcoming in this series.
- Lawrence Ferlinghetti, A Coney Island of the Mind (New York: New Directions, 1958), pp. 49-53.
- John Updike, Rabbit, Run and Rabbit Redux (New York: Knopf, 1960 and 1971).
- Los Angeles Times, March 1980. Increasingly, as numerous observers note, even a house of one’s own, long a basic element of the Dream, may be priced out of sight. See, for example, Lance Morrow, “Downsizing An American Dream,” Time (5 Oct. 1981), pp. 57-58.
Philip Davies, The Metropolitan Mosaic: Problems of the Contemporary City
BAAS Pamphlet No. 4 (First Published 1980)
ISBN: 0 9504601 4 1
- The Contemporary City
- Social Problems in the Cities
- Financing City Governments
- The Politics of Governmental Fragmentation
- Sunbelt Cities and National Policy
- Guide to Further Reading
- Notes
1. The Contemporary City
The general outlook upon cities and the prospects for urban life in the last quarter of the twentieth century tends to be characterized by pessimism and trepidation; a sense of doom surrounds the topic. The New York Times, interested in probing this attitude, recently conducted a survey which included the question, “What’s your long range view for the city as a place to live? In ten or fifteen years, how will it be?” The response confirmed that many people had a bleak vision of the city of the future. The largest group, over 40 per cent of those polled, expected the future to be worse; the white people in the sample were consistently more alarmist in outlook than the blacks.[1] In popular culture the city is often portrayed as a source of corruption and the home of decay and social collapse. Films such as Taxi Driver, Mean Streets and Midnight Cowboy paint the city as a malevolent environment, especially harsh on the poor and disorganized. In much contemporary literature the assumption is made that urban decline and disintegration are inevitable, the only question being precisely when and how it will all happen. For example, an article in Saturday Review claims that cities have come to represent “crowds, crime, noise pollution and traffic” and lays part of the blame for the dissemination of these turbulent images on the broadcaster Johnny Carson, who “night after night … fired shot after shot at New York, the barrage beamed at a nationwide television audience that learned to equate ‘New York’ with ‘city’ and both of them with being mugged, raped and ripped off.”[2]
This folklore of widespread urban disaster is only a partial view of reality. Cities are major centres of cultural, educational and technical excellence, and they are vital to the national economy: even the 33 per cent of the United States’ population who live outside the cities depend on urban markets for their livelihoods. Cities contain the centres of capital and investment which dominate the American economy. Many corporate headquarters remain in city centres, in spite of the widely heralded flight to the suburbs. In sharp distinction to the apocalyptic vision of the city, a recent report points out that most cities in the United States are not in distress, many have economies which are growing or stable, and are apparently able to manage successfully their changes in population.[3] Nonetheless, cities are possessed of a whole gamut of problems deriving in part from the very fact that they have attracted such large numbers of people. The size and density of the urban population exacerbate the difficulties faced by individuals and local governments. Overcrowding accelerates housing deterioration and puts excessive burdens on education, health and other services; industrial concentration can create a blanket of pollution which will not easily disperse; rapid growth puts great strains on communities and may create unplanned cities incapable of adapting to changing circumstances. The greater burden of these problems is borne by the poor; the affluent are able to buy their way out of the situation, creating thereby a class-based system of social segregation. The poor, living in neighbourhoods of people similarly disadvantaged, find that a problem shared is not halved, but compounded.

The great expansion of city population has created some complications in determining where cities start and finish. In the nineteenth century cities had easily recognizable boundaries, with the limits of city growth usually approximating to a clearly defined area of municipal government. By 1949, however, these conurbations stretched far beyond the boundaries of municipal government, and the United States Census began to define metropolitan areas in which several cities and towns clustered together in great urban conglomerations. By June 1977 there were 281 of these “standard metropolitan statistical areas”, or SMSAs, as shown on the cover of this pamphlet. Most of them have at their core a central city with 50,000 or more inhabitants, although in highly urbanized areas an SMSA may contain more than one central city. The remainder of the SMSA more or less approximates to the suburbs.[4]
The interdependence of modern society ensures that the central city has close links not only with the surrounding suburbs but also with non-metropolitan areas farther afield. Hence the problems of any city cannot be explained—or solved—in isolation. The city may be the victim of pollution originating outside its boundaries and beyond its control; it may suffer from the poor quality of life elsewhere as poor, unemployed people migrate to the town in search of urban opportunities; and its economic basis may be undermined by decisions taken by nationally and internationally important industries and businesses. In this way developments outside the city can create ‘city problems’.
For example, recent technological changes have had a major impact on the character, physical structure, and problems of cities. Cities established in the late nineteenth and early twentieth centuries were built according to the demands of heavy industry and rail transport, and provided massive and concentrated blue-collar employment. Since 1945, however, technological change has meant a move towards light engineering and white-collar work, neither of which suffers from the geographical inertia of heavy industries, necessarily built close to major mineral resources. Established cities will always be adapting to modern demands, but the resources for industries based on new technologies may not be readily available, and areas with greater locational advantages will be the new centres of growth. In any case, the predominance of the privately owned automobile over public transportation—and of the truck over the railroad—means that the central cities are no longer the obvious production and assembly centres; in fact, as city roads have become more crowded and highways have provided access to outlying areas, the suburbs have become the logical industrial sites. As major industries move beyond city boundaries, the cities they abandon are left with all the consequences of industrial decline. Among other things, buildings and factories which when operating were an asset to the city, when abandoned become a liability—often impossible to remodel for a new use, expensive to demolish, and dangerous to leave empty.[5] In this way the contemporary city inherits the detritus left by previous, successful generations.

Furthermore, these technological changes have been accompanied by—and have helped to sustain—a migration out of the inner cities by affluent whites. The same logic that has compelled industrial location in the suburbs has also encouraged residential suburban development. Automobile ownership and large-scale highway construction have opened up tracts on the outskirts of cities for housing; large-scale builders have become dominant in the house-building market, using mass-production and assembly-line techniques to produce large numbers of houses built to standard designs on suburban land. The people who have moved to the suburbs have been predominantly white and middle-class, with only 5 per cent of suburbanites being black. This is partly due to the generally higher level of affluence of the white community, but has been reinforced by policies of racial and social exclusion practised by many suburban governments. More recently affluent whites have been migrating also to the ‘Sunbelt’, the states of the South and South-West which have seen rapid metropolitan growth in the past two decades. In the 1960s one-and-three-quarter million whites entered the South, and the most recent figures show that whites continue to leave the urban North for the suburban and non-metropolitan South in vast numbers.[6]
Table 1
Distribution of population by race, Spanish origin, and location of residence: 1978.

| | All Races: Total (m) | % | White: Total (m) | % | Black: Total (m) | % | Spanish: Total (m) | % |
|---|---|---|---|---|---|---|---|---|
| Metropolitan Areas: | | | | | | | | |
| Central Cities | 59.7 | 28.0 | 44.5 | 24.1 | 13.7 | 55.2 | n.a. | 51.1 |
| Suburbs | 83.3 | 39.0 | 77.1 | 41.7 | 4.8 | 19.4 | n.a. | 34.3 |
| Non-metropolitan | 70.4 | 33.0 | 63.2 | 34.2 | 6.3 | 25.4 | n.a. | 14.6 |
| United States | 213.5 | 100 | 184.8 | 100 | 24.8 | 100 | 12.0 | 100 |
A reverse migration began even earlier, as large numbers of black people moved to the cities and to the North. Between 1940 and 1970 the black population changed from being 49 per cent urban to 74 per cent urban; in 1940, 22 per cent of blacks lived in the North, but by 1970 this figure had increased to 39 per cent. In the 1960s one-and-one-quarter million blacks left the South, primarily for Northern cities, though recent figures show that this migration of poor blacks is now almost balanced by a reverse flow of more affluent blacks returning to the South. Throughout the same decades South to North migration was taking place among low-income whites, in particular from Appalachia and the Ozarks to such cities as Cincinnati, Cleveland and Chicago. West Virginia, the state most hit by this white outmigration, suffered a 13 per cent population loss between 1950 and 1970. High levels of unemployment and poverty in their home areas constituted the main reason for these migrations.[7]

Thus demographic movements have tended to take more affluent and more skilled whites out of the central cities, and to move in more deprived and less skilled people, mainly black, at a time when the demand for unskilled labour is rapidly diminishing. Between 1970 and 1978 the white population of central cities dropped by almost four-and-a-half million while the number of blacks in cities rose by about 800,000. Since the younger and most fertile members of the population are over-represented in all migrating populations, mobility has an exaggerated long-term effect on changes in population balance. The whole of the most recent changes in black-white proportions in city populations can be accounted for by white outmigration and black natural increase. In some cities, even if no further white outmigration took place, the population would become increasingly black owing to the higher proportion of young people in the black central city population.[8] Consequently the de facto racial segregation in many cities is likely to increase, leaving a larger black population isolated in the central cities to suffer the effects of depressed urban economies. As a result, America’s central-city population is not a random sample of the nation, as Table 1 demonstrates.[9] There is a concentration of racial minorities and economically deprived groups which, regardless of location, are in any case suffering from high unemployment, low levels of educational attainment, poverty, and inadequate housing. In many cases these are national problems which have become located in cities through migration—problems in the cities rather than problems of the cities.
2. Social Problems in the Cities
The 1960s was a decade of great civil unrest in America’s cities. Almost every summer there were serious disorders, usually concentrated in the black ghettos. In 1965 the Watts area of Los Angeles was the site of the worst riot in the United States for over twenty years; thirty-four people were killed, hundreds injured and approximately $35 million worth of damage done as blacks destroyed white-owned businesses and battled with the police and National Guard. Riots devastated many cities across the nation; Chicago, Detroit, Cleveland and Washington, D.C., all suffered major disturbances. The National Advisory Commission on Civil Disorders (the Kerner Commission) reported that the sense of grievance among black city dwellers was so intense that a minor incident could trigger a massively violent community response. The Commission identified twelve such grievances, including police practices, unemployment and underemployment, inadequate housing, inadequate education, racial discrimination, and inadequate public services.[10] At the end of the decade President Nixon declared that the urban crisis was over. That this complacency was misplaced was demonstrated by the savage riots in Miami in May 1980; most of the urban problems cited in the Commission’s 1968 Report still exist.

At the root of many urban social problems lies poverty. Admittedly, as Table 2 demonstrates,[11] the proportion of the population who are considered poor is lower in metropolitan than in non-metropolitan areas, but the concentration of poor people in the central cities makes the problem much more visible, acute and politically dangerous than in the rural areas. As it is, black and white poverty rates in the central cities exceed national and metropolitan rates, and there is evidence that poverty is increasingly a central-city problem. Between 1975 and 1977 the number of poor people in the United States fell by over one million. All residential areas of the United States shared in this fall except the central cities, where the number of people in poverty increased by 113,000 to 9,203,000.[12]
In the 1960s serious efforts were made to tackle this “re-discovered” problem of urban poverty, but it is no longer the subject of new political initiatives. The black vote is firmly in the Democratic camp, and the Republican Party seems unlikely to make substantial inroads into it. President Carter’s “Program for Better Jobs and Incomes” was little more than a reshuffling of legislative resources left over after the Nixon and Ford administrations. After a decade of rhetoric and raised expectations the central-city poverty rate remains stable—roughly one out of every six city inhabitants, one out of every three blacks.

Moreover, the government is very stringent in its definition of poverty. The official definition of the urban poverty level was set at an annual income for a family of four of $3,022 in 1960, and had risen to $6,191 per annum in 1977. During the same period the median income of all families increased from $5,610 to $16,009, meaning that whereas in 1960 a poor family would reach the median by increasing their income by four-fifths, in 1977 the median was more than two-and-a-half times the poverty level.[13] Anti-poverty legislation has not achieved its stated aim of eliminating poverty, but has instead left a relatively stable number of poor people increasingly out of touch with the national average income level. While the rate of persistent poverty is much lower than the overall poverty rate, there are many families moving out of and into poverty each year. A University of Michigan study of five thousand families found that, in the nine years beginning in 1967, 25.1 per cent of them had been below the poverty line for at least one of the years studied.[14] Poverty is thus much more pervasive than the annual rate alone would suggest. Despite aid from welfare programmes such as Aid to Families with Dependent Children, some groups in society—such as blacks and families headed by an unmarried parent—are still much more at risk than other groups. The working poor are helped very little by any form of government financial aid; fewer than one-third of this group received enough aid to lift them out of poverty. Even an increase in the level of welfare payments would not help all the poor, since over forty per cent of them do not receive benefits from the welfare system. This pervasiveness of poverty shows that a substantial proportion of the population, while not living in continual poverty, lives extremely close to that borderline.
Table 2
Percentage of the population below the poverty level, by race, and location of residence: 1977.

| | All Races (%) | White (%) | Black (%) |
|---|---|---|---|
| Metropolitan Areas: | | | |
| Central Cities | 15.4 | 10.7 | 31.2 |
| Suburbs | 6.8 | 5.9 | 21.3 |
| Total | 10.4 | 7.6 | 28.6 |
| Non-metropolitan | 13.9 | 11.2 | 39.1 |
| United States | 11.6 | 8.9 | 29.0 |
The relatively low incomes of persons living in central cities, especially those belonging to racial and ethnic minorities, put them in an especially insecure position with respect to changes in employment structure or levels of unemployment. The central cities have a larger share of national unemployment than they do of the national population. About one-third of the nation’s total unemployed live in central cities, whereas only 28 per cent of the total population are to be found there. As can be seen from Table 3,[15] unemployment rates rose dramatically in the 1970s; this rise especially hit blacks, and even more particularly blacks living in central cities. Over half the inner cities’ teenage black workforce was unemployed in 1977. Whereas the national unemployment rate rose 1.92 times between 1970 and 1977, the rise was higher in the central cities (2.02 times) and was particularly high among persons of Spanish origin in central cities (2.15 times) and among inner-city blacks (2.36 times). It is clear that during the 1970s unemployment became even more concentrated in the central cities, and that an increasingly disproportionate share of the burden fell upon American blacks. Low income, unemployment, and a low rate of participation in the labour force[16] combine to produce a population with a very high need for public assistance, the demand for which has increased in the 1970s, again growing faster in central cities than elsewhere. Between 1970 and 1977, whereas the proportion of families receiving public assistance increased from 3.5 to 5.3 per cent in the suburbs and from 5.7 to 8.0 per cent in non-metropolitan areas, in central cities it increased from 7.3 to 12.4 per cent.[17]
Table 3
Percent unemployment among males: 1970 and 1977.

| | Age range | United States 1977 | United States 1970 | Central Cities 1977 | Central Cities 1970 | Suburbs 1977 | Suburbs 1970 |
|---|---|---|---|---|---|---|---|
| Black | 16-19 | 43.8 | 19.3 | 51.2 | 21.6 | 45.3 | 22.4 |
| | Total: 16+ | 14.3 | 6.6 | 16.3 | 7.1 | 12.8 | 6.3 |
| White | 16-19 | 17.7 | 10.4 | 19.2 | 11.8 | 18.3 | 10.0 |
| | Total: 16+ | 6.8 | 3.6 | 7.6 | 4.0 | 6.3 | 3.3 |
| Spanish origin | 16-19 | 20.8 | 14.6 | 23.2 | 14.5 | 21.3 | 16.0 |
| | Total: 16+ | 10.5 | 5.5 | 11.6 | 5.4 | 9.4 | 5.4 |
| All races | Total: 16+ | 7.5 | 3.9 | 9.1 | 4.5 | 6.6 | 3.4 |
In a participant observation study which gives insight into working-class lifestyles, Joseph T. Howell describes a small blue-collar (or working-class) community on the edge of Washington, D.C. He classifies the families by a set of characteristics he calls ‘hard living’—toughness, political alienation, a strong sense of individualism, present-time orientation, rootlessness, marital instability and heavy drinking—as opposed to ‘settled living’. One family, the Shackelfords, have an income which puts them just above the poverty line, but the description of their crisis-ridden life—dilapidated housing, eviction, ill-health—belies the fact that technically they are not poor. Virtually all contact the Shackelfords have with the urban welfare establishment is bureaucratic, complex and degrading. A welfare benefit is denied because the relevant papers are misfiled; a cheque is stopped because a doctor’s receptionist gives incorrect information to the welfare office; each month the claim for food stamps involves arriving at the welfare office between 5 a.m. and 7 a.m., queuing for two or three hours, and perhaps still being deferred to the following day. At one point when the Shackelfords can find nowhere to live and are sleeping in their car, a social worker refuses to help them purchase a mobile home, saying ” . . . my supervisor and I feel that a mobile home is an unsuitable living environment for a lower income family.” When Howell telephones on the Shackelfords’ behalf to dispute the decision, he is informed that his interference can only hurt those whom he is trying to help.
Sam Moseby is a garage mechanic, and the Moseby family’s income is about the national average—well above the poverty line. However, the Dodge dealer Sam works for closes, and Sam has to find another job. After a couple of weeks without work he finds a less skilled position, with poorer working conditions, and involving a pay cut of 40 per cent. A few months later illness makes him miss two weeks of work. This combination of events almost halves the family’s annual income. The experiences of both families exemplify the insecurity that can affect people, including those above the poverty line. Both families have to cope with outside influences on their lives—the welfare agencies, redundancy—over which they have no control. Both families suffer changes in circumstances which help explain the pervasiveness of poverty found in the Michigan study.[18]
The concentration of the poor and racial minorities in the central cities further creates a whole range of complicated problems for urban services. For example, a Department of Health, Education and Welfare report points out:
On nearly every index we have, the poor and the racial minorities fare worse than their opposites. Their lives are shorter; they have more chronic and debilitating illness; their infant and maternal death rates are higher; their protection, through immunization, against infectious disease is far lower.[19]
Health care is inadequate in areas of greatest need. Physicians are unevenly distributed geographically so that a suburb may have eight times more doctors proportionate to population than its neighbouring inner city. In part this is a consequence of a system where most of the costs are borne out of consumer fees or private insurance. Although government subsidies have increased in recent years, by 1974 they still accounted for only about 40 per cent of health-care expenditure. When the greater proportion of costs is paid by the individual (directly or through insurance premiums), the less affluent are discouraged from seeking medical care, and practitioners locate to serve that sector most able and willing to pay. Some government health programmes, especially Medicaid, a system of health benefits for the poor, have greatly helped urban populations. However, the availability of these benefits is not uniform since each state has the option to decide its level of participation, and the way in which it will distribute costs between levels of government. Hence, for example, Arizona does not participate in the Medicaid programme at all, whereas New York in 1978 provided nearly one-and-a-quarter million persons with $281 million worth of Medicaid benefits. Although the technology and personnel in American health care are of a very high standard, lack of co-ordinated planning has resulted in a maldistribution of funds and services and a rapid increase in costs which has hit the central cities particularly hard.[20]
Poor housing is an endemic problem in the inner city. Ten years after the Kerner Commission Report, a study found a 13 per cent increase in overcrowded and substandard housing units, and also found that the number of households suffering some form of housing deprivation had increased by two-and-a-half times, to 16.8 million. Housing segregation has not diminished, and racial minorities are still about three times more likely to occupy substandard housing than whites. Six main reasons have been cited for this continuing housing crisis: partisan politics have prevented comprehensive planning at the federal level; government-subsidized low-income housing programmes have been subject to widespread fraud resulting in the building of substandard housing; changes in federal funding programmes have allowed local governments to shift spending priorities away from low-income housing; the federal government has always underinvested in home-building; the 1970s have seen rapidly increasing housing costs in a period of recession, undermining families’ ability to afford decent accommodation; and earlier estimates of housing need were mistakenly set too low.[21]
Selective redevelopment of inner-city housing has in fact taken place. Over a hundred neighbourhoods in the thirty largest cities now have some areas of ‘gentrification’, as increases in suburban housing prices and in commuting costs make inner-city living more attractive. Blocks of deteriorating housing have been renovated, mainly by young, childless professionals who have moved there from another part of the inner city to take advantage of the area going up-market. Although privately financed urban improvement may be encouraging, it can work to the disadvantage of the poor. The renovation of dilapidated housing by private-market operators involves the removal of the former low-income occupiers. As an area becomes gentrified, perhaps by the efforts of a few individuals, landlords may take the opportunity to evict low-income tenants, renovate the property and sell or rent at a much higher return. As an area improves the land taxes are reassessed and building codes are enforced so that owner-occupiers on low or fixed incomes find that they can no longer afford to maintain their homes. Such forces can lead to dramatic changes, as for example in Washington, D.C., where the Capitol Hill area has changed from being a mostly poor and black area to having a population which is 80 per cent white and predominantly professional. While the housing stock of a neighbourhood is improved by gentrification, the erstwhile inhabitants just move on to other slums.[22]
Racial isolation in schools and standards of educational achievement have been among the most widely debated topics in the urban public services during the past two decades. Although the United States Supreme Court declared in 1954 that laws which segregated education on racial lines were unconstitutional, the majority of American children still attend schools which are ‘single-race’. This racial isolation is particularly pronounced in the large urban schools where segregation ratios (the proportion of pupils who would have to move to achieve a racial balance) are often 70 per cent and in some cases as high as 90 per cent.[23] Given the heavily segregated housing patterns of the cities, the sheer logistical problems of achieving racial balance in schools are immense. Since active federal support for desegregation waned during the 1970s, there seems little prospect of any great progress in the near future.
There is still much debate as to whether desegregation affects achievement in schools, but the general level of achievement is another cause for concern. Achievement levels in the inner cities are lower than the national average, and in recent years achievement scores in secondary schools have declined. A study of many cities, including New York City, Los Angeles, Miami and Baltimore, found sharp falls in reading, comprehension and mathematics test scores over the period 1966 to 1974.[24] The urban education system still appears to be failing to provide an adequate service.
This complex of problems has contributed to the growth of violence and crime in the central cities. A survey conducted by the New York Times in 1977 asked, “Is there any place within a mile of your home where you would be afraid to walk at night?” Seven out of every ten respondents answered in the affirmative. The reason for this public alarm was obvious, for in the previous year “there were 1,622 homicides in [New York] city … . The city’s homicide rate was 20.5 murders for every 100,000 residents … . Detroit has a homicide rate of 49.3 murders for each 100,000 persons.”[25] Crime rates in cities are higher than the national average for many categories of crime, including murder, robbery and assault, and the rates increase as the city gets larger, cities over 250,000 having crime rates double those of other cities. There is therefore a demand for greater police protection, especially by racial and ethnic minorities, who are much more likely than whites to be the victims of crime.[26] Public concern over the quality of city policing is widespread. Brutality, corruption and racial prejudice have on occasion been evident in police behaviour, undermining the trust between the community and the police. The characteristic reaction of the police force has been to close ranks against the public, treating all accusations as attacks on law enforcement. The police have become increasingly unionized and militant, and are emerging as a significant political force in many cities; indeed, black mayor Carl Stokes of Cleveland, Ohio, chose not to run for re-election in part because of the opposition of the police.[27] Some police forces have made substantial efforts to improve community relations, for example by recruiting officers from minority groups in the city, but such efforts often meet with considerable opposition from serving policemen and the public. In a situation where civilians are suspicious of police motives and efficiency, and where the police are in their turn sceptical about the extent of public support in their dangerous and difficult jobs, the morale of both groups has fallen and the fear of crime has increased.
The failure of urban services to provide an adequate level of care and protection often results from the special burdens put on these services, owing to the concentration of the poor and racial minorities in the cities. For example, high levels of unemployment mean high expenditures on public assistance to support this concentrated dependent population, and city budgets frequently cannot afford to bear this extra burden. If city services are not always geared to meet the demands made upon them by the massive concentrations of people suffering hardship, this is at least in part because of the financial difficulties cities face.
3. Financing City Governments
Public facilities and services will always be expensive in a city, since the costs of land, labour and building are higher than elsewhere. High urban crime rates increase the cost of emergency services, the concentration of population puts extra burdens on public utilities, and urban streets are very heavily used. In fact the expenditures of central cities on services such as fire and police protection, water and sewerage services, garbage disposal, highway maintenance and recreation facilities are double those in suburban areas. But, in addition, the disproportionate concentration of poverty, unemployment, and disadvantaged racial and ethnic minorities in the cities results in many of the usual functions of local government becoming even more expensive. Welfare payments, subsidized medical care and subsidized housing must all cost more in areas with high proportions of poor people if adequate levels of these services are being provided. In areas of high unemployment, training and job-creation schemes are a priority, and need money spending on them. Similarly, education is likely to be more expensive when the children come from socially deprived communities and need greater than average help to achieve their full educational potential. Hence the amount spent on education in cities exceeds suburban expenditure, even though school costs generally account for less than 40 per cent of central-city expenditure compared with the 60 per cent devoted to education by the suburbs. In many countries nowadays these problems would be considered a national responsibility, but in the United States a strong tradition persists of regarding them as local problems to be solved locally. Hence even though an increasing share of the costs is borne by state and federal governments, the main financial burden continues to fall on the cities’ municipal governments.[28]
How a municipal government faces up to these demands and burdens depends largely on which groups control a city’s policy—those who need the public services most, or those who have to foot the bill. To a remarkable extent the direction of city policy reflects the political structure it has inherited. In the late nineteenth century, city government generally followed the traditional arrangement with a mayor and councillors elected from wards, but not all mayor-centred governments were successful. Cities with institutionally weak mayors suffered from a lack of policy direction, while in some cities where the mayor had sweeping powers it was difficult to check the excesses of political leadership. Accusations of corruption and inefficiency in the early part of the twentieth century encouraged the adoption in some cities of alternative forms of government, notably the council-manager form and the commission system. In the former, a council was elected at large from the city, using a non-partisan ballot. This council appointed a manager whose job it was to attend to the administration of the city, and to carry out the council’s policies. The commission system also employed the non-partisan ballot and at-large election, but the electorate chose a person to head each of a number of Commissions, or administrative departments. Each elected Commissioner had independent responsibility for one department, and the Commissioners together formed the city’s legislative body. Cities which have grown up during the present century have had all these models to choose from, as have the increasing number of suburban governments which have been incorporated in recent years; by contrast, old regional centres and early cities have had long-established governmental structures and functions which have tended to persist. Similarly, the larger the city, the more likely it is to retain the mayor-council form of government, as Table 4 shows.[29]
Table 4
Form of government in cities, by size of population: 1972.

| Form of government | 1m+: No | % | 0.5m-1m: No | % | 0.25m-0.5m: No | % | 0.1m-0.25m: No | % | 0.05m-0.1m: No | % |
|---|---|---|---|---|---|---|---|---|---|---|
| Mayor-Council | 6 | 100 | 15 | 75 | 13 | 43 | 38 | 39 | 94 | 37 |
| Council-Manager | | | 5 | 25 | 14 | 47 | 51 | 52 | 142 | 56 |
| Commission | | | | | 3 | 10 | 9 | 9 | 14 | 6 |
| Other | | | | | | | | | 5 | 2 |
| All | 6 | 100 | 20 | 100 | 30 | 100 | 98 | 100 | 255 | 100 |
In ‘reformed cities’ the influence of social and ethnic minorities tends to be small, and that of the middle class predominates; in the older, larger cities local politicians are more responsive to deprived social and ethnic groups. As a result, the level of public spending and the number and scope of services generally vary with the age of the city. Newer cities are narrower and more specialized in their service functions: frequently some services, such as garbage disposal, may be left to private enterprise, and others, like the provision of mains water, to a private licensed monopoly. Older cities, particularly the large cities of the North East and North Central states with their history of industrial development, are more comprehensive in the services they maintain, and central-city governments in general support a wider range of services than the suburbs.[30]
Furthermore, the policy choices of municipal governments are restricted by the limitations imposed by their state government. After all, municipal corporations owe their origin to, and derive their powers and rights wholly from, the state legislature; they can thus exercise only powers which are expressly granted them by the state, or implied by the express powers, or which are judged essential to the operation of the corporation. This blanket restriction also limits the powers of local governments to raise money, since any local innovation in taxation or borrowing policy must have state authorization. Since 1961 reapportionment has increased the representation of central cities in state legislatures, yet, even so, state legislatures are often dominated by rural and suburban interests which are frequently unsympathetic to the problems of the central city and devoted to the principles of economical government.
The only major local tax allowed by all state governments is the property tax, which is similar to British “rates”, and over 80 per cent of local-government tax revenue comes from this source. However, property tax is inelastic. Except in a rapidly developing city, the valuation of property and the revenue accruing are unlikely to keep pace with the rising costs of public services. This is especially hard on big cities, where demands for services have increased most in recent years. These cities have therefore led the way in trying to develop local government income from alternative sources, and in persuading state governments to agree to the changes. As a result of the increasing use of local sales taxes (levied on most retail sales) and income taxes, cities with over 300,000 population had reduced their dependence on property taxes to about 60 per cent of local-government tax revenue by 1974. Furthermore, these large cities have developed local non-tax resources, such as user charges (fees for public facilities) and licence charges, until by 1974 they were providing almost one-third of local revenue. However, since only about half the states allow a local general sales tax, and only a minority (ten states in 1973) allow local income taxes, a large number of cities are unable to tap alternative sources of revenue.
As a result, most cities must still rely heavily on non-income taxes. This has two major consequences. Firstly, the overall tax system is regressive: the poor indirectly pay a higher proportion of their income in taxes than do the affluent. Secondly, though federal income-tax receipts respond automatically to rises in national income, local tax sources are not so responsive. Hence, “each 1 per cent rise in GNP raises federal revenue by 1.5 per cent, but city revenues by only 0.5 per cent. With city spending rising so much faster than GNP, this spells financial trouble.”[31] Consequently many city governments have difficulty in meeting their budgeted requirements, as the cost of labour-intensive city services steadily rises in all inner cities. Even a fiscally conservative city, committed to keeping costs to a minimum, may find that merely keeping services at the same level from year to year necessitates increased expenditure. Oakland, California, a medium-sized city with a conservative approach, has found an annual 8 per cent growth in expenditures necessary to maintain the previous levels of service provision.[32] One long-established practice is to raise revenue by the issue of municipal bonds. Purchasers of these bonds in effect lend money to governments, generally at high rates of interest, which is often tax-free. State governments often impose limits on how many bonds a municipality may issue, or require local referenda on bond sales, to restrict the degree to which the device is used. The gross amount of municipal debt has increased considerably in recent years, but the debt-service payment (roughly equivalent to the interest charges payable by the municipality) in all state and local government has remained stable at about 20 per cent of general revenue. This general average hides the fact that some local governments particularly in need of cash have committed themselves more heavily to this kind of debt, which now costs them a large proportion of their revenue; for example, New York showed an increase in its debt-service payment to bondholders from 35 per cent of general revenue in 1966 to 44 per cent in 1973.[33]
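The arithmetic behind the warning just quoted is easy to make concrete. The short Python sketch below is purely illustrative: the elasticity of 0.5 is the figure cited above, while the 4 per cent GNP growth rate and the 8 per cent spending growth (the Oakland figure) are assumptions chosen only to show how quickly a gap opens between a city’s revenue and its outgoings.

```python
# Illustrative sketch of the revenue-elasticity problem quoted above.
# Assumptions: 4% annual GNP growth; city spending starts level with
# revenue but must grow 8% a year to maintain services (the Oakland figure).

gnp_growth = 0.04          # assumed annual GNP growth rate
city_elasticity = 0.5      # each 1% rise in GNP lifts city revenue 0.5%
spending_growth = 0.08     # annual growth needed to maintain services

revenue = spending = 100.0  # index both at 100 in year 0
for year in range(1, 11):
    revenue *= 1 + city_elasticity * gnp_growth
    spending *= 1 + spending_growth
    print(f"year {year:2d}: revenue {revenue:6.1f}  "
          f"spending {spending:6.1f}  gap {spending - revenue:5.1f}")
```

On these assumptions revenue grows at only 2 per cent a year against spending growth of 8 per cent, so after a decade the annual shortfall approaches the size of the original budget.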
Table 5
Intergovernmental aid to municipalities, by source of aid: 1965-1977

Source of Aid               |     1965      |     1970      |     1977
                            |    $m      %  |    $m      %  |    $m      %
----------------------------|---------------|---------------|---------------
State Governments           |  2,475   17.3 |  6,173   23.2 | 14,236   23.4
Federal Revenue Sharing     |      -      - |      -      - |  2,380    3.9
Other Federal Aid           |    789    5.0 |  1,733    6.5 |  6,537   10.7
Other Intergovernmental Aid |      -      - |      -      - |  1,023    1.7
All Intergovernmental Aid   |  3,534   22.3 |  7,906   29.7 | 24,176   39.7
Total Municipal Revenue     | 15,881    100 | 26,621    100 | 60,924    100
Given the limitations which exist on the powers of city governments to solve their fiscal problems, it is understandable that they should look to higher levels of government for aid. State and federal aid to cities has increased substantially in recent years, as can be seen from Table 5:[34] intergovernmental aid now accounts for almost 40 per cent of total city revenue, having risen from 22.3 per cent in the mid-sixties. In the larger cities this dependence is especially high; in 1977 cities of over 500,000 population received 30.2 per cent of their general revenue in the form of state aid, although some of this money comes from the federal government, being channelled to the cities via the state government. In 1972 total state aid averaged about 60 per cent of the revenue local governments raised from local sources, but this figure hides a wide variation, ranging from 100 to 150 per cent in New Mexico, North Carolina and South Carolina down to 12 to 25 per cent in Massachusetts and South Dakota.[35] Similarly the purposes for which aid is given vary widely between states, though over 80 per cent of state support for local governments is for education, public welfare and highways.
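The aid shares quoted here follow directly from the Table 5 totals; a short Python check (figures in $m, taken from the table) reproduces them:

```python
# Intergovernmental aid as a share of total municipal revenue,
# recomputed from the Table 5 figures (in $m).
table5 = {1965: (3_534, 15_881), 1970: (7_906, 26_621), 1977: (24_176, 60_924)}

for year, (aid, total) in table5.items():
    print(f"{year}: aid = {100 * aid / total:.1f}% of municipal revenue")
# prints 22.3%, 29.7% and 39.7%, matching the percentages in the table
```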
These wide variations make it difficult to generalize about the benefits given by state aid. But undoubtedly the willingness or otherwise of state governments to become involved in the support of local government affects the ability of city governments to carry out their functions effectively, and while intergovernmental aid has helped cities maintain essential services, state regulations can make the cities’ financial problems even worse. For example, the raising of revenue for the Aid to Families with Dependent Children programme (AFDC) and Medicaid payments is a state responsibility, and only twenty-one states require local-government cash contributions for these purposes. In sixteen of these states local governments bear less than 10 per cent of Medicaid and AFDC costs; in Minnesota it reaches 21.8 per cent, while New York demands the highest local contribution of any state at 23 per cent. In this way the regressive policies of New York State regarding the division of responsibility for providing welfare and health-care services have put a particularly heavy burden on local governments in that state, and as a result in 1975 New York City had to raise about one billion dollars to pay its welfare-related expenditures.[36] The example of New York also demonstrates how intergovernmental aid, even when essential, can place an ailing city under unwelcome restrictions. The City’s fiscal problems became critical in April 1975 when its municipal bonds found no buyers, and it was saved from defaulting on its debts only by a series of emergency aid payments from the federal government. One of the conditions for this aid was that control over the city’s finances be placed in the hands of a state-dominated Emergency Finance Control Board. This probably ensures that the city’s ability to borrow money depends on its willingness to undergo sharp cuts in the city budget. Yet at the same time the maintenance of New York City’s services at an adequate level requires even greater expenditure. A recent report claims that the streets, water, sewer and transit systems are in a severe state of disrepair, and that this physical plant “needs a significantly increased rate of investment in maintenance and replacement if serious problems are to be avoided in the coming decades.”[37] This is an extreme but telling example of the central cities’ dilemma—the necessity on the one hand to provide services and on the other to maintain budgetary policies acceptable to their paymasters, be they creditors, other governments, or taxpayers.
In recent years the level of local taxes has led to a number of ‘taxpayers’ revolts’, the most dramatic of which was the passing in June 1978 of California’s Proposition Thirteen. This was the result of a public petition to the California state government, known as an initiative proposal, demanding that a proposition imposing major local tax cuts be put to the vote at a state-wide election. The overwhelming two-thirds majority received by Proposition Thirteen, together with similar initiative victories in Nevada and Idaho, caused widespread fears of an era of enforced austerity which would hit the fiscally distressed areas of the nation hardest. The initial indications, however, are not of sweeping national cuts. In the 1978 elections half the referenda proposing limits on taxation and government spending were defeated, while in 1980 a further tax-cutting initiative, Proposition Nine, was defeated in California by a 61 per cent majority—almost as great as that which had passed Proposition Thirteen two years earlier.
In 1979 fiscal problems in Cleveland, Ohio, reached a critical point when the city defaulted on its bond payments—the first time a major city had done so since the 1930s. Subsequently a referendum to increase the personal income tax levied by the city was passed by a margin of two to one. This has to be set against the experience of Cleveland in 1970 when black mayor Carl Stokes supported one tax reform proposal, the white-dominated council presented a separate plan, and the racially divided electorate rejected both, precipitating a financial crisis in the early seventies. The inability to agree on an earlier solution certainly contributed to Cleveland’s 1979 crisis, but the local reaction in voting for increased taxes suggests that the trend favouring limitations on tax gathering is not uniform. However, the frequent shifts in local opinion give politicians no solid base from which to plan, and the solution to Cleveland’s problems is unlikely to be found in reforms which do not involve long-term planning and the co-operation of neighbouring governments.
America’s metropolitan areas contain sectors of considerable wealth and burgeoning prosperity, yet central cities often face the anomaly of high demand for services combined with inadequate fiscal resources. To resolve this problem, resources need to be made available to the governments whose need is greatest. That this rarely happens is a direct consequence of local-government fragmentation and the politics of self-interest.
4: The Politics of Governmental Fragmentation
In its 1976 report on Improving Urban America, the Advisory Commission on Intergovernmental Relations explains how:
local governments were never created on any rational or planned basis. Counties and townships were created to serve as decentralized arms of colonial and then state governments, and cities and towns were added as residents sought to incorporate to achieve representation and self-determination of local policy matters. Special districts were added to fill in the service cracks not provided for adequately by existing general purpose governments or, in the case of schools, because of the 19th century belief that politics could thereby be removed from educational policy making.
In this brief history of American local government one finds the roots of urban America’s political fragmentation. The various local governments were not formed as part of any rational and efficient system; each individual government, with its own specific function, was established in response to a particular local need at a particular time. Once a government has been established it is difficult to dismantle; even though metropolitan society has become increasingly interdependent, the metropolis remains a patchwork of “competing, overlapping, uncoordinated, independent political units.”[38]
America’s Standard Metropolitan Statistical Areas in 1972 contained 22,185 different local governments. Sixty-five of these metropolitan areas contained over one hundred local governments each, while thirteen of the largest SMSAs had over 250 local governments each. According to the Advisory Commission, “Although wide variations [are] apparent the ‘typical’ SMSA had two counties, thirteen townships, twenty-one municipalities, eighteen school districts, and thirty special districts.”[39] This tangle of governments has been eased in recent years by extensive consolidation of school districts into larger units, but the number of special districts has increased—from 7,569 in 1967 to 8,054 in 1972. Special districts, like school districts, are generally responsible for one function only, for example fire protection, cemeteries, hospitals, sewerage or water supply. These districts were set up to provide a service for some geographical region, often where the existing general-purpose governments would not, or could not, provide that service. But, unlike school districts, special districts cannot all levy taxes; about half of them are limited to such sources as service charges, although almost all of them can receive grants from other governments.
The complexity of this governmental maze is increased by the fact that the borders of the various bodies are not necessarily coterminous. Special districts may provide their particular function in areas overlapping two or three counties, townships and municipalities. As a result a community may find itself subject to layer after layer of government. The average central city has more than four layers of local government, and in some metropolitan areas there are many more. For example, the 16,007 people living in Whitehall, Pennsylvania, in 1972 were subject to nation, state, and the following authorities: Air Quality Control Region, Southwestern Pennsylvania Regional Planning Commission, Western Pennsylvania Water Company, and Allegheny County; the county’s Port Authority, Criminal Justice Commission, Soil and Water Conservation District, and Sanitary Authority; the City of Pittsburgh, the South Hills Area Council of Governments and Regional Planning Commission, Pleasant Hills Sanitary Authority, and the Baldwin-Whitehall Schools Authority and School District.[40]
General-purpose local governments, incorporated under the terms of state law, have continued to be established since 1945, especially in the suburbs, where independent municipalities have been incorporated or existing government boundaries strengthened. Between 1967 and 1972, 148 new municipalities were incorporated within SMSAs, and in 1973 eighty-two more were added, bringing the total to over 5,400. Many of these local governments are small: almost two-thirds of them had fewer than 5,000 residents and about 30 per cent had fewer than 1,000 in 1972. The governments have often been formed by small, homogeneous communities, all the better to serve their own interests. Amongst their priorities will be the desire to preserve the character and homogeneity of the community by developing political and legal borders between the community and any neighbouring areas with urban problems.
The rapidly growing suburban community does not wish to see the social and economic problems of the inner city developing within its own territory, and it has a fine defence against this in zoning laws. Local governments are able to pass legislation on land use and building standards within their jurisdiction. This local discretion has been used by suburbs to maintain their homogeneity. By permitting the construction only of single-family dwellings on large plots, or by prohibiting multiple-unit housing or prefabricated housing, a local government may use its zoning powers to fix a minimum cost on a home in that community, thereby determining the socio-economic make-up of the population. For example, in 1971 in Westchester County, outside New York City, about 94 per cent of the land was zoned for residential uses and 99 per cent of that for single-family homes, one-eighth with minimum lot sizes of four acres.[41]
There is little incentive for municipalities to co-operate with neighbouring governments. The local government’s aim must be to provide an attractive range of services at reasonable cost. Low taxes may be an asset, but high taxes in conjunction with, for example, good educational facilities may be even more attractive to a middle-class family. It is up to the local government to find that combination of services which serves best to attract residents and development beneficial to the community. The fragmentation of local governments puts each local-government package ‘on sale’ to the potential residents or industrial and retail investors, and therefore places neighbouring local governments in competition with each other to attract the ‘best’ available consumer. In such a situation, any aid to other local communities would serve to increase the costs on a local government at the same time as benefiting a competitor.
Because expensive services and high-cost citizens are geographically localized, the fragmentation of local government gives people who can afford it the opportunity to isolate themselves from the problem. High costs necessarily translate into high local taxes; the local-government system ensures that these costs are confined within a given area. A relatively minor relocation may put a citizen into a different municipality, and thereby beyond the reach of the central cities’ high tax rates. It is the inner-city citizen who has to bear the cost of supporting inner-city governments. The suburbanite, dependent upon the inner city for his or her livelihood, can retreat behind local-government barriers to avoid paying for services essential to the maintenance of the central city. The artificial political barriers within the metropolis encourage a governmental autonomy which flies in the face of the reality of metropolitan interdependence. The many overlapping and competing governments hinder the prospects for metropolitan planning in the interests of the metropolitan area at large rather than to serve the ends of a particular locality. Even where the motivation for co-operation exists, it is not easy to achieve amongst this network of political boundaries. An added problem is suffered by those cities which straddle state lines; in 1970 there were thirty-two metropolitan areas in such a position. For example, Quad City is made up of Davenport in Iowa, and of Rock Island, Moline and East Moline in Illinois; here no fewer than 252 interjurisdictional agreements have had to be made on such matters as law enforcement, transportation, health, sewage disposal, and parks and recreation.[42]
The limitations imposed by the fragmented metropolitan structure have contributed to the inner cities’ fiscal crisis. As the cost of city services has increased, many families have moved to suburban communities where the tax burdens may be lower, and a higher proportion of local funds remains for education and related services. Left in the inner cities are those too poor to move. As a result of this movement, between 1970 and 1977 the cities lost individuals and families with an aggregate income of $48 billion. In the four years to 1974 the loss in rents was equivalent to the destruction of almost one-and-a-half million residential units, and the loss in food purchasing was equivalent to the closing down of a thousand supermarkets.[43] Moreover, the annual rate of income loss from the cities is still accelerating. This presents a picture of long-term shrinkage in central-city property values and consumer sales which will, in their turn, have an impact on local property-tax receipts and sales-tax revenue, and must eventually lead to reductions in city services. This loss of revenue puts a yet larger burden on the taxpaying citizens remaining in the central cities, as the urban services still have to be maintained; hence, local tax rates must be raised to maintain existing services, thus exerting still more pressure on families with the wherewithal to move away from the cities. Yet many cities have no alternative but to follow this self-defeating course of action.
The consolidation of small local governments into larger metropolitan units has been widely proposed in order that “the wealth, power and credit of the area as a whole may be mobilized for the solution of the over-all problems of the area.”[44] The aim is to reduce the number of small, independent local governments, to create larger, more mixed taxing regions, and so to enable inner cities to increase their tax base and obtain more generalized control over metropolitan development. If local-government boundaries are rationalized in such a way that the gross disparities in tax burden and levels of service no longer exist between local-government units, the incentive to migrate to lower tax areas should be reduced.
Since 1945 such large-scale metropolitan consolidation has been implemented in a few states. One method has been to allow inner cities to annex the surrounding urbanized areas. In some cases it has taken the form of city-county consolidation, as for example in Nashville-Davidson, Tennessee (1962) and the Anchorage-Greater Anchorage Area Borough, Alaska (1975). In the Tidewater region of Virginia a series of mergers of counties, cities and towns has not produced area-wide government consolidation, but the SMSAs concerned now each contain only six units of local government, all of viable size.[45]
However, more proposals of this kind have been lost than won. People who benefit from the differences in tax levels within metropolitan areas rarely support plans for annexation or consolidation of tax-levying local governments. If the area surrounding the central city is made up of already incorporated and independent communities, then annexation becomes very difficult since a majority vote in favour of annexation would be required from each of these surrounding municipalities. The power to annex their way out of problems is denied most particularly to older cities, surrounded by long-established, independently incorporated suburbs. Some newer cities of the South and West, where suburban settlement and incorporation have had less time to develop, have proved more able to annex surrounding territories, and some states, for example Texas, Oklahoma and Virginia, have given cities greater powers of annexation and have limited the rights of new suburbs to become independently incorporated when adjacent to city boundaries. Nevertheless, over 70 per cent of suburbanites live in communities with strong, well-established governments, and are therefore able to resist being swallowed up in some central city tax area.[46]
Opposition to the expansion of city-government boundaries does not come only from suburbanites. Political leaders from the inner city may fear for their power base if the electorate becomes more diverse. Black political leaders have been elected to office in increasing numbers as migration patterns have produced concentrations of black voters in central cities. Black mayors have been elected in such major cities as Atlanta, Detroit, Newark, Gary and, for a time, Cleveland, and the number of black council members and other officials has been increasing. Although many of these victories have been in cities with severe problems, they have provided the black community with a nationally recognizable political leadership. Many black politicians fear that metropolitanization of local governments will dilute this black power, and that any benefits gained through the increased availability of tax resources will be offset by loss of power over the distribution of those resources. One such case, in Virginia, has led to that state instituting a partial moratorium on annexation.[47]
State governments can help central cities in other ways also. For example, the states which require local-government contributions towards the cost of the AFDC welfare programme levy the contributions at the county level, thus forcing some suburbs to share their neighbouring central city’s costs, though this is of no help to large cities like New York whose borders are coterminous with county boundaries. More usefully, the state governments sometimes assume the responsibility for some functions traditionally the preserve of local government. For example, several states such as Alabama and Montana handle public assistance on a state level, while in Hawaii education, usually the largest individual item in a local government’s budget, is a state function.[48] By assuming responsibility for such functions, the state is able to relieve inequities stemming from differences in resource capacity between local governments, but this option has not met with widespread favour.
The fact is that Americans remain attached to their traditional belief in the value of strong, responsible local governments close to the electorate. Even the incredible fragmentation of city government has its defenders, who emphasize its efficiency and democratic nature. A fragmented system gives the city-dweller multiple points of access to the decision-making structure, and the variety of governments within a metropolis allows the urban family to maximize its preferences by moving to the area of its choice, complete with its ‘package’ of governments, services and taxes. Furthermore, this division of power fits in with generally held American beliefs in pluralism and individualism and hostility to concentrations of government power.[49] Yet the fact remains that a local-government system divided within itself is highly detrimental to the needs of the central city.
5: Sunbelt Cities and National Policy
Problems of cities can also be viewed in the context of national and international economic and social trends. With technological and industrial developments the optimum location for industrial expansion may change. The Southern and South Western states—the Sunbelt—have been the latest beneficiary of such change; the oil and petro-chemical industries, for example, have brought tremendous prosperity to such cities as Houston and Dallas. The Sunbelt has also become increasingly accessible to investors with the development of an efficient highway system, a highly developed trucking industry and the domestic air service. The environmental attraction of Sunbelt climate is evident (once air conditioning is available!), and the South has extra incentives for the employer, for land is relatively inexpensive, the cost of living low, and the trade unions weak. The incentives for relocation had existed for some time, but post-war developments have made relocation profitable, at a time when investment decisions are increasingly made in terms of an international market.
These regional job shifts have especially harmed the old cities of the North East and North Central United States. Between 1970 and 1975 employment in the West grew by 30.1 per cent, and in the South by 35.8 per cent, while the North Central area had a growth rate of only 14.7 per cent, and the North East actually lost 0.6 per cent of its jobs. This trend is even more marked in manufacturing employment. The 1960s saw rapid growth in this sector in Southern and Western states while manufacturing employment in the North remained stable. In the early 1970s the United States lost almost one-and-a-half million manufacturing jobs, all from the North East and North Central areas. Between 1965 and 1972 New York City lost nearly 16 per cent of its jobs and Philadelphia lost 17 per cent.[50] Thus the North Eastern cities have been dealt a double blow. New areas have been opened up to compete for new investment, and the manufacturing industries on which the old cities developed have recently gone into national decline. The new industries offer employment primarily to skilled and white-collar workers; the older central cities therefore suffer competition for their more affluent workers while being left with the unskilled, whose opportunities remain restricted to the declining manufacturing sector—and to a public-service sector contracting as a result of budgetary stringency.
These regional shifts are largely the result of corporate business decisions taken on a national or international level. Obviously local governments have no jurisdiction at such levels; only the federal government could possibly design policies aimed at balancing the national benefits of urban growth against the problems inherent in the regional distribution of that growth. Local governments are in competition for those benefits, and have little incentive to co-operate with each other as each aims to maximize its tax base and minimize its problems. Only at the federal level could one expect to see the development of a national urban policy.
Comprehensive national policies covering any issue are not common in the United States, where each party’s Congressmen span a broad ideological spectrum and regional and constituency interests play a very large part in Congressional decision-making. Urban policy has proved to be no exception to this rule. The federal Department of Housing and Urban Development (HUD) was not created until 1965, even though a majority of the United States population had lived in cities as early as 1920. Even now many programmes with an urban bias are not controlled by HUD but by the Departments of Health and Human Services, Education, Commerce, Labor and Transportation; the problem of co-ordinating federal policy still remains. The Housing and Urban Development Act of 1970 was amended in 1977 to require a biennial ‘National Urban Policy Report’, the first of which appeared in August 1978. This followed the publication in March 1978 of the report of President Carter’s Urban and Regional Policy Group, which was the first ever Executive statement of a national urban policy.[51]
The federal government has been giving financial aid to cities ever since the 1930s, but essentially in an ad hoc, unplanned way. This has commonly taken the form of categorical grants or block grants, which are awarded for use in more or less specified policy areas and may require the expenditure of local funds to qualify. In recent years there has been a shift of emphasis from categorical grants towards block grants, which allow greater local discretion in the use of funds, but there still exist over 600 categorical-grant programmes administered by thirty different federal departments and agencies. A new method of fund allocation—revenue-sharing—was introduced by the State and Local Fiscal Assistance Act in 1972, which authorized the distribution of money to state and local governments according to a fixed formula which takes account of population, local taxes and per capita income.[52] Allocations are made according to the formula, thereby avoiding the expense and complications of individual applications.
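Since the formula takes account of population, local taxes and per capita income, the mechanics of such a formula-driven allocation can be illustrated in a few lines of Python. The sketch below is a deliberate simplification, not the Act’s actual calculation, and the three governments and all their figures are invented for the example.

```python
# Simplified, illustrative revenue-sharing formula: each government's share
# is proportional to population x tax effort x income factor, where tax
# effort is local taxes per dollar of personal income and the income factor
# weights poorer areas more heavily. All sample data are invented.

governments = {
    # name: (population, local taxes in $m, per capita income in $)
    "Central City": (500_000, 300, 4_000),
    "Suburb A":     (100_000, 40, 7_000),
    "Suburb B":     (50_000, 15, 8_000),
}

NATIONAL_PCI = 5_000   # assumed national per capita income ($)
POOL = 100.0           # assumed $m available for distribution

weights = {}
for name, (pop, taxes_m, pci) in governments.items():
    tax_effort = (taxes_m * 1e6) / (pop * pci)  # taxes per $ of income
    income_factor = NATIONAL_PCI / pci          # favours low-income areas
    weights[name] = pop * tax_effort * income_factor

total = sum(weights.values())
for name, w in weights.items():
    print(f"{name}: ${POOL * w / total:.1f}m")
```

On these invented figures the central city, with its high tax effort and low incomes, draws much the largest share, which is the redistributive intent of such formulas.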
There remains a maze of regulations and requirements to be negotiated in order to qualify for the maximum available funds. This encourages ‘grantsmanship’ on the part of local governments, sometimes in an unfortunate way: federal qualifying criteria may encourage a local government to take decisions which are not in the best interests of the community, but which will produce the grant. The competing, overlapping, complex grant programmes are evidence of Congressional unwillingness or inability to develop an overall urban fiscal policy. Revenue-sharing goes some way towards redistributing resources towards areas of need, but because maxima and minima are imposed on the amounts available to each government, the central cities are not given fiscal benefits equivalent to the share of the social and economic problems which they suffer. Even so, many cities with high concentrations of social problems are kept out of bankruptcy only by heavy dependence on federal aid.[53]
Actions of the federal government have affected urban development in other important ways. While a federal urban policy has not been thought out, this does not mean that the policies of the United States government have not had certain tangible outcomes. Uncoordinated and separate policies designed to attack a wide cross-section of urban problems can be seen to have combined to provide an unintentional overall effect. For example, the interstate highway programme has provided financial incentives to local governments to construct highways; much tax-paying property was destroyed in order to build these roads, which themselves then facilitated the relocation of affluent home-owners and local industry outside city boundaries. Highway building could therefore reduce a city’s tax base while leaving the city with major maintenance expenses.[54]
Other federal programmes also have encouraged post-war suburban development at the expense of the inner cities. The Housing Acts of 1949 and 1954 are famous for their slum-clearance programmes; urban renewal was in fact only a small part of the national housing policy, and its effectiveness in providing low-income housing has been seriously doubted.[55] At the same time in the early 1950s Congress actually reduced the public-housing authorization almost every year, while extending the ability of the Federal Housing Administration (FHA) to provide mortgage insurance. Besides releasing money into the economy for house purchase, the FHA aided large builders with credit facilities, channelling federal money into profit-making suburban developments and away from high-risk central city areas. The FHA policy of designating (or ‘redlining’) urban areas within which loans were considered too risky virtually ensured further decline, while federal income-tax deductions subsidizing home ownership provided further incentives for suburban developments.[56]
A similar ‘accidental’ national urban policy encouraged the development of the Sunbelt. Federal aid has been available for road construction and for the provision and expansion of such facilities as water and sewerage, encouraging the construction of an urban infrastructure in nonmetropolitan areas. A further federal stimulus to the Southern economy has been the injection of government spending, particularly on defence-related projects. Defence contracts in 1975 provided an average $188 per person in the Sunbelt on salaries alone, as opposed to $54 per person in the Northern industrial states. Military spending policies essentially result in a redistribution of federal tax dollars from the cities of the North East and North Central areas to the South and West; according to one newspaper, “. . . in 1977 the average family of four in New York sent the Pentagon $1,800 more than the Pentagon pumped back into the New York area. The figures for Chicago, Newark, Cleveland and Rochester were comparable.”[57]
The assertion that federal disbursement policies aid the Sunbelt unfairly is disputed by some,[58] but as long as the Sunbelt has the benefit of cheaper labour, lower taxes and less expensive land, then competitive bidding will automatically favour it, and federal purchasing, when based on the need to get value for money, will naturally benefit areas like it. Only with the introduction of an ‘equity factor’—a formal bias in favour of disadvantaged cities—into bureaucratic decision-making could such a situation be prevented. Unless and until such intervention is put into effect, federal taxation and spending policies will take money from the economies of the North Eastern states while channelling it into the Sunbelt.[59]
The older cities are, of course, represented at the federal level because Congressmen have to lobby for the cities in their constituencies, but the federal government is nevertheless rarely pro-urban in outlook. For it to be so, there would have to be a high proportion of liberal Democrats in Congress and a Democratic President—a rare combination. A political division, cutting across party lines, exists between Congressmen of the Sunbelt and the Frostbelt (that is, the North Eastern and North Central states), and this works to the disadvantage of declining cities. Although in the House party affiliation still remains the most important predictor of a pro- or anti-urban stance, in the Senate regional cleavage has become as important as, sometimes more important than, party membership as an indicator of attitudes on urban policy. This is primarily due to the strong anti-urban stance of Southern Sunbelt Senators and the pro-urban stance of Eastern Republicans.[60] The 1980 census will be followed by a redistribution of Congressional seats which will increase Sunbelt representation as adjustments are made for population changes since 1970. Should regional sectionalism grow as a motivating force on policy decisions, the declining North East and North Central cities will find that they have even fewer lobbyists at the federal level.
Current economic and political trends are therefore clearly exacerbating the problems of declining Northern cities, but the cities of the growth areas are not without their own problems. While suffering less from the problems of decline, old housing and out-of-date capital equipment, growth cities have other difficulties. Even in relatively affluent regions the inner-city areas are generally in a less favourable financial position than their surrounding suburbs. In the South, no less than elsewhere, much of the recent investment has gone into suburban areas on the fringes of cities, especially as the generally weak local governments and minimal land-use regulations have made them particularly attractive to developers.
Even though the overall picture in the Sunbelt is one of economic and social well-being, the benefits of economic and industrial development are unevenly distributed within the population, and patterns of social disadvantage mirror those found in the cities of the North. Disadvantaged ethnic minorities are again concentrated in the central cities, but in addition to blacks the South has large communities of Spanish origin; Miami’s Hispanic (primarily Cuban) community makes up almost half that city’s population, while in the South West Mexican Americans outnumber blacks by two to one. An investigation of the rate of subemployment in eighteen North Eastern and Sunbelt inner cities found the latter to be slightly worse off, mainly because of the higher proportion of Sunbelt workers receiving substandard incomes.[61]
Much of the commercial boosterism of the South and West has stressed the low tax rates and minimum of local-government regulation, but this in itself can produce problems. Firstly, low taxes can support only minimal public services, which hit the poor hardest; secondly, minimal regulation has led to ‘urban sprawl’. This is an exceedingly inefficient use of land, especially as the costs of energy continue to rise. Just as the nineteenth-century cities of the North have become obsolescent, so this new Sunbelt pattern of land use may prove to have its own built-in obsolescence.
A growing city has to provide some services, but when a sprawling and unplanned city tries to introduce or upgrade a public service the logistical problems faced are enormous and the costs very high. A study in Santa Barbara, California, examined the costs in terms of city taxes of various levels of population growth. It was found that, in the twenty years up to 1995, a policy of ‘no growth’ would entail a 10 per cent increase in per capita property taxes to maintain service standards, even disregarding inflation. To develop and maintain services catering for an increase in population from 73,000 to 119,000 in the same period, each individual’s property taxes would have to rise by 45 per cent, while the maximum growth envisaged, to 170,000 population, would put taxes up by 58 per cent.[62]
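The study’s figures also imply even steeper growth in the city’s total tax levy, since total receipts are per-capita tax multiplied by population. The short Python check below makes only one assumption beyond the quoted figures: that the percentage rises reported are rises in per-capita property taxes, as the study’s framing suggests.

```python
# Implied growth in the total property-tax levy from the Santa Barbara
# figures quoted above: total levy = population x per-capita tax, so each
# scenario's levy multiple is (pop in 1995 / pop in 1975) x (1 + rise).

BASE_POP = 73_000
scenarios = {
    "no growth":       (73_000, 0.10),
    "moderate growth": (119_000, 0.45),
    "maximum growth":  (170_000, 0.58),
}

for name, (pop_1995, per_capita_rise) in scenarios.items():
    levy_multiple = (pop_1995 / BASE_POP) * (1 + per_capita_rise)
    print(f"{name}: total levy x{levy_multiple:.2f} over the twenty years")
```

Even the no-growth case implies a 10 per cent rise in the total levy; the maximum-growth case implies that total property-tax receipts would have to more than treble.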
The current high rate of investment in the Sunbelt states cannot be expected to carry on for ever. The state and local governments of the growth areas have as little direct influence on industrial investment as do those governments in declining areas. As long as decisions on investment are made on a private-enterprise basis, then the area which offers the best return to the investor will be the area of growth. When the growth area becomes obsolescent, its economic vitality will suffer and new investment will be placed elsewhere. In the South and West the indications are that this would leave central cities in poverty, suffering social problems similar to those of the North; the disadvantaged urban population has benefited only marginally from Sunbelt growth, and service provision in the sprawling Sunbelt cities may become an even larger expense than in the North. The impact of the problem of growth and decline is suffered primarily by the central cities, in the South and West as elsewhere, as taxes rise to accommodate in-migrants, to pay for public services that decline in efficiency while increasing in cost, and to finance incentives to business investment.
In conclusion, it is clear that the causes of the problems located in cities and the potential remedies lie outside any individual city’s province. The visible difficulties of social and economic deprivation, fiscal stress, ungovernability, and regional and industrial decline are the consequences of a complex combination of political, social and economic factors. There is no reason to expect simple and easy solutions to present themselves. Many Americans value highly the pluralistic nature of their society, the disjointed decision-making structures in political and social institutions; national regulation and direction are often perceived as inflexible and potentially stultifying. However, uncoordinated decision-making by individuals and institutions can lead to inequities. Whatever the arguments as to the value of this pluralism for a nation’s economy in the long run, it is clear that city problems are to a great extent consequent upon this approach to decision-making. There is no reason to believe that the best interests of suburbs, states, the federal government, major industrial and investment bodies and other significant decision-makers will combine fortuitously to produce policies in the best interests of the central cities. To maintain the fiction that those symptoms of distress found in cities are problems generated by cities themselves is to blame the victim. If the causes of distress in cities are to be tackled effectively, the co-operation of these various bodies is needed, in order to find solutions where the problems originate.
6: Guide to Further Reading
Those who wish to pursue the study of the modern American city beyond the limits of this pamphlet must turn to a number of disciplines. History explains the origins of many contemporary phenomena: Alexander B. Callow, Jr., ed., American Urban History, 2nd ed. (New York: Oxford University Press, 1973) and Zane L. Miller, The Urbanization of Modern America (New York: Harcourt Brace Jovanovich, 1973) both provide good, brief introductions to the subject. Geography also provides useful explanations of urban development, as for example in A. Pred, “Industrialization, Initial Advantage and American Metropolitan Growth”, Geographical Review, 55 (1965), 158-85, and J.R. Borchert, “American Metropolitan Evolution” (1967).[5] Another brief but useful introduction to the historical-geography approach is H.B. Rodgers, “A Profile of the American City”, in Dennis Welland, ed., The United States: A Companion to American Studies (London: Methuen, 1974), pp.119-51.
Many aspects of contemporary urban politics, both in structure and style, can be traced to the ‘machine versus reform’ battles of the early twentieth century. The character of Progressive reform is analysed in J.A. Thompson, Progressivism (1979), the second pamphlet in this series, while the heritage of these years is discussed in Otis A. Pease, “Urban Reformers in the Progressive Era: A Reassessment”, Pacific Northwest Quarterly, 62 (1971), 49-58. Of course, Progressive reform did not sweep away the political machine completely. The adaptation and survival of machine politics in Kansas City is the subject of Lyle W. Dorsett, The Pendergast Machine (New York: Oxford UP, 1968), while the second edition of Harold F. Gosnell, Machine Politics: Chicago Model (Chicago: Chicago UP, 1968) includes a Foreword by T.J. Lowi which draws comparisons between Gosnell’s original 1937 study and New York in the late 1960s. For the most commonly cited modern example of machine politics, see also Mike Royko, Boss: Richard J. Daley of Chicago (New York: E.P. Dutton, 1971) and Len O’Connor, Clout: Mayor Daley and his City (New York: Avon Books, 1975).
Leadership in other cities has been treated in a variety of ways. The works of Floyd Hunter, Community Power Structure (Chapel Hill: North Carolina UP, 1953), and Robert Dahl, Who Governs? (New Haven, Conn.: Yale UP, 1961), have perhaps done most to stimulate a continuing debate about the distribution of power in cities. Peter Bachrach and Morton Baratz, Power and Poverty: Theory and Practice (New York: Oxford UP, 1970), is based on a study of Baltimore and contains especially important essays on the concepts of “nondecisions” and “the corridor of power”. Edward C. Hayes, Power Structure and Urban Policy: Who Rules in Oakland? (New York: McGraw-Hill, 1972), and Chester Hartman, Yerba Buena: Land Grab and Community Resistance in San Francisco (San Francisco: Glide Publications, 1974), both relate studies of power to the politics of urban development, a topic also covered in J.H. Mollenkopf, “The Post War Politics of Urban Development”, Politics and Society, 5 (1975-6), 247-96. A “revisionist theory” of community politics is developed in Clarence N. Stone, Economic Growth and Neighborhood Discontent: System Bias in the Urban Renewal Program of Atlanta (Chapel Hill: North Carolina UP, 1976), a study which is made all the more interesting by being set in the same city as Hunter’s original work.
The study of community power has always been concerned with the responsiveness of governments to different groups in society. In recent years such studies have increasingly been concerned with the efforts of the poor and racial minorities to influence urban policy. Saul Alinsky was an early advocate of organizing the poor, and two books, John H. Fish, Black Power/White Control (Princeton, N.J.: Princeton UP, 1973) and Robert Bailey, Jr., Radicals in Urban Politics: The Alinsky Approach (Chicago: Chicago UP, 1974), give an excellent insight into his work. Norman I. and Susan Fainstein, Urban Political Movements: The Search for Power by Minority Groups in American Cities (Englewood Cliffs, N.J.: Prentice Hall, 1974), and Charles H. Levine, Racial Conflict and the American Mayor (Lexington, Mass.: Lexington Books, 1974), are concerned with the role of minorities in urban politics. Joel D. Aberbach and Jack L. Walker, in Race in the City: Political Trust and Public Policy in the New Urban System (Boston: Little, Brown, 1973), contend that a general lack of public confidence in the fairness of the governmental process undermines any policy directed against racial discrimination. A major theoretical work on racial groups in cities is J. David Greenstone and Paul E. Peterson, Race and Authority in Urban Politics (New York: Russell Sage Foundation, 1973). In “The Future of Community Control”, American Political Science Review, 70 (1976), 905-23, the Fainsteins investigate the possibility that community control is a moderate and co-optative strategy which undermines efforts to gain larger redistributive changes.
The connections between race and poverty are thoroughly examined in Raymond S. Franklin and Solomon Resnick, The Political Economy of Racism (New York: Holt, Rinehart and Winston, 1973), an impressive analysis which is well supplemented by Charles Sackrey, The Political Economy of Urban Poverty (New York: W.W. Norton, 1973), and Bradley Schiller, The Economics of Poverty and Discrimination (Englewood Cliffs, N.J.: Prentice Hall, 1973). In Blaming the Victim (New York: Vintage Books, 1971), William Ryan attacks prevailing analyses of poverty and directions of anti-poverty policy, as do the various contributors to Pamela Roby, ed., The Poverty Establishment (Englewood Cliffs, N.J.: Prentice Hall, 1974).
The other political and social problems concentrated in cities each deserve a separate and lengthy bibliography, but good introductions to a variety of topics are now available in the form of edited collections of essays. Willis D. Hawley et al., Theoretical Perspectives on Urban Politics (Englewood Cliffs, N.J.: Prentice Hall, 1976), contains some particularly thought-provoking essays, and David M. Gordon, ed., Problems in Political Economy: An Urban Perspective, 2nd edn. (Lexington, Mass.: D.C. Heath, 1977), provides a very good collection from a wide variety of sources, with valuable commentaries by the editor. Together with William Gorham and Nathan Glazer, eds., The Urban Predicament (1976)[24] and William Tabb and Larry Sawers, eds., Marxism in the Metropolis (1978),[37] these collections cover such topics as urban transportation, housing, education, crime, development policies, finance, employment, health, bureaucracy and government structure. A conservative analysis of urban problems is given by Edward Banfield in The Unheavenly City Revisited (Boston: Little, Brown, 1974), a revision of his The Unheavenly City: The Nature and Future of Our Urban Crisis (1970), which had helped prompt Norton E. Long’s The Unwalled City (New York: Basic Books, 1972) and Douglas Yates’ The Ungovernable City (Cambridge, Mass.: MIT Press, 1977).
Among the books on individual topics the following are very worthwhile: Samuel Bowles and Herbert Gintis, Schooling in Capitalist America (London: Routledge and Kegan Paul, 1976); Chester W. Hartman, Housing and Social Policy (Englewood Cliffs, N.J.: Prentice Hall, 1975); K.H. Schaeffer and Elliott Sclar, Access for All (New York: Penguin Books, 1975) on transportation policy; Kenneth J. Neubeck, Corporate Response to the Urban Crisis (Lexington, Mass.: Lexington Books, 1974) and Neil Chamberlain, The Limits of Corporate Responsibility (New York: Basic Books, 1973); Charles H. Levine, ed., Managing Human Resources: A Challenge to Urban Governments (Beverly Hills: Sage Publications, 1977); Herbert Jacob, Urban Justice: Law and Order in American Cities (Englewood Cliffs, N.J.: Prentice Hall, 1973).
Recent trends in population shifts and changing intergovernmental relations are to some extent connected, and some of the literature on suburbanization, urban-federal relations, and the growth of the Sunbelt deals with these trends. The move to the suburbs is perhaps best covered. There is a good collection of essays in Louis K. Masotti and J.K. Hadden, eds., The Urbanization of the Suburbs (Beverly Hills: Sage Publications, 1973). Bennett Berger’s Working Class Suburb (Berkeley: California UP, 1968) is a case study of blue-collar suburbanization. The political autonomy of suburbs, and its use to govern land use and especially to exclude minorities and the poor, is examined in Michael Danielson’s The Politics of Exclusion (1976).[41] The position of those members of one minority group who are in the suburbs is the subject of Black Suburbanization (Cambridge, Mass.: Ballinger, 1976) by Harold M. Rose.
Wider intergovernmental relations are the subject of Mark Gelfand’s A Nation of Cities, 1933-1965 (1975)[56] and Roscoe C. Martin, The Cities and the Federal System (1965; rept., New York: Arno Press, 1978), while more recent developments in federal-local relations are covered in David Caputo and Richard Cole, Urban Politics and Decentralization: General Revenue Sharing (Lexington, Mass.: Lexington Books, 1974). The interest in suburban-central city relationships has perhaps led to a neglect of post-war population movements towards the South and West, but two good collections are those edited by George Sternlieb and James W. Hughes, Post-Industrial America: Metropolitan Decline and Inter-Regional Job Shifts (New Brunswick, N.J.: Center for Urban Policy Research, 1975), and by David Perry and Alfred Watkins, The Rise of the Sunbelt Cities (1977).[59]
Statistical evidence on past, present and possible future trends is provided in an extraordinary number of publications by the United States government, especially the Bureau of the Census. Apart from the annual Statistical Abstract of the United States there are publications dealing more particularly with racial and ethnic groups, regions, and various levels of government. The United States Government Printing Office can supply a comprehensive Directory of Federal Statistics for Local Areas: A Guide to Sources (Washington, D.C., 1978) which is an invaluable guide to available statistics.
7: Notes
- New York Times, 28 Aug. 1977, pp.1, 41.
- Saturday Review, Aug. 1978, pp.16-21.
- U.S. Department of Housing and Urban Development, The President’s Urban and Regional Policy Group Report: A New Partnership to Conserve America’s Communities: A National Urban Policy (Washington, D.C., 1978).
- There is some imprecision in these categories because SMSAs are defined by county boundaries, and they therefore sometimes include rural areas beyond what might reasonably be considered suburban. As a result, SMSAs frequently exaggerate the extent of urban areas, especially in states like Nevada and Arizona and in Southern California, where the counties are very large, as the back cover of this pamphlet shows. However, most available statistics relating to cities are based on these census definitions. For a more detailed discussion, see U.S. Bureau of the Census, Statistical Abstract of the United States, 1979, 100th Edition (Washington, D.C., 1979), pp.935-37.
- John R. Borchert, “American Metropolitan Evolution”, Geographical Review, 57 (1967), 301-32.
- Commission on the Future of the South, Report (Research Triangle Park, N.C.: Southern Growth Policies Board, 1974), esp. pp.20-31.
- Harold M. Baron, The Demand for Black Labor (Somerville, Mass.: New England Free Press, 1971); Alan Brown and Egon Neuberger, eds., Internal Migration (New York: Academic Press, 1977), 147-82; Niles M. Hansen, Rural Poverty and the Urban Crisis (Bloomington: Indiana UP, 1970).
- L.H. Long, “How the Racial Composition of Cities Changes”, Land Economics, 51 (1975), 258-67.
- Statistical Abstract, 1979, pp.17, 33. The percentage distribution of the Spanish population is for families, not individuals. Persons of Spanish origin may be of either race and are therefore included in the previous two columns. Figures on the table may not add to the total because of rounding.
- National Advisory Commission on Civil Disorders, Report (New York: Bantam, 1968), pp.143-50.
- Statistical Abstract, 1979, pp.462-63.
- Ibid., p.462.
- Statistical Abstract of the United States, 1977, 98th ed., p.453; Statistical Abstract, 1979, pp.448, 461.
- James N. Morgan and Greg J. Duncan, eds., Five Thousand Families—Patterns of Economic Progress, 7 vols. (Ann Arbor: Michigan Institute for Social Research, 1979), vol.6, ch.8.
- U.S. Bureau of the Census, Social and Economic Characteristics of the Metropolitan and Nonmetropolitan Population: 1977 and 1970 (Washington, D.C., 1978), pp.64-71.
- There are many people, especially blacks, who because of disability or repeated discouragement have ceased to look for work and so do not register as unemployed. The official unemployment figures therefore do not include these people, who have withdrawn from the potential labour force.
- Social and Economic Characteristics, p.15.
- Joseph T. Howell, Hard Living on Clay Street: Portraits of Blue Collar Families (Garden City, N.Y.: Anchor Books, 1973). Other studies which use personal reportage of city life are: William Whyte, Street Corner Society (Chicago: Chicago UP, 1952); Elliott Liebow, Tally’s Corner (London: Routledge and Kegan Paul, 1967); Gerald Suttles, The Social Order of the Slum (Chicago: Chicago UP, 1968); Todd Gitlin and Nanci Hollander, Uptown: Poor Whites in Chicago (New York: Harper and Row, 1970); Studs Terkel, Working (New York: Avon Books, 1975).
- Quoted in Marian Lief Palley and Howard A. Palley, Urban America and Public Policies (Lexington, Mass.: D.C. Heath, 1977), p.192.
- Statistical Abstract, 1979, p.345.
- Gary A. Tobin, ed., The Changing Structure of the City (Beverly Hills: Sage, 1979), pp.233-40.
- Journal of the American Planning Association, 45 (1979), special issue on Gentrification.
- Tobin, pp.157-75.
- William Gorham and Nathan Glazer, eds., The Urban Predicament (Washington, D.C.: The Urban Institute, 1976), pp.231-80.
- New York Times, 28 Aug. 1977, pp.1, 34, 41.
- Statistical Abstract, 1979, pp.180-81.
- Clarence N. Stone, Robert K. Whelan and William J. Murin, Urban Policy and Politics in a Bureaucratic Age (Englewood Cliffs, N.J.: Prentice-Hall, 1979), p.297.
- U.S. Advisory Commission on Intergovernmental Relations, Improving Urban America: A Challenge to Federalism (Washington, D.C., 1976), and idem, City Financial Emergencies: The Intergovernmental Dimension (Washington, D.C., 1973).
- Daniel R. Grant and Herman C. Nixon, State and Local Government in America, 3rd ed. (Boston: Allyn and Bacon, 1975), p.396.
- R.L. Lineberry and E.P. Fowler, “Reformism and Public Policy in American Cities”, American Political Science Review, 61 (1967), 701-16; T.R. Dye and J.A. Garcia, “Structure, Function and Policy in American Cities”, Urban Affairs Quarterly, 14 (1978), 103-22; Demetrios Caraley, City Governments and Urban Problems (Englewood Cliffs, N.J.: Prentice-Hall, 1977), pp.3-7.
- Herbert Kohler, Economics and Urban Problems (Lexington, Mass.: D.C. Heath, 1973), p.420.
- Arnold J. Meltsner, The Politics of City Revenue (Berkeley: California UP, 1971); see also Jeffrey L. Pressman, Federal Programs and City Politics: The Dynamics of the Aid Process in Oakland (Berkeley: California UP, 1975).
- James M. Maxwell and J. Richard Aronson, Financing State and Local Governments, 3rd ed. (Washington, D.C.: Brookings Institution, 1977), pp.11, 196.
- Statistical Abstract, 1979, pp.307, 310.
- ACIR, Improving Urban America, p.63.
- U.S. Congressional Budget Office, New York City’s Fiscal Problem (Washington, D.C., 1975), pp.11-13.
- International Herald Tribune, 2 May 1979. See also William K. Tabb and Larry Sawers, eds., Marxism in the Metropolis (New York: Oxford UP, 1978), pp.241-66.
- ACIR, Improving Urban America, pp.143, 145.
- Ibid., p.147.
- Ibid., p.150.
- Kohler, Economics and Urban Problems, p.287; see also Michael N. Danielson, The Politics of Exclusion (New York: Columbia UP, 1976).
- Howard W. Hallman, Small and Large Together: Governing the Metropolis (Beverly Hills, Cal.: Sage, 1977).
- G. Sternlieb and J.W. Hughes, “New Regional and Metropolitan Realities of America”, Journal of the American Institute of Planners, 43 (1977), 227-41; Social and Economic Characteristics, pp.50-53.
- Edward C. Banfield, ed., Urban Government (New York: Free Press, 1969), p.154.
- Hallman, pp.81 ff.; for discussion of an alternative to consolidation, see A. Reschovsky and E. Knoff, “Tax Base Sharing: An Assessment of the Minnesota Experience”, Journal of the American Institute of Planners, 43 (1977), 361-70.
- Hallman, p.34.
- See C.V. Hamilton, “Blacks and Electoral Politics”, Social Policy (May/June 1978), 21-27; T.P. Murphy, “Race-Based Accounting: Assigning the Costs and Benefits of a Racially Motivated Annexation”, Urban Affairs Quarterly, 14 (1978), 169-94.
- Parris N. Glendening and Mavis Mann Reeves, Pragmatic Federalism (Pacific Palisades, Cal.: Palisades Publishers, 1977), pp.247-49.
- K. Newton, “American Urban Politics: Social Class, Political Structure and Public Goods”, Urban Affairs Quarterly, 11 (1975), 241-64.
- Sternlieb and Hughes, p.231; Tabb and Sawers, eds., Marxism in the Metropolis, p.247.
- See n. 3 above.
- Robert P. Inman et al., Financing the New Federalism: Revenue Sharing, Conditional Grants, and Taxation (Baltimore: Johns Hopkins UP, 1975).
- Economics Department, The First National Bank of Boston and Touche Ross and Co., Urban Fiscal Stress: A Comparative Analysis of Sixty-Six U.S. Cities (New York: Touche Ross and Co., 1979).
- Caraley, City Governments and Urban Problems, p.138.
- For example, see the analysis of urban renewal given in Martin Anderson, The Federal Bulldozer (Cambridge, Mass.: MIT Press, 1964).
- B. Checkoway, “Large Builders, Federal Housing Programs, and Postwar Suburbanization”, International Journal of Urban and Regional Research, 4 (1980), 21-45; Mark I. Gelfand, A Nation of Cities: The Federal Government and Urban America (New York: Oxford UP, 1975); Thomas P. Murphy and John Rehfuss, Urban Politics in the Suburban Era (Homewood, Ill.: Dorsey Press, 1976), pp.7-27.
- E. Blaine Liner and Lawrence K. Lynch, eds., The Economics of Southern Growth (Research Triangle Park, N.C.: Southern Growth Policies Board, 1977), pp.131-73; International Herald Tribune, 19 Mar. 1979, p.16.
- For example, see the essays in Liner and Lynch, eds.
- David C. Perry and Alfred J. Watkins, eds., The Rise of the Sunbelt Cities (Beverly Hills, Cal.: Sage, 1977), p.50. See also W.P. Beaton and J.L. Cox with R.M. Morris, “Toward an Accidental National Urbanization Policy”, Journal of the American Institute of Planners, 43 (1977), 54-61.
- D. Caraley, “Congressional Politics and Urban Aid”, Political Science Quarterly, 91 (1976), 19-45; and “Congressional Politics and Urban Aid: A 1978 Postscript”, ibid., 93 (1978), 411-19.
- Perry and Watkins, eds., p.296. The subemployed include those registered as unemployed, those unemployed people who have stopped looking for work and are not registered as unemployed, part-time workers who would like full-time jobs, and workers who earn substandard wages and live below or near the poverty line.
- Richard P. Appelbaum et al., The Effects of Urban Growth: A Population Impact Analysis (New York: Praeger, 1976), pp.314-15.
POSTSCRIPT
Following the 1980 census, seventeen House districts (and Presidential Electoral College votes) have shifted from the Northeast and Midwest to states in the South and West – New York suffering the largest loss (5 seats) while the biggest gains went to Texas (3) and Florida (4). This follows a similar regional shift of eighteen seats between 1950 and 1970, and population projections suggest a further fifteen seats might move the same way in the next two decades (cf. p.37). As national political power has shifted a little further to the Sunbelt, so one Sunbelt state has moved a little more into mainstream politics: Arizona finally joined all the other states participating in the Medicaid programme on 1st October 1982, seventeen years after the programme began, having developed a competitive health scheme approved by the Reagan administration. Even in this very conservative state, the financial burden on county governments of providing indigent health care proved too much to hold out further against ‘socialized’ medicine (cf. p.14). Regional urban and industrial change is examined in a most interesting and thoughtful way in two recent books: Carl Abbott, The New Urban America: Growth and Politics in Sunbelt Cities (Chapel Hill: North Carolina UP, 1981), and B. Bluestone & B. Harrison, The Deindustrialization of America (New York: Basic, 1982).
John D. Lees, The President and the Supreme Court: New Deal to Watergate
BAAS Pamphlet No. 3 (First Published 1980)
ISBN: 0 9504601 3 3
- The Constitutional Legacy
- The New Deal and the Court, 1933-1938
- Foreign Affairs and National Emergencies, 1936-1952
- The Warren Court, Civil Rights, and the Nixon Response
- The Court and the Revolutionary Presidency
- Epilogue
- Guide to Further Reading
- Notes
British Association for American Studies. All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: The Constitutional Legacy
Every four years on Inauguration Day a newly elected President accepts the burdens and duties of the office. The ceremony is presided over by the Chief Justice of the United States, who requires the President to take the oath or affirmation set out in the Constitution:
I do solemnly swear (or affirm) that I will faithfully execute the office of President of the United States, and will to the best of my ability, preserve, protect and defend the Constitution of the United States.
The President, of course, is taking up probably the most powerful elected political office in history – and the most frustrating, impossible and man-killing job.[1] The Chief Justice provides a marked contrast. Unlike the new President, his name and face will often be unfamiliar to a majority of the many millions who nowadays watch an inauguration ceremony, yet his role in the ceremony and in the working of the political system may be at least as important as that of any President.
At the Inauguration the Chief Justice is symbolically the more significant figure, for it is he who invests the new chief executive with legitimacy and constitutional authority. In his routine capacity, moreover, he can dictate the terms upon which the President exercises his power – as did the Chief Justice pictured on the front cover of the pamphlet at the 1973 Inauguration. Such authority derives from the fact that he presides over the Supreme Court of the United States – a body of nine appointed Justices who have been characterized in turn as guardians of the Constitution, and as undemocratic and irresponsible old men.
The role of the Supreme Court in American government is both less visible and less understood than that of the President. First and foremost, it is a court of law and the highest court in the American judicial system. When we think of judges, we normally think of persons who are impartial, neutral and non-political, and indeed this aura of impartiality is projected in many of the formal activities of the Justices of the Supreme Court. They wear long black robes, their proceedings are conducted with formality, dignity and decorum, and meetings to discuss pending cases are conducted in secret. Nevertheless the Supreme Court is, in comparison with other judicial bodies throughout the world, a powerful political institution. From the beginning it has been involved in the policy-making processes of American government. As a court of law operating under a written constitution its impact is judicial, but in performing this role it may make decisions which have direct implications for public policy, for example by invalidating a decision of a President. To this extent its influence on public policy is different from that of the Congress or the President, and the nature of its authority makes it a difficult institution to comprehend. It is the intention here to aid such understanding by analysing its relationship with the Presidency in the years 1933 to 1974.
The rationale for concentrating on this period is a simple one. The New Deal and the Presidency of Franklin Roosevelt produced a fundamental change in the American political process, affecting the national government as a whole and the role of the President in particular. World War II and the realities of the Cold War furthered the concentration of authority and responsibility in the executive, and provided the foundation for what Arthur M. Schlesinger, Jr., has described as “the imperial Presidency”.[2] Watergate, and the attempted cover-up, produced a political crisis which again affected the Presidency – and the future role of the chief executive – in ways which are still difficult to assess with precision.[3] At critical moments in these developments Supreme Court decisions were of great significance, and they raised crucial issues regarding the scope and the limits of executive and judicial authority.
In order to appreciate the importance of Court decisions in the development of the modern Presidency since 1933, it is necessary to understand the constitutional role of these two branches of government. The Supreme Court and the President operate within a system where powers and responsibilities are in practice shared by separate and distinct institutions. The Constitution groups national governmental functions under three general powers – legislative, executive and judicial – and provides for a co-ordinate status between the three institutions exercising these powers. Built into this formal separation of institutions is an elaborate pattern of checks and balances designed to make the activities of each partially but not wholly dependent on the actions and decisions of the others. Thus, for example, the nomination and appointment of Supreme Court Justices is initiated by the President, but the Senate must concur with the nomination. Justices are appointed for life, while the elected term of the President is fixed at four years with the possibility of reelection. Both may be removed from office if impeached by the House of Representatives for “treason, bribery, or other high crimes and misdemeanors”, and found guilty by the Senate. If the President is impeached, the Chief Justice presides over the trial in the Senate.
What is striking in any examination of the rise and apparent demise of “the imperial Presidency” is how little of the authority claimed or assumed by recent Presidents was granted formally by the Constitution – as the centre page of the pamphlet reveals. This was a consequence of the suspicion of one-man executives inherited from the Revolution, and a practical recognition by the framers of the Constitution that the President must be given few specific powers in the Constitution itself if ratification of the document was to be obtained. Because the President was granted such a modest catalogue of powers, much executive authority has simply been assumed, or has been delegated by statute or by congressional resolution. More importantly for our purposes, it has required legitimization by decisions of the Supreme Court. To this extent presidential authority has often rested on shakier constitutional foundations than the more carefully defined powers of Congress. Presidents have been helped by the fact that they can initiate nominations to the Supreme Court, but this by no means guarantees that they can control in any systematic way the decisions of the Court. Hence analysis of the political interactions between the President and the Supreme Court provides an excellent illustration of how the Constitution, and the system of government which has evolved, operate in practice.
At the outset it was not clear what would be the precise role of the Supreme Court. However, there are strong grounds for suggesting that the Court was intended to be a check on the legislative branch, which the framers feared might pass laws popular with current majorities of the people but contrary to the intent of the Constitution. Robert Scigliano, in his study of relations between the Supreme Court and the Presidency,[4] argues that these two institutions were intended to act, in certain circumstances, as an informal alliance or check on the legislative authority of Congress, and have in fact done so. However valid this interpretation may be of the constitutional basis of the respective authority of these institutions, events have served to make the actual relationships between the three branches of the national government very different. While Congress has generally not fared as well as the President with respect to Supreme Court decisions, on occasions both the President and Congress have been allied against the Supreme Court. Relations between the President and the Supreme Court have not always been harmonious, as can be seen by brief reference to specific events before 1933.
A federal court structure was created by the Judiciary Act of 1789, but there was no conflict between a President and the Court until the appointment of John Marshall as Chief Justice. The Federalists, heavily defeated in the 1800 elections, retreated into the judiciary. Marshall, formerly Secretary of State under President John Adams, became Chief Justice, and the lame-duck Federalist Congress created new federal judgeships to which Adams appointed worthy Federalists. The Federalists sought to ‘pack’ the judiciary as a possible check on the likely excesses of the new administration of Thomas Jefferson. Many Jeffersonians, however, believed that the judiciary had no right to control the popular will, which, as the source of the Constitution’s authority, could by definition do nothing unconstitutional.
When the Jefferson administration refused to deliver a commission to William Marbury, a last-minute judicial appointee of Adams, the Supreme Court, acting on a petition by Marbury and others, issued a ‘show cause’ order to Secretary of State Madison regarding delivery of the signed commissions. Madison ignored the order, and the Court was faced with a direct challenge to its authority. If Marshall now decided that Marbury should be given his commission, the Jefferson administration would again ignore the Court, and there was no way in which the Court could enforce its decision.
Marshall, however, proved equal to the challenge. Speaking for the Court, he affirmed that Marbury had a legal right to the commission but that the Supreme Court could not issue a writ of mandamus to federal officials, as Marbury requested, because Section 13 of the 1789 Judiciary Act granting the Court this authority under original jurisdiction was unconstitutional. The Constitution clearly prescribed the original jurisdiction of the Supreme Court, Congress could not alter that jurisdiction, and its attempt to do so violated the Constitution.[5] As a legal opinion the decision was narrow and strained in its logic. However, for Marshall it achieved several purposes. It allowed him to lecture Madison and Jefferson on their constitutional obligations as executive officials. By deciding against Marbury he avoided the possibility of any defiance of a Court decision. Most significantly, in declaring part of a statute void because it gave the Court authority contrary to the Constitution, he established the principle that the Court could declare void an act of Congress, without risking charges that the Court was engaged in any self-aggrandizement of authority. By so acting he established implicitly, if not explicitly, that the Constitution incorporated the principle of judicial review – the power of the federal judiciary, especially the Supreme Court, to judge on the constitutionality of actions of the coordinate branches of the federal government, President as well as Congress, and if necessary to declare such actions unconstitutional.[6]
As the decision on the immediate issue did not seriously embarrass Jefferson, Marshall avoided provoking a direct attack on the Court. However, the President soon showed his willingness to use impeachment as a means of controlling the judiciary. In a test case concerning the injudicious behaviour of Justice Chase, the Jeffersonian majority in the House initiated impeachment proceedings, but in the Senate the administration failed to secure the two-thirds majority necessary for conviction. Further conflict ensued during the treason trial of Aaron Burr, when Marshall subpoenaed Jefferson to appear before the Court with certain documents. Jefferson refused to obey the order, arguing that the independence of the executive would be jeopardized were the President to comply. In the trial itself Marshall excluded much of the government’s testimony as irrelevant, and the jury found Burr not guilty. Though Jefferson contemplated the possible impeachment of Marshall at this time, he did not succeed in limiting the independence of the Supreme Court. In Scigliano’s opinion, Jefferson, at least as President, “never openly acted or spoke against the judicial branch, or any decisions rendered by it, or openly expressed his view of the role of the judiciary in the system of separation of powers.”[7]
Subsequently Marshall, who presided over the Court until 1835, made important assertions of national governmental authority in a series of decisions after 1815, and so affirmed the position and the duty of the Court as the final interpreter of the Constitution. Up to the Civil War, conflicts between Presidents and the Court occurred primarily over different interpretations of the role of the national government and of the judiciary within the national government, with Presidents such as Jackson often reflecting a states-rights or limited national government position.
In the decades after the Civil War the legitimacy and authority of the Court seemed more firmly established than that of the President. Presidents were on the whole weak and did not associate themselves with clearcut programmes of legislation. Hence they raised little objection when the Court began to strike down state and federal legislation deemed to infringe constitutional rights. Where Marshall had used judicial authority primarily to give substance to the constitutional powers of the national government, the Court now demonstrated that judicial review could be the vehicle for judicial “supremacy” of a different kind. Insisting on a narrow interpretation of the authority of the elected branches of government with respect to many social and economic matters, it did not hesitate to declare unconstitutional specific state and national laws.
By the early twentieth century, however, more energetic Presidents were coming to power who were sympathetic to social reforms and the regulation of business, and they began to express concern at the potential threat of judicial challenges to their policies. Theodore Roosevelt, while in office, confined himself to vigorous verbal questioning of the validity of the judiciary’s claim to have the final say regarding constitutional interpretation, adding that appointed judges were incapable of formulating relevant legal precepts to meet contemporary problems. Woodrow Wilson also expressed concern about judicial “supremacy”, but he in turn did not challenge directly the Court’s authority. He did, however, come to recognize the potential significance of the President’s power to initiate nominations to the Court. So indeed did conservatives like the former President, William Howard Taft, who by 1920 believed that control of judicial appointments was one of the major issues of the day. The Republican ascendancy of the following decade, however, ensured that Presidents would continue to respect, and by their judicial nominations reinforce, the conservative outlook of the Court on social and economic questions.
The use of judicial review by the Court in the Gilded Age and in the first third of the twentieth century was accepted in part because of public respect for its authority, but also because it was not out of step with dominant public attitudes of the time.[8] Furthermore, the Court did demonstrate that it could use its authority to expand the existing constitutional powers of the President in certain circumstances. For example, in the Neagle case of 1890[9] the Court held that the Constitution’s provision that the President shall “take care that the laws be faithfully executed” conferred an authority to protect all those who aid in the performance of federal governmental responsibilities. The Court found, in a clause that seemed to impose only a duty on the President, a grant of power to prevent violations of the peace of the United States. In 1895 this was extended when the Court ruled, in the Debs case,[10] that the President might use military force within a state to ensure that national instrumentalities such as the postal service could function properly. Similarly, in 1926 in the Myers case[11] the Court strengthened presidential control of executive-branch personnel by asserting that a President may remove a subordinate officer without seeking the consent of the Senate.
Hence up to 1933 Presidents rarely challenged judicial authority directly, and no clash seemed likely unless a popular President, willing to use his authority, was determined to implement reforms and changes of the sort the Supreme Court might consider to be unconstitutional. The conjunction of a grave political and economic emergency, an assertive and popular President, and a strong-willed judiciary, provided in 1933 the ingredients for a constitutional conflict of major dimensions.
2: The New Deal and the Court, 1933-1938
The stock-market collapse of October 1929 precipitated the most serious economic depression in American history. In the face of this crisis, President Hoover was reluctant to recognize the need for major institutional changes or strong national governmental action.[12] Consequently the severity of the depression made economic issues dominant in the 1932 presidential election. The major challenge was to find policies which might restore economic stability and public confidence, alleviate economic inequalities, and avoid future depressions. Franklin D. Roosevelt gained the Democratic nomination, and in a series of campaign speeches captured the imagination and votes of many Americans by his powerful assertion of the need for bold experimentation and a “new deal” by government – which could be achieved only by strong executive leadership of the type evident before under Woodrow Wilson.
Elected to the Presidency along with large Democratic majorities in both houses of Congress, Roosevelt was given a mandate to use appropriate national authority to deal with the domestic crisis. In his inaugural address Roosevelt made it clear that he believed in a flexible interpretation of the Constitution and in the legitimacy of federal legislation to deal with the existing emergency. He promised to seek “broad Executive authority”, and immediately sent a series of bills to Congress. In the famous Hundred Days, Congress passed an unprecedented amount of new legislation. The emergency programme initiated by Roosevelt constituted a far-reaching assertion of federal governmental authority over economic affairs and individuals alike, generally justified in terms of the general-welfare or interstate-commerce clauses of the Constitution. Some of the legislation, moreover, contained substantial delegations of legislative authority to the executive. Those interests adversely affected by the new legislation speedily brought litigation, and it was evident that the attitude of the Supreme Court towards these measures would be of great importance.
The record of the Court suggested that it would not be sympathetic to the new initiatives. Through the 1920s the Court had been dominated by William Howard Taft, whom President Harding appointed Chief Justice in 1921. Under him the Court had declared unconstitutional state and federal laws relating to matters such as a minimum wage or the employment of children. Taft retired in 1930, to be succeeded by the more moderate Charles Evans Hughes, but the personnel and outlook of the Court in 1933 had not changed greatly since the mid-twenties.
However, it was clear that the members of the Court would not find it easy to express a unanimous view on New Deal legislation. The most critical cases involved constitutional issues where Justices might produce equally relevant precedents to support either side of the argument. Four Justices (Van Devanter, Butler, McReynolds and Sutherland) had long-standing records of opposition to major extensions of federal governmental authority, especially with respect to the regulation of interstate commerce. Two Justices, Brandeis and Stone, seemed likely to be sympathetic to the extension of federal regulatory powers, though not necessarily of executive authority, and to the need for judicial restraint in order to permit the elected branches of government some flexibility in dealing with a crisis. Cardozo had been nominated somewhat reluctantly by Hoover in 1932 to meet the demands of a Democratic Senate, and he was expected to support Brandeis and Stone. Roberts had joined the Court in 1930, after Hoover’s initial selection had been rejected by the Senate. He appealed to both conservatives and liberals in the Senate, and together with Hughes came to play a crucial role in the events that followed. The stage was set for a possible constitutional confrontation between the Court and the elected branches of government led by the President, but also for a constitutional debate within the Court itself, as Chief Justice Hughes unhappily recognized.
The Supreme Court did not consider a case involving a New Deal statute until January 1935, though in 1934 the validity of two state laws was sustained, with a majority of the Justices accepting that controls might be necessary in emergencies if they were in the public interest. In the sixteen months that followed the Court considered ten cases, or groups of cases, involving the New Deal statutes. In eight instances the decisions went against Roosevelt and the New Deal agencies. Only two measures, one involving the Tennessee Valley Authority Act, were given guarded approval. Several of the invalidations were by unanimous or near-unanimous decisions, but some of the major ones were by the narrowest possible majority (5-4).
The first blow came in January 1935, in Panama Refining Co. v. Ryan.[13] The case concerned Section 9(c) of the National Industrial Recovery Act. This gave the President the authority to prohibit the transportation across state lines of oil produced in excess of limitations imposed by states in order to conserve resources and stabilize prices. There were precedents for such federal co-operation in the enforcement of state laws, but Hughes, speaking for eight of the nine Justices, declared that Section 9(c) was unconstitutional. He stated that it constituted an invalid delegation of legislative power because it set no guidelines or standards for executive action, nor any restrictions on executive discretion. Hence any executive orders issued under the authority of Section 9(c) were without constitutional authority. The Court thus, for the first time, held unconstitutional a provision in a statute delegating to the executive quasi-legislative authority.
In May 1935 the Court went further and in a unanimous opinion held the rest of the National Industrial Recovery Act to be unconstitutional. The Act had authorized a major industrial recovery programme co-ordinated by a National Recovery Administration. Through this the Roosevelt administration hoped to encourage the resumption of normal production, plus increased employment and wages. Codes of fair competition were set up which included certain provisions relating to the rights of workers. The Act had some initial beneficial effects, but by 1935 the NRA was in some disarray. The Schechter (or ‘sick chicken’) case[14] involved an appeal by slaughterhouse operators against a conviction for violation of the code of fair competition for the live-poultry industry in New York City. In his opinion Hughes considered three questions: was the law justified given the grave national crisis, did the law illegally delegate legislative power, and did it exceed the limits of the interstate commerce power? He answered the first question by asserting that extraordinary conditions did not “create or enlarge constitutional power”, and assertions of extra-constitutional authority were precluded by the Tenth Amendment. To the second his answer was that the codes constituted a form of delegation utterly inconsistent with the constitutional duties of Congress. Finally, he affirmed that the poultry code attempted to regulate transactions within states and so exceeded the federal commerce power.
On the same day, in deciding the case of Humphrey’s Executor v. United States,[15] the Court further embarrassed the President by declaring unanimously that his removal of a member of the Federal Trade Commission was invalid. The Court held that Congress had conferred upon the Commission independence of the President, and so it was not an agent of the executive. This decision modified the Myers judgement of 1926 by holding that removal authority applied only to purely executive officers in the departments. Roosevelt was incensed by these decisions, seeing them as direct challenges to both his policies and his executive prerogatives. At a press conference he attacked the actions of the Court, but his dilemma was that here at least he was speaking out against a unanimous Court. His administration sought further legislation to regulate industry, and quietly began to consider ways of mounting a counter-attack against the Court.
On January 6th, 1936, the Supreme Court made its long-awaited decision on the second of the major New Deal recovery programmes. In United States v. Butler[16] the Court, dividing 6-3, ruled unconstitutional the processing tax of the Agricultural Adjustment Act. This decision revealed serious divisions within the Court with respect to certain constitutional issues raised by the New Deal legislation. Justice Roberts held the tax to be an illegitimate use of the taxing power. While Congress could tax and spend for the general welfare, it could not use the bait of tax revenue to effect federal regulation of economic activity such as agricultural production. Two other aspects of the case were significant. The first was the enunciation by Roberts of the role of the Justices in determining the constitutionality of legislation. His narrow, mechanistic approach and its consequences were challenged in a strong dissenting opinion by Justice Stone, supported by Brandeis and Cardozo, which attacked not only the logic of Roberts’ opinion but also the tendency of the Court to act as a super-legislature via tortured constructions of the Constitution. Stone’s advocacy of judicial self-restraint, together with strong public criticism of the Butler decision, confirmed the views of some members of the Roosevelt administration that the real problem was not the Constitution but the composition of the Court. Hence a few Roosevelt appointments, now or if he were re-elected in 1936, could change the situation.
Moreover, there were several initiatives in Congress for legislation to restrict the Court’s prerogative of judicial review, or to change the size of the Court. Any doubts as to the need to act were removed in May 1936 when the Court, again by a 6-3 vote, invalidated the Bituminous Coal Act of 1935,[17] and followed this up with a 5-4 decision setting aside a state minimum-wage law for women. In effect, in the words of Alpheus Thomas Mason, “From January through June, 1936, the Court wove a constitutional fabric so tight as to bind political power at all levels”,[18] a view shared by Roosevelt at the time. Nevertheless, putting first things first, Roosevelt concentrated on winning the 1936 Presidential election and gaining popular approval for the concrete achievements of the New Deal and his own style of executive leadership. In the Democratic platform and in his campaign he avoided specific statements as to what he might do regarding the Court, though opponents warned that a Roosevelt victory would lead to measures attacking the composition or the jurisdiction of the Court.
In November 1936 Roosevelt gained a landslide victory, winning the electoral-college votes of every state except Maine and Vermont. Very soon the Court would reconvene, and before it would be several cases challenging the validity of legislation initiated by Roosevelt. The options open to the President were several. He could wait for vacancies to occur, yet an unprecedented four years had gone by without such an opportunity. McReynolds and Sutherland, two of his strongest opponents, were elderly, but seemed determined to remain on the Court as long as he remained in the White House. He could wait and see whether the election result might lead one or two Justices to change their views. He could support various proposals to change the Court’s power by legislation or constitutional amendment, or recommend legislation to reorganize the judiciary. On February 5th, 1937, Roosevelt ended his silence by presenting legislation to Congress. There has been much speculation as to how and why Roosevelt acted in the way he did. Most interestingly, William Leuchtenburg[19] has argued that the President came to reject the view that the Court might change voluntarily, but believed that action through a constitutional amendment was difficult to frame and achieve quickly. He was unenthusiastic about various legislative proposals to change the jurisdiction of the Court, and finally agreed with Attorney General Cummings on the need for some plan to ‘pack’ the Court to permit the appointment of Justices in tune with the times. Cummings, in searching for an appropriate recommendation, came across one made in 1913 by Justice McReynolds, then Wilson’s Attorney General! In essence McReynolds suggested that the President should be permitted to appoint additional federal judges for every judge in courts below the Supreme Court who failed to retire at the age of seventy. Cummings used this as the basis of a proposal which he put to Roosevelt relating the principles of age and additional appointments to reform of the entire judiciary. The scheme was attractive as there had been recent demands for more lower-court judges to relieve congestion, and complaints about the age of some of the Justices. Roosevelt accepted the proposal and a bill was drafted.
It was emphasized that the bill was intended to reorganize the federal judiciary rather than to ‘pack’ the Court. It provided that, whenever any federal judge who had served ten years or more failed to retire after reaching seventy, the President might appoint an additional judge to the court on which he served. No more than fifty additional judges might be appointed, and the maximum size of the Supreme Court was set at fifteen. In a message accompanying the bill, Roosevelt expressed his concern at the backlog of cases and for the efficiency of the federal courts in general. Most older judges were characterized as unable to perform their duties or antiquated in outlook.
Roosevelt had acted, but in an ultimately ineffective manner. It seemed obvious that the bill was a court-packing scheme in disguise. By failing to confront the real constitutional issue – the Court’s particular use of judicial review – the plan alienated his supporters on the Court and confused many of his congressional allies. Even Brandeis, himself eighty, joined Hughes in persuasively demolishing the charge that the Justices had a backlog of cases. Hughes wrote a letter to the Senate Judiciary Committee providing statistical evidence that no backlog existed and arguing that enlargement would impair the efficiency of the Court. This action undercut the validity of Roosevelt’s proposal. The bill itself also weakened his political position, strengthening conservative opposition and creating a powerful public sentiment that his proposal was wrong in principle. In July the Senate rejected Roosevelt’s bill, but as a gesture to the President later passed an uncontroversial Judiciary Reform Act.
Meanwhile the Court itself had removed the need for action. In a dramatic demonstration of an ability to recognize the force of public opinion and electoral realities, the Court, between March and June, validated state legislation and both the National Labour Relations and the Social Security Acts of 1935. Justice Van Devanter resigned, followed closely by Sutherland. The final victory appeared to be Roosevelt’s. Between 1937 and 1943 he nominated a new Chief Justice and eight Associate Justices, and by 1942 the Court had accepted a substantial enlargement of national governmental authority over economic matters. In fact, the real situation was more fluid.
As Herman Pritchett has shown, the Justices appointed by Roosevelt were not united in their views.[20] If anything, they edged tentatively towards a new judicial role outlined initially by Justice Stone, a Republican appointee but Roosevelt’s choice in 1941 as Chief Justice to succeed Hughes. In the otherwise obscure case of United States v. Carolene Products Co. in 1938,[21] Stone included in a footnote a practical guide for the application of the judicial self-restraint he had himself advocated in 1936 in his dissent from the Butler opinion. Put simply, his argument was that the Court should presume the constitutionality of economic legislation, deferring to the wishes of the elected branches of government, and confine the possible exercise of judicial activism to the defence of individual and minority rights as expressed in the Bill of Rights (the first ten amendments to the Constitution) and the Fourteenth Amendment.
The confrontation between Roosevelt and the Court had significant consequences for the authority of the President, but above all it concerned the nature and spirit of the Constitution. In some respects it was a replay of the clash between Jefferson and Marshall. The major antagonists in both instances were the Chief Justice and the President. In neither instance was the Court’s power formally under attack. Jefferson and Roosevelt, for different reasons, sought primarily to curb abuses of its authority but found Marshall and Hughes at least their equals in political skill. The failure of the court-packing plan and the judicial ‘retreat’ engineered by Hughes preserved the authority of the Court and meant that for a long time neither President nor Congress would try to curb the Court.
It can be argued that the conflict between Roosevelt and the Court was not over fundamental issues but about the specific use of authority by particular individuals. Roosevelt did not challenge the legitimacy of judicial review, only its particular use by the Hughes Court. In turn the Court did not challenge presidential power, but up to 1937 questioned the delegation of certain authority to the President by Congress and the specific use of executive authority by a particular President. Hence both protagonists may be said to have shared a similar concern: that no single institution become too dominant, nor any set of officeholders so extreme in the use of their authority as to distort the purpose of the separation of powers. Roosevelt and others feared that the Court would become a “super-legislature”, a group of non-elected Justices preventing the elected branches of government from implementing policies deemed to be in the national interest. The Court and its supporters feared the tyranny of the majority as reflected in a chief executive acting in certain circumstances as a virtual dictator in domestic affairs. In fact both institutions emerged scarred but essentially intact, able in turn to have a significant influence on public policy, and sufficiently powerful for charges of “executive dictatorship” and “government by judiciary” to be made again in the future.
3: Foreign Affairs and National Emergencies, 1936-1952
Relations between the President and the Supreme Court in the 1930s demonstrate that the Court was prepared to challenge any extensions of executive authority to deal with domestic ’emergencies’, whether such authority was claimed by a particular President or delegated to the executive by Congress. With respect to foreign affairs, the general attitude of the Court has been different. While the Constitution divides the power over foreign affairs between the President and the Senate, and invites a degree of conflict over the privilege of making foreign policy, the President has certain advantages. Unlike domestic affairs, foreign policy has been deemed to be inherently a national governmental responsibility and an area in which the executive has particular responsibilities and obligations to respond to international crises and, in times of war, to use his authority as commander-in-chief of the armed forces.
The political and constitutional problems raised by World War II and America’s new international position in the era of the Cold War were massive but not unprecedented. Roosevelt’s conduct in the international crisis between 1939 and 1941 was analogous constitutionally to that of Lincoln in the Civil War crisis, and provides an important illustration of the dilemma that may face the Court when a President is forced in time of war or threat to national security to assume prerogative powers hitherto considered neither necessary nor constitutional. At the time of the Civil War certain issues arose concerning the nature of war powers (Congress having the authority to declare war), and important precedents were set by the Supreme Court. Overall, the Court showed restraint. In the Prize Cases in 1863[22] a majority of the Court upheld the legality of the President’s decision to order the capture of certain neutral ships and cargoes. Their opinion stated that while the President could not initiate war, when it was begun by insurrection he must accept responsibility without waiting for legislative authority and “must determine what degree of force the crisis demands.” A minority of four Justices insisted that the basic war power belonged to Congress. The same issue was raised during World War I, but most of the critical war measures never came before the Court, and with one exception the few that did came well after the Armistice. As in the Civil War, the Court found it difficult to challenge the constitutionality of a federal war activity while the war was in progress. Moreover, war was formally declared by Congress and Wilson acted from the beginning through broad grants of authority delegated to him by Congress.
Roosevelt’s ability to take initiatives in the face of the various threats to national security between 1939 and 1941 was helped considerably by a decision of the Court in 1936,[23] written by Justice Sutherland, at that time the bête noire of the New Dealers. In 1934 Congress had passed a joint resolution permitting the President to place embargoes on the sale of arms and ammunition to warring nations. The resolution placed no restriction on the President’s discretion in establishing such embargoes. Roosevelt declared an embargo on the sale of arms and munitions to Bolivia and Paraguay, and the Curtiss-Wright Export Corporation was indicted for selling arms to Bolivia. Curtiss-Wright claimed that the embargo was an unconstitutional delegation of legislative power, an argument the Court had found persuasive in recent cases relating to New Deal economic programmes.
The Supreme Court rejected this argument. Justice Sutherland stated that the rule regarding delegation of power was a restriction on Congress in domestic affairs, but was irrelevant in foreign affairs, because here the national government had certain inherent powers. As to who should exercise these powers, Sutherland came close to holding that these rested almost exclusively with the President. Hence it was not necessary to demonstrate the validity of the joint resolution, since Roosevelt might have established an embargo on his own initiative. Sutherland’s view that the President is the “sole organ” of the federal government in foreign relations seemed to make him the sole executor of American foreign policy, even if such authority “must be exercised in subordination to the applicable provisions of the Constitution.”
This was an important concession to executive authority which Roosevelt exploited both before and during World War II, without any direct challenge from the Court.[24] While it is conceivable that Roosevelt’s continued use of executive prerogatives in foreign policy and of his powers as commander-in-chief without a formal declaration of war might have provoked a constitutional crisis, this became academic after the attack on Pearl Harbor in December 1941.
The question of the civil liberties and legal rights of citizens in times of war or international crisis has posed a more delicate issue for the Supreme Court. Lincoln’s suspension of the privilege of the writ of habeas corpus led Chief Justice Taney in Ex parte Merryman (1861)[25] to deny the right of the President so to act, arguing that only an act of Congress could effect this and concluding that if this action were permitted the people would no longer be living under “a government of laws”. Lincoln responded by arguing that the Constitution was silent as to who might suspend habeas corpus in an emergency, and continued to enforce the suspension until Congress acted two years later. Following World War I, the Court upheld the constitutionality of the 1917 Espionage Act and the 1918 Sedition Act, which empowered the executive branch to punish expressions of opinion hostile to the government.
During and immediately after World War II the Court, dominated by Roosevelt appointees, demonstrated that the guideline provided by Stone regarding judicial intervention was unlikely to be applied in times of war, or even in the uncertain era of the Cold War, to protect the rights of individual citizens or groups of citizens against national governmental actions deemed necessary to maintain national security. For example, during the war the Court sustained an executive order, based on the commander-in-chief powers and later supported by statute, authorizing the Secretary of War to prescribe certain military areas from which persons might be excluded. This led to the segregation and internment in relocation camps of many thousands of Japanese-Americans, many of them citizens. Although given three opportunities in 1943-44, the Court never directly considered the legality of this action, nor the issue of the government’s authority to restrict the rights of American citizens when required by military necessity. In the Hirabayashi case the Court upheld a military curfew order, and in the Korematsu case upheld the exclusion programme without discussing the detention aspect, though three Justices vigorously dissented. Only in Ex parte Endo did the Court indicate concern, suggesting that the authorities should not detain citizens who had demonstrated their undoubted loyalty.[26]
Furthermore, in the years that followed, the Court, led after 1946 by the Truman appointee Fred Vinson, did not challenge the dominant political attitudes of the early Cold War years regarding internal security. For example, in Dennis v. United States (1951)[27] the Court sustained the constitutionality of the 1940 Smith Act, which had forbidden conspiracies to teach or advocate the violent overthrow of the government. Technically known as the Alien Registration Act, this law had been intended as a war measure comparable to the 1918 Sedition Act, but had been used in peace-time by the Truman administration to prosecute leading Communists.
However, the most significant event in relations between the Court and the President in this period arose when a domestic crisis in industrial relations coincided with an international emergency. In 1950 President Truman committed American troops to the defence of South Korea against a North Korean invasion. Congress did not formally declare war, even though the military action undertaken on the President’s authority dragged on for three years. At the end of 1951 a dispute in the steel industry led to a threat of strike action which the executive felt might have severe repercussions on the military effort in Korea. Efforts to obtain a compromise through the Wage Stabilization Board failed, and the steelworkers’ union called for a nation-wide strike.
The President could have resorted to the Taft-Hartley Act which, among other things, permitted the President to obtain an injunction postponing for eighty days any strike threatening the national welfare. This act had been passed in 1947 by a Congress in which the Republicans had gained control in 1946 for the first time since the Hoover presidency. The legislation reflected their view that a series of major strikes after the end of the war had shown that the labour unions had become too powerful, and that there was a need to equalize the positions of labour and management vis-à-vis government. The unions, however, felt that the legislation severely restricted their activities. The President, whose good relations with organized labour had been damaged by his firm action during the strikes in 1946, seized the opportunity to veto the bill. When Congress overrode the veto, Truman used the issue to obtain labour support in his narrowly successful re-election bid in 1948. In the steel dispute in 1951 he was understandably reluctant to use the Taft-Hartley procedure, especially as he believed that the companies rather than the unions were the main obstacle to a settlement.
On the eve of the strike the President issued an executive order instructing Secretary of Commerce Sawyer to take over operation of the steel mills for the United States government. Addressing the nation, Truman declared that the country faced a serious emergency, that its security depended upon steel as a major component of defence production, and that the Taft-Hartley procedure would have meant at least a short interruption in production. The President based his authority for acting on his powers as commander-in-chief of the armed forces and on inherent powers derived from the aggregate of powers granted to the President by the Constitution. He reported the seizure to Congress in a special message in which he invited them to legislate on the subject.
The immediate response was not unfavourable, though there was some press and congressional criticism. The steel companies were angry, and responded with attempts both to influence public opinion and to bring legal action against the President. Attention was quickly diverted from the facts of the case to the high-handed decision of the President. Truman went out of his way to stress that he had acted in terms of the inherent powers of the executive under the Constitution, which obliged him, in an emergency, to take whatever action he deemed necessary to protect the national interest.
Truman’s argument did not commend itself to the district court when the steel companies filed for an injunction against the seizure, but the court of appeals accepted a government request to stay such an injunction and allow the Supreme Court to consider the case. The steel companies took the initiative, and the case was argued before the Court. Two weeks later the Court gave its ruling in Youngstown Sheet and Tube Co. v. Sawyer.[28] Truman expected his action to be upheld, and few people believed that a Court composed of five Roosevelt and four Truman appointees would call a halt to the accretion of executive power. Writings by distinguished scholars at the time also supported the view that presidential authority, especially with respect to emergency situations, had been irrevocably expanded and that the Supreme Court was unlikely to interfere with its development or with conflicts between the executive branch and Congress.[29]
Once the Court had agreed to consider the case, there were several ways in which it might have been decided. For example, the case could have been dismissed because it involved a “political question” which could be dealt with only by the elected branches of government. Instead, by a 6-3 vote the Court held that the seizure by the President was an unconstitutional usurpation of legislative power. The decision, however, was less clearcut than the figures suggest. Although a majority opinion was written by Justice Black, every other Justice in the majority insisted on writing a separate concurring opinion, and one refused to join in the Court’s opinion, concurring only in the result.[30] Seven opinions were written in all. As a result, the decision did not have the same impact as a single, agreed, majority or unanimous verdict.
The result, according to Schlesinger, was “a confusing, if intermittently dazzling, examination of the presidential claim to emergency prerogative.”[31] Justice Black addressed the issue of presidential power directly, asserting that it must originate from an act of Congress or from the Constitution. No act of Congress authorized Truman’s seizure; therefore the power had to come from the Constitution. He rejected the contention of the executive that the seizure power could be implied from the aggregate of powers granted to the President by the Constitution. It could not be justified as an application of the President’s military power as commander-in-chief, nor from his general executive powers, which do not include lawmaking. Black interpreted the separation-of-powers doctrine rigidly, a view shared by Justice Douglas. Both agreed that the seizure was a legislative power exercisable by the executive only with congressional authorization. The four other majority Justices rejected this formal position. Justice Frankfurter adopted a flexible approach, related to the specific issues. He argued that in 1947 Congress had deliberately not included seizure authority in the Taft-Hartley Act, and that in seizing the steel plants Truman had exceeded his responsibility to execute the laws faithfully. He reiterated the view of Brandeis and others that the principle of the separation of powers was adopted in 1787 in order to preclude the exercise of arbitrary power.
The concurring opinion of Justice Jackson emphasized that practical realities left a “twilight zone” where Congress and the President might act together, or where actual authority was unclear. Unilateral executive action could be justified only if it was clear that Congress could not act. Such was not the case here. Jackson dismissed the executive’s claim that all inherent powers rested with the President, attacked the use of the limited war powers granted to the President “as an instrument of domestic policy”, and rejected the use of inherent powers without statutory support. He pointed out that, in a similar crisis situation, Roosevelt had based his New Deal authority on delegated congressional, not inherent presidential, power. Thus the four Justices – Black, Douglas, Frankfurter and Jackson – who had been fervent supporters of the New Deal did not see it as a precedent for the steel seizure. Justices Burton and Clark also could not regard the emergency as grave enough to warrant Truman’s action, but did not rule out the possibility that emergencies might occur in which a President could act even if Congress had not prescribed procedures for such action. Here, however, Congress had done so.
Interestingly, Chief Justice Vinson, a close political and personal friend of President Truman, wrote a strong dissenting opinion. Supported by Justices Reed and Minton, he defended the action taken by Truman. He argued that the President had acted wholly in accordance with his obligations under the Constitution. Citing the decisions of the Court in the Neagle and Debs cases, he deemed the emergency to be of such a kind that, if the President had any constitutional authority at all to act without congressional direction, the seizure was warranted.
On its face the decision suggested that the Supreme Court had repudiated any claims for an inherent executive prerogative in internal affairs, or any expanded prerogatives in national emergencies. Yet the decision as a whole did not exclude presidential initiative in an indisputable crisis, but did draw attention to the importance of Congress in providing procedures whereby Presidents might take emergency action. If the Court did not deny totally the recourse to emergency powers, it did challenge the growing mystique of executive authority and autonomy. The decision also was greeted favourably by Congress and by public opinion, despite the fact that a steel strike was impending.[32]
In retrospect the decision seems little more than a minor hiccup in the accumulation of executive authority. It had a marginal bearing on unilateral presidential action in foreign affairs. Yet the Court, without abandoning the principle of self-restraint, made it clear that presidential actions were not immune from judicial review, even after the New Deal ‘revolution’.
4: The Warren Court, Civil Rights, and the Nixon Response
In 1952 General Eisenhower was elected President as a Republican. In domestic affairs it soon became apparent that he did not wish to provide leadership, preferring to defer to the legislative process. In September 1953, eight months after Eisenhower assumed office, Chief Justice Vinson died. Eisenhower gave careful consideration to the task of making his first Court nomination, and finally decided on Governor Earl Warren of California. There were sound practical, legal and political reasons for his choice. Warren had been influential in helping Eisenhower obtain the Republican nomination. He possessed the leadership qualities and administrative skills to be Chief Justice. He had some legal experience and an impressive political record. The nature of the latter was raised by several conservative Senators at the confirmation hearings, where he was unfairly charged with having “left-wing”, ultra-liberal views.
Almost immediately after his appointment was approved he became a controversial figure. On May 17th, 1954, speaking for a unanimous Court, Warren delivered the bombshell decision in Brown v. Board of Education of Topeka.[33] He declared that racial segregation in public education was inherently discriminatory and in violation of the “equal protection of the laws” clause of the Fourteenth Amendment. This dramatic decision, in effect, reversed the 1896 Court decision[34] which had upheld the right of state governments to distinguish between their citizens on the basis of race. The effect of that earlier decision had been to make the principle of “separate but equal” the legal rationale in the Southern states for the segregation of whites and blacks in all kinds of public and private facilities.
Warren relied heavily on sociological and psychological evidence to support the view that separate educational facilities are inherently unequal. The basic constitutional issue was whether enforced racial segregation, even if all other factors might be equal, deprived the minority group of equal educational opportunity. The nine Justices seemed agreed that it did.
Because the decision seemed certain to provoke bitter controversy and problems of implementation for the lower courts, the Court later ruled that the transition from segregation to desegregation should take place “with all deliberate speed”. Though it prompted a public furore, especially in the South, the decision was in fact the culmination of a line of policy and precedent developed since 1938 by the Court, and was made only after a series of conferences by the Justices which had begun before Warren became Chief Justice. However, if it was not a revolutionary doctrine, it was a classic example of judicial policy-making, and it marked the beginning of a new style of adjudication by the Court.[35] The Warren Court initiated a policy which Congress, under existing Senate rules and given the dominant influence of Southern Democratic Senators, would not support. It also provided the impetus for black groups who had sponsored the case to create the civil rights movement.[36]
The decision highlights the strengths and weaknesses of the Court. For some it was evidence of judicial statesmanship, while others attacked the Court for usurping the powers of the political branches of government. In point of fact, for almost a decade there was much deliberation but very little speed, and little help was forthcoming from the Eisenhower administration. Ingenious devices were employed by state governments and local school boards in the South to avoid compliance, and litigation in the courts posed serious dilemmas for Southern federal judges obliged to take note of the decision.[37] Eisenhower refused to state whether he agreed with the Brown decision, and only when there was violent resistance in 1957 to school desegregation in Little Rock, Arkansas, did he support the authority and supremacy of the federal courts. His action in placing the National Guard under federal authority and sending in troops to ensure peaceful integration was as much a response to the recalcitrance of Governor Faubus as evidence of an executive commitment to aid efforts at desegregation. In 1957 and 1960 Congress passed modest civil rights bills but Eisenhower gave only lukewarm support on both occasions, especially towards efforts to permit legal action by the Department of Justice to seek enforcement of school desegregation. In 1958 and again in 1964 the Court expressed disquiet at the slow pace of compliance with the Brown decision.[38]
Initiatives to speed desegregation through executive action were increased after the election of President Kennedy. Kennedy made a moral and practical commitment to civil rights, but efforts to obtain far-reaching legislation faced serious opposition in Congress from Southern Representatives and Senators. Not until crises occurred over the admission of black students to state universities in Alabama and Mississippi which involved the further use of federal troops did Kennedy move to initiate civil rights legislation in Congress. Only after the assassination of Kennedy did the executive, through the legislative skills of President Johnson, persuade Congress to accept the need for comprehensive action. There followed the Civil Rights Acts of 1964 and 1968, and the 1965 Voting Rights Act.
Beginning with the Brown decision, the Court led by Warren sought for over a decade to eradicate public and private racial discrimination. Acting alone, the Court was largely unsuccessful. A government-department study in 1968 revealed that over 60 per cent of black and white students still attended largely segregated schools. The major developments towards racial justice came belatedly, as a result of action by Congress and the President. Yet it is plausible to argue that little of this would have taken place without the Court initiative of 1954. In this sense the Court forced the elected branches of government to recognize the validity of the claims of significant minorities of Americans to equal protection and guarantee of their rights as citizens. On this, and later on other issues such as legislative reapportionment or equal voting rights,[39] the Warren Court was prepared to respond to perceived injustices in the absence of executive or congressional action. The Court also made a series of controversial decisions defending the rights of individuals in terms of the Bill of Rights, as, for example, when it restricted the operation of the 1940 Smith Act and declared unconstitutional several other measures used to penalize American Communists.[40] Such actions antagonized many interests in society and prompted demands in the 1960s for the impeachment of Warren.
The almost obsessive concern of the Warren Court with equality under and before the law in American society, whether with respect to voting rights, education, or the rights of criminal defendants, led the Court into a situation not unlike that in the early 1930s. Having ventured into the “political thicket”, it soon had to decide complex matters of detail which are best determined by the legislature. Moreover, the Court not only began to assume unusual authority for itself but showed little confidence in, or respect for, the legislative process. In contrast to the 1930s, however, the Presidency was little affected by this arrogation of power by the judiciary. Indeed the Warren Court was only mildly restrictive of the executive, and in general contributed to the expansion of the authority of the modern President. Relations remained cordial in the Kennedy and Johnson administrations, as the Court was often making decisions which had the general support of these administrations. This is not too surprising since the new era of judicial activism was more than just the erratic application of the “preferred position” doctrine initiated by Stone. In essence it was, as Martin Shapiro has indicated, “the history of a political institution working out the implications of the victory of the New Deal coalition and the dominance of the New Deal consensus.”[41] Hence, while the attacks made on the decisions of the Warren Court echoed those made in the 1930s, they came from members of Congress rather than from the White House.[42] President Johnson on one occasion referred to Warren as the greatest Chief Justice of all time, and, as Roosevelt had with Justice Frankfurter, so Johnson sought advice and guidance from Justice Fortas.
Up until 1968 the judicial activism of the Warren Court was by and large on behalf of the current ‘winners’ in national politics. By 1968, as Richard Funston and others have argued, both the President and the Court had become out of step with the attitudes of many Americans.[43] The Court’s lack of public support and prestige at this time was reflected in the Gallup Polls. The Democratic presidential defeat in 1968 was the result not only of reactions against the Vietnam policy of President Johnson abroad and the mixed blessings of his Great Society legislation at home, but also of the defection of Southern Democrats to support Governor Wallace of Alabama, the new symbol of Southern opposition to Brown and other Court decisions.
Opposition also expressed itself in the struggle to appoint a successor to Warren. In June 1968 the Chief Justice informed Johnson of his desire to retire, and the President duly nominated his confidant Fortas, who was also a member of the egalitarian majority on the Court. The Senate, reacting to charges of “cronyism” and questionable extrajudicial activities by the nominee, refused to confirm him. This action may be seen as indicating a loss of credibility for the Warren Court similar to that which had already forced Johnson not to seek renomination by his party in 1968. Fortas was simply an inappropriate choice for Chief Justice by a lame-duck President. The conflict did not end here, for new revelations of doubtful extrajudicial activities led Fortas to resign from the Court in 1969.
Chief Justice Warren withdrew his resignation, and remained on the Court until May 1969. The Fortas debacle was an important reminder of the significance of Supreme Court nominations, and a stimulus to the Senate to use its power to confirm nominations as part of a wider campaign to reassert congressional checks on the executive. The Republicans won the Presidency in 1968, but the Democrats retained control of the House and Senate. The political situation was therefore ripe for conflict between the President and Congress, with the Court as a central issue.
No President since Franklin Roosevelt had made the Supreme Court a major election issue, or raised public expectations that he could and would change the nature and style of judicial decisions, as Richard Nixon did in 1968. In campaign speeches Nixon asserted the need for a Court and a judicial system which looked upon its function as that of interpreting existing laws rather than taking initiatives. He promised to “rebalance” the Court with “strict constructionists”, a label which was politically useful at the time but difficult to define precisely. While Nixon’s notion of “strict construction” was little more than an indirect denunciation of the Warren Court, it struck a responsive chord among voters, especially in the South. Nixon himself hoped for a more subdued Court, one that would go about its constitutional activities with caution and deference to the other branches of government, leaving major national policy decisions to elected politicians, not least to the President.
President Nixon’s first nominee, Warren Burger, was accepted by the Senate following the resignation of Chief Justice Warren. However, Nixon’s undisguised views about the Court led to trouble over his next nominee. In August 1969 Nixon nominated Clement F. Haynsworth, Jr., to fill the position vacated by Fortas. Haynsworth was a Southerner and chief judge of a federal Circuit Court of Appeals. After the Burger appointment, Nixon had been urged to name a Southerner to the Court as final confirmation of his identification with Southern attitudes which had aided his election. Civil rights and labour leaders immediately voiced opposition, but more serious was evidence produced at the Senate hearings on the nomination that Haynsworth had participated in decisions indirectly affecting the welfare of companies in which he had a monetary interest. Opponents of the nomination seized on this evidence, but ultimately it was the reaction of Nixon himself which brought about the rejection of Haynsworth. In the face of demands to withdraw the nomination Nixon became defensive and political. Rather than stressing the competence and the judicial record of his nominee, Nixon chose to reassert his desire to appoint a “conservative”, and put strong and direct pressure on Republican Senators to support him. In the end it was the defections of liberal Republicans and the three top Republican leaders in the Senate that proved critical in denying confirmation. Nixon expressed bitter regret at the decision, but vowed to nominate another candidate with the same “legal philosophy” as Haynsworth.
In January 1970 he nominated G. Harrold Carswell of Florida, another Southern Court of Appeals judge. It seemed unlikely that Republicans would go against their President a second time, especially as there was now no evidence of any ethical improprieties or conflict of interests. However, the situation changed dramatically when the news media disclosed that in 1948 Carswell had made a speech supporting segregation and white supremacy. Carswell immediately repudiated any continuing belief in such ideas, but closer scrutiny of his record revealed damaging evidence of racist actions both personal and on the bench. This, allied to growing criticism from within the legal profession as to his competence as a jurist, strengthened opposition to the nomination.
Once again the President attacked the Senate, asserting that its “advice and consent” authority was a passive constitutional function, and that rejection would impair the constitutional relationship of the President to Congress. He argued that the central issue in the nomination was whether his constitutional responsibility to appoint members of the Court could be frustrated by those who would substitute their own philosophy or subjective judgement. He believed that he was not being accorded the same right of choice as that given to previous Presidents. Nixon’s argument was a weak one, not least in that the Senate had in fact countermanded the choice of the President on twenty-four occasions before it rejected Carswell, often at times of intense political conflict between Presidents and the Senate. It had no foundation either in terms of the intentions of the framers of the Constitution or past practice, and was a deliberate attempt to transform his nominating authority into an exclusive appointing power. It strengthened the resolve of certain Senators to oppose the Carswell nomination, and thirteen Republicans voted against the nomination, which was duly rejected in April. Nixon asserted that the real reason for these rejections remained the fact that his nominees were strict constructionists and Southerners, and vowed that until the Senate was changed he would not nominate another Southerner. Within a week he named Harry A. Blackmun of Minnesota to fill the vacant Court seat. On May 12th the Senate approved the nomination by a 94-0 vote. In 1971 Nixon nominated and obtained the appointment of two other Justices, one (Powell) from Virginia.
The conflict between Nixon and the Senate over the two nominations is significant in demonstrating the importance that President Nixon attached to the need to obtain a Court dominated by Justices with attitudes very different from those which were predominant in the Warren era, and the extent to which he was prepared to claim total executive authority in order to get his nominees accepted. He failed in his latter efforts, though they did serve to consolidate his political support in the South.
In fact, in his first term as President, Nixon nominated more members of the Court in a four-year period than any President since Harding. Also, between 1969 and 1976 Congress approved over a dozen measures which had a direct impact on the functioning of the federal judiciary. They included an increase in the number of federal judges, so that by early in his second term Nixon had appointed more federal judges than any other President, most of them Republican loyalists who shared his belief in total judicial restraint. Despite these changes the results did not match Nixon’s expectations.[44]
While the Supreme Court led by Burger, containing four Nixon appointees, came to behave in ways different from those of the Warren Court, there was no fundamental shift of attitude on the sensitive issue of civil rights. In 1969 Burger wrote a short opinion for a unanimous Court demanding the immediate termination of dual school systems, thus undercutting the efforts of the Nixon administration to encourage a ‘go-slow’ on desegregation; and in 1971 he wrote the majority opinion sustaining the use of busing to eliminate school segregation.[45] Even concerning issues such as law and order or criminal-law procedures, the change was one of degree or emphasis. The decisions of the majority of Justices on race relations, reapportionment, or the rights of criminal defendants constituted neither a new pattern of decision-making nor the erasure of the Warren legacy. Just as Eisenhower’s appointment of Warren had not halted certain trends in Court decisions, so Nixon as President failed to ‘reform’ the Court.
Indeed, the Nixon appointees tended to display a judicial attitude more prudent and conservative than Nixon’s “strict constructionist” rhetoric implied: for they recognized the importance of deferring to established precedents and upholding the legitimacy of the judicial and constitutional processes. Ironically, it was Nixon’s own inability or unwillingness to recognize that the President also might be prudent to observe the constitutional limits on his authority that ultimately destroyed his Presidency.
5: The Court and the Revolutionary Presidency
The origins of the crisis which ended with the resignation of President Nixon lay in the emergence not of an imperious judiciary, but of the imperial Presidency. The latter was essentially the consequence of the complexities of international affairs which produced an unprecedented centralization of decisions over war and peace in the Presidency, and the unprecedented exclusion of others from policy-making. The Vietnam war accelerated this pattern of centralization and exclusion. Arthur Schlesinger, Jr., in explaining these developments, clearly indicates how in the 1960s and early 1970s there emerged an equivalent centralization of power in the domestic political process. Executive claims to inherent and exclusive authority, already triumphant in foreign affairs, now began to be made over domestic matters.[46]
In effect, Nixon began to establish a ‘revolutionary’ or ‘plebiscitary’ Presidency in both foreign and domestic affairs. Schlesinger has outlined the remarkable range of Nixon innovations up to 1972 – his appropriation of the war-making power, his particular interpretation of the appointing power, his unilateral efforts to abolish existing statutory programmes, his enlargement of executive privilege, his theories of impoundment and pocket veto – which together constituted “a new balance of constitutional powers”.[47] Congress rallied to challenge and contest such developments and belatedly to reassert its own constitutional authority, but the 1972 landslide election victory of Nixon seemed to be a vindication of his position.
The Supreme Court did little to halt the growth of the imperial Presidency. It refused to give an opinion on whether the President could constitutionally commit American troops to protracted action in South-East Asia without a congressional declaration of war. In 1968 it rejected a petition regarding the legality of the American involvement in Vietnam, though Justice Douglas wrote a lengthy dissenting opinion arguing that the Court should answer the question whether conscription was constitutional in the absence of a declaration of war. In November 1970 the Court refused to allow the state of Massachusetts to file a suit against the Vietnam War. Again Justice Douglas wrote a dissenting opinion, and was supported by two other Justices.[48] However, in 1971 the Court did give a brief opinion, supported by six Justices, permitting the publication of the so-called Pentagon Papers, and rejecting the arguments of the Nixon administration. The Papers revealed the strategic planning of the Vietnam involvement by the Johnson administration, and their publication dramatized the whole issue of national security and secrecy with respect to information on foreign policy, and of executive prerogatives in this area.[49] The administration was unhappy at this failure to obtain judicial support for its claim of inherent power to protect national security. The Court also gave modest protection to anti-Vietnam War protesters. Nevertheless, if the Court had not fulfilled Nixon’s expectations, up to 1972 it did not seriously embarrass him. Subsequently, however, the courts as a whole rendered verdicts which reflected unease with the claims and attitudes of the Nixon administration.
The 1972 election victory was followed immediately by a series of domestic initiatives by the Nixon administration. Among them were several executive orders demanding that certain funds appropriated by Congress, in particular for antipoverty programmes and environmental protection, should not be spent by the agencies concerned. This was a major escalation of executive impoundment of funds appropriated by Congress for programmes which the Nixon administration did not like. There were precedents for such actions, but their scale and character were a direct challenge to the clear power of the purse granted to Congress by the Constitution. When linked with the earlier refusal to enforce Title VI of the 1964 Civil Rights Act (which required that federal money be withheld from programmes or organizations that discriminated racially) and with Nixon’s use of the pocket veto when Congress was only temporarily adjourned to avoid the passage of legislation which he opposed, such actions were clear evidence that the President claimed unilateral authority to assume or to countermand the legislative power of Congress. Moreover, the selective enforcement of laws was a direct negation of his constitutional responsibility to “take care that the laws be faithfully executed.”
These actions led to a large number of cases in the federal courts, almost all of which went against the executive,[50] though few were decided by the Supreme Court. The steel-seizure case proved to be a useful precedent for the courts at all levels, both with respect to questions of inherent executive authority and over the division of power between the President and Congress. The Supreme Court slowly but surely was drawn into the political conflict because increasingly its constitutional authority, and that of the judicial process, was challenged by the Nixon administration. This was reflected in the decision in United States v. U.S. District Court (1972)[51] where Justice Powell, the new Nixon appointee from the South, rejected administration claims, based on legislation in 1968, that the President had the inherent power to order wiretaps in domestic security matters without judicial authorization.
It is ironic that defenders of a “strict construction” of the Constitution, and the strongest critics of the Warren Court in Congress such as Senator Sam Ervin of North Carolina, now found it necessary to seek judicial support in defending congressional authority in domestic affairs from executive actions. A Supreme Court which Nixon had tried to influence to assume a new role was obliged to delineate the limits of inherent executive authority and privilege in order to protect judicial procedures and the Constitution. It was over the Nixon interpretation of “executive privilege” as it related to the investigations that followed the Watergate break-in of June 1972 that the Supreme Court had a decisive impact on political events.
Executive privilege was not a new doctrine, but its constitutional basis was shaky.[52] In his first term Nixon personally invoked executive privilege on four occasions, resurrecting arguments made (but not accepted) during the Eisenhower administration that the President had “uncontrolled discretion” to keep executive information from Congress. Moreover, Nixon sought to extend this privilege to White House staff and to documentary information. When the committee headed by Senator Ervin to investigate the Watergate break-in and other presidential campaign activities was told in June 1973 of the existence of taped conversations between President Nixon and his staff, it requested access to potentially relevant tapes. The President rejected the request, citing the need for confidentiality of presidential communications and papers, and the committee served two subpoenas on the President, calling for tape recordings of specific conversations. More significantly, the grand jury considering allegations made by one of the men convicted of the Watergate break-in requested the White House to produce specific tapes to help it with its investigation. When the White House refused to produce the tapes, the grand jury directed the Watergate Special Prosecutor Archibald Cox to subpoena the materials. The White House refused to comply with either subpoena, claiming with respect to the latter that the President was not subject “to compulsory process from the courts.”
The Ervin Committee brought suit against the President, but Judge Sirica of the U.S. District Court in Washington, D.C., ruled that Congress had provided no statutory basis for the suit, and the court had no jurisdiction to hear it.[53] The Court of Appeals later ruled that the Ervin Committee did not need the tapes to perform its duties. The subpoena on behalf of the grand jury, however, was considered favourably by Sirica, who directed the President to turn over the tapes to him for private inspection so that he could decide whether their contents were protected by executive privilege. Both sides appealed against the Sirica opinion, and in Nixon v. Sirica[54] the U.S. Court of Appeals for the District of Columbia Circuit upheld Sirica’s directive, making it clear that the President must obey a court order. This case was never appealed to the Supreme Court. In an attempt to escape compliance with the district court’s order President Nixon fired Watergate Special Prosecutor Cox, who had refused to accept a compromise whereby Nixon would prepare a statement based on the subpoenaed tapes which would be verified by a Senator and then submitted to Judge Sirica. The “firestorm” which followed Nixon’s action, and the resignations of the Attorney General and his Deputy in protest, forced Nixon to announce that he would comply with the subpoena. Later it was disclosed that two of the subpoenaed tapes did not exist, and that there were gaps in other tapes.
In November 1973 the President appointed a new Special Prosecutor, Leon Jaworski, and in January 1974 announced that he had voluntarily provided him with all the material necessary to conclude investigations. In February the House of Representatives initiated possible impeachment proceedings against the President. In March the Watergate grand jury indicted several former members of the Nixon administration for conspiracy and obstruction of justice in the Watergate cover-up, and named Nixon as an unindicted coconspirator. In April, Jaworski asked Sirica to issue a subpoena for sixty-four tapes of White House conversations and other papers which he believed were necessary to produce a case against the Watergate cover-up defendants. Sirica issued a subpoena, ordering the President to produce these materials for judicial inspection to determine their relevance. White House attorneys asked that the subpoena be withdrawn, arguing that the President was immune from such orders. Nixon personally invoked executive privilege as protection against the subpoena, claiming that further disclosures “would be contrary to the public interest.” Sirica denied the request, rejecting both the argument that the courts could not resolve such an issue and the claim for an absolute executive privilege. The President sought review in the court of appeals, but Jaworski filed a petition directly to the Supreme Court. The Court accepted it, citing the steel-seizure case as a precedent for resolving the question promptly, and agreed to hear the case of United States v. Richard M. Nixon.[55]
Quite apart from the remarkable situation at the time, with the President under threat of impeachment, the Court’s decision was unavoidable but risky. Both before and after oral argument before the Court, the spokesmen for the President refused to affirm that he would obey an order to turn over the tapes. The implication remained that if the President did not agree with the “guidance” given by the Court, he might not feel obliged to obey its ruling. This may account for the fact that the Court came to a unanimous opinion, and gave special emphasis to the obligations of public officials to preserve the integrity of the criminal-justice system.
Hence a Court containing four Nixon appointees had little hesitation in re-affirming that it was the duty of the courts to state what the law is. The Court opinion delivered by Chief Justice Burger rejected the President’s contention that the issue was an intra-branch dispute, and asserted that the Special Prosecutor had standing to bring the case, that a justiciable controversy existed, and that the Special Prosecutor had made a strong enough case to justify a subpoena before the actual trial. The Court considered the claim that the separation-of-powers doctrine precluded judicial review of the President’s claim of privilege, and affirmed that such a claim of absolute privilege, if invoked regarding a criminal prosecution, would itself violate the separation of powers by preventing the judiciary from performing its duties. Citing the steel-seizure precedent, the Court noted that it had held other exercises of executive authority unconstitutional and that it could not permit the President to be the judge of his own privilege. The Court emphasized the interdependence as well as the separateness of the branches of government. Neither separation of powers nor confidentiality could sustain an absolute presidential privilege of immunity from the judicial process in all circumstances. “Absent a claim of need to protect military, diplomatic or sensitive national security secrets”, the Court could not accept that the production of presidential communications for in camera inspection by the district court would significantly diminish confidentiality. The Court asserted that in this situation, relating to a possible criminal case involving executive officials, the courts’ judgement of the public interest should prevail over that of the President: “The generalized assertion of privilege must yield to the demonstrated, specific need for evidence in a pending criminal trial.” Referring to Chief Justice Marshall’s views in the Burr case, the Court went on to reaffirm the need for the district court to give presidential communications special protection and confidentiality, but it left no doubt that the President should provide the material requested.
Hence the Supreme Court established that presidential communications in certain specific situations did not enjoy absolute privilege, while conceding that in certain other circumstances such a privilege might exist. In this sense it was a major defeat for President Nixon, but not necessarily for the Presidency, especially with respect to dealings with Congress. Its immediate effect was to force the President to give up certain tapes, which in turn prompted the House of Representatives to draw up articles of impeachment. The public release of three tapes that Nixon was now compelled to turn over to Sirica clearly revealed his participation in the cover-up. Faced with certain impeachment, and probable conviction and removal from office, Nixon resigned on August 8th, 1974. Any doubts as to the authority of Court decisions were consigned by Watergate to the same fate as befell President Nixon.
6: Epilogue
The Nixon Presidency revealed that the Court remains an important check on ‘revolutionary’ executive initiatives in the area of domestic affairs. It also suggested that the Court is singularly difficult to control through the appointment process. The Court has had little opportunity but also little apparent incentive to restrain the Presidency with respect to foreign affairs and national security, and may even have delineated an area where executive privilege might be legitimate. Hence, while the Court may have helped to create the modern Presidency, it still exerts a potent if erratic influence on the parameters of presidential authority. Moreover, one of the significant factors in United States v. Nixon was Nixon’s challenge to judicial review, and with specific reference to Marbury v. Madison the Court re-emphasized its duty to uphold the law and the Constitution. The Court can also take political initiatives in furtherance of its perceived responsibilities, but such action is likely to be controversial and inconsistent with the canons of practice it has required of Congress and the President.
The main areas of interaction between the President and the Court include judicial appointments and the delicate issue of court-packing, acceptance of the Court’s interpretation of the Constitution as expressed in judicial review, and executive compliance with, and enforcement or non-enforcement of, judicial decisions. In consequence the Court, like Congress or the President, can at times be portrayed as the Saviour of the Republic, or the usurper of authority vested elsewhere. An important difference, however, is that the Court is a non-elected body. This may be desirable for a supreme judicial body, but if the Court can legitimately claim to be the “ultimate interpreter” of the Constitution as it relates to the authority conferred on the elected branches of government, it behoves the Court to resist the temptation to make policies.
The evidence suggests that, after the confrontation with Franklin Roosevelt and up to 1968, the Court was in tune with public opinion in giving limited support to the executive. It combined a sensitivity to the burdens of responsibility for foreign policy and national security with the courage to set limits on executive authority at home if it infringed or usurped legislative or judicial powers. Executive abuses of authority in foreign affairs may be as much the result of congressional cowardice as of benign judicial neglect.
Two persistent trends stand out. The first is the extent to which the Court, sometimes against its best judicial instincts, has legitimized expansions of national governmental and executive authority deemed to be necessary political responses to particular situations; in effect, it has tried to balance constitutional stability against constitutional change consistent with shifts in public attitudes and political demands. The second trend is the will to limit or refine the scope of such expansion if it is challenged as conflicting with basic constitutional principles or as interfering with the judicial process itself. Both of these trends were evident in the example set by Chief Justice Marshall. If there has been any consistency of approach, it has been in the interpretation of the separation of powers as a formal separation requiring comity and co-operation between institutions if it is to work, but on occasions leading to political conflicts which must ultimately be resolved by judicial decision. Such decisions will include judgements about the legitimate scope of national governmental authority as well as the authority of particular institutions.
The behaviour of the Court towards the Presidency since 1933 has been in accord with the ‘creative tension’ built into the Constitution by the combination of the separation of institutions and a system of checks and balances. The President may act, but the Court provides an opportunity to check such action – and who prevails may depend in the end on public opinion. On most occasions major issues are resolved by the President and Congress, but the former depends more than the latter on the Court to legitimize any extensions of authority. In foreign affairs and times of ‘war’ the Court, for good or ill, has rarely embarrassed Presidents. Where particular Presidents have claimed authority or privileges at home which have been challenged as infringements on or a negation of the constitutional obligations of other branches of government, the Court has been less charitably disposed. Hence the state of the Presidency in the future, as in the past, will be conditioned by the potential impact of Supreme Court decisions.
7: Guide to Further Reading
There are many general and specific books on the Supreme Court and on the Presidency. This guide is intended to suggest material which will amplify the particular relationships discussed in the text. In order to understand the Court, there is no substitute for reading the Court’s own opinions; hence specific citations have been given, in the Notes, to the United States Reports (U.S.), the Federal Reporter, Second Series (F.2d), and Federal Supplement (F.Supp.). The classic study of the Court up to 1918 is Charles Warren, The Supreme Court in United States History, 2 vols. (1922; rev. ed., Boston: Little, Brown, 1926). A detailed historical analysis is provided by Alfred H. Kelly and Winfred A. Harbison, The American Constitution: Its Origins and Development (New York: W. W. Norton, 1948; rev. eds., 1955, 1970). Perhaps the most valuable historical summary is Robert G. McCloskey, The American Supreme Court (Chicago: Chicago UP, 1960). A succinct introduction to the contemporary Court is Archibald Cox, The Role of the Supreme Court in American Government (New York: Oxford UP, 1976), while Robert H. Jackson, The Supreme Court in the American System of Government (New York: Harper & Row, 1963), is a cogent statement by a respected member of the Court. A most useful general study of the Court is Richard Funston’s A Vital National Seminar (1978).[6]*
The classic constitutional work on the Presidency remains Edward Corwin’s The President: Office and Powers (1940; 4th ed., 1958).[29] Among the many studies of the contemporary Presidency, the following are probably the most useful in helping to understand its various facets: Louis Koenig, The Chief Executive (New York: Harcourt, Brace, Jovanovich, 3rd ed., 1975), Thomas E. Cronin, The State of the Presidency (Boston: Little, Brown, 1974), and Arthur Schlesinger, Jr., The Imperial Presidency (1974).[2]
Two books deal directly with relations between the Supreme Court and the Presidency: Glendon A. Schubert, Jr., The Presidency in the Courts (Minneapolis: Minnesota UP, 1957), and Robert Scigliano, The Supreme Court and the Presidency (1971);[4] the latter is more historically based and includes a perceptive discussion of appointments and performances in the light of presidential expectations. On judicial appointments, the major general work is Henry J. Abraham, Justices and Presidents: A Political History of Appointments to the Supreme Court (New York: Oxford UP, 1974). Studies of specific aspects of the selection process include Joseph P. Harris, The Advice and Consent of the Senate (Berkeley: California UP, 1953), and Joel B. Grossman, Lawyers and Judges: The ABA and the Politics of Judicial Selection (New York: Wiley, 1965), while David J. Danelski, A Supreme Court Justice is Appointed (New York: Random House, 1964), is an excellent case-study of the appointment of Butler in 1922 – and provides an interesting comparison to the discussion of the Carswell nomination in Richard Harris, Decision (New York: E.P. Dutton, 1971). The Fortas affair is assessed in Robert Shogan, A Question of Judgment (Indianapolis: Bobbs-Merrill, 1972).
Most text books on the Supreme Court discuss judicial review and the concept of “political questions”, but detailed analysis can be obtained from Alexander M. Bickel, The Least Dangerous Branch: The Supreme Court at the Bar of Politics (Indianapolis: Bobbs-Merrill, 1962), and Philippa Strum, The Supreme Court and “Political Questions”: A Study in Judicial Evasion (University, Ala.: Alabama UP, 1974). A good summary case against judicial review is made in Charles S. Hyneman, The Supreme Court on Trial (New York: Atherton Press, 1963), Pt. 2, and a trenchant attack in Louis B. Boudin, Government by Judiciary, 2 vols. (New York: William Goodwin, 1932).
The conflict between Franklin Roosevelt and the Court has been assessed from a number of perspectives. Edward S. Corwin, Constitutional Revolution, Ltd. (Pomona, Cal.: Claremont Colleges, 1941), discusses the changing constitutional doctrines and their acceptance by the Court after 1937. Stimulating analysis of the Court in the 1930s and 1940s is contained in C. Herman Pritchett’s The Roosevelt Court (1948)[20] and Alpheus T. Mason’s The Supreme Court from Taft to Burger (1979).[18] Joseph Alsop and Turner Catledge, The 168 Days (Garden City, N.Y.: Doubleday Doran, 1938), is a contemporary account of the Court-packing struggle, while Robert H. Jackson’s The Struggle for Judicial Supremacy (New York: Knopf, 1941) assesses the crisis from the viewpoint of a protagonist of the President. The drama of the event is well portrayed in James M. Burns, Roosevelt: The Lion and the Fox (New York: Harcourt, Brace, 1956), while William Leuchtenburg’s essays (1966, 1969)[19] provide new insights into the Court-packing plan. The Roosevelt-Hughes clash and the later influence of Stone are put into historical context in Alpheus T. Mason, The Supreme Court: Palladium of Freedom (Ann Arbor: Michigan UP, 1962).
Maeva Marcus, Truman and the Steel Seizure Case (1977),[30] is an exhaustive study which also assesses the implications of the case for the Presidency today, and has an extensive bibliography, while Alan F. Westin, The Anatomy of a Constitutional Law Case (New York: Macmillan, 1958), is another valuable study of the case. Clinton Rossiter, The Supreme Court and the Commander in Chief (1951),[29] provides important information on the Court’s handling of ‘war’ problems, and is also available with additional text on the more recent period by Richard P. Longaker (Ithaca, N.Y.: Cornell UP, 1976).
The Warren Court has been the subject of a range of studies, some supportive, others critical. Among the best of the former is Archibald Cox, The Warren Court: Constitutional Decision as an Instrument of Reform (Cambridge, Mass.: Harvard UP, 1968). Mildly critical and eminently readable is Philip Kurland, Politics, the Constitution and the Warren Court (1970).[42] Alexander M. Bickel, The Supreme Court and the Idea of Progress (New York: Harper & Row, 1970), is a thoughtful critique, while Richard Funston’s Constitutional Counterrevolution? (1977)[43] is a detailed and controversial commentary on the Court under Warren and Burger.
The Brown case is analysed in great detail and with much literary panache in Richard Kluger’s Simple Justice (1976),[36] which also has extensive bibliographical information. The consequences of this decision are analysed in detail in J. Harvie Wilkinson III, From Brown to Bakke: The Supreme Court and School Integration: 1954-1978 (New York: Oxford UP, 1979), while the style and attitudes of the Warren Court are criticized in Richard Maidment’s articles (1975, 1977)[35] and in Raoul Berger, Government by Judiciary: The Transformation of the Fourteenth Amendment (Cambridge, Mass.: Harvard UP, 1977). The difficulties of implementing the decision are discussed in Jack Peltason’s Fifty-Eight Lonely Men (1961).[37] For executive reactions to civil rights issues, Ruth P. Morgan, The President and Civil Rights (New York: St. Martin’s Press, 1970), is brief but useful.
The Court’s reticence regarding the war in Vietnam is analysed critically in Anthony D’Amato and Robert O’Neil, The Judiciary and Vietnam (1972),[48] and the President’s claims to war-making authority challenged in Jacob Javits, Who Makes War (New York: Morrow, 1973). The attempt by Nixon to ‘pack’ the judiciary is discussed in detail in James Simon’s In His Own Image (1973),[44] though the analysis is rather shaky. Raoul Berger’s Executive Privilege (1974)[52] is a devastating and scrupulously documented attack on the myth of executive privilege.
Several sources provide detailed discussion of the legal consequences of Watergate. The five-volume collection edited by A. Stephen Boyan, Jr., Constitutional Aspects of Watergate: Documents and Materials (Dobbs Ferry, N.Y.: Oceana Publications, 1976-79), contains extensive information on executive privilege and other legal issues. The crucial role of the federal courts in the Watergate affair is examined in Howard Ball, No Pledge of Privacy: The Watergate Tapes Litigation (Port Washington, N.Y.: Kennikat Press, 1977), while the documentary material on United States v. Nixon, including the oral argument before the Supreme Court, is available in Leon Friedman, ed., United States v. Nixon: The President Before the Supreme Court (New York: Chelsea House, 1974). The related congressional investigations are examined in James Hamilton, The Power to Probe: A Study of Congressional Investigations (New York: Random House, 1976), and a stimulating assessment of the constitutional crisis is contained in Philip Kurland, Watergate and the Constitution (1978).[3]
*For full bibliographical details, see the appropriate reference in the Notes as indicated.
8: Notes
1. See Ira H. Carmen, Power and Balance: An Introduction to American Constitutional Government (New York: Harcourt Brace Jovanovich, 1978), pp. 170-224.
2. Arthur M. Schlesinger Jr., The Imperial Presidency (Boston: Houghton Mifflin, 1973, and London: Andre Deutsch, 1974).
3. See Philip B. Kurland, Watergate and the Constitution (Chicago: Chicago UP, 1978).
4. Robert Scigliano, The Supreme Court and the Presidency (New York: Free Press, 1971).
5. Marbury v. Madison, 5 U.S. (1 Cranch) 137. This standard form of notation refers to the United States Reports, vol. 5, p. 137, though for the early years of the Court the volumes are often referred to by the name of the current official Reporter, in this case William Cranch’s first volume.
6. On the significance of judicial review, see Richard Y. Funston, A Vital National Seminar: The Supreme Court in American National Life (Palo Alto, Cal.: Mayfield Publishing Co., 1978), pp. 1-32.
7. Scigliano, p. 35.
8. Arguably, this was true even in the so-called “Progressive era”: see J.A. Thompson, Progressivism, the second pamphlet in this series, esp. pp. 39-40.
9. In re Neagle, 135 U.S. 1.
10. In re Debs, 158 U.S. 564.
11. Myers v. United States, 272 U.S. 52.
12. See Harris Warren, Herbert Hoover and the Great Depression (New York: Oxford UP, 1959). For the New Deal period, see William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal, 1932-1940 (New York: Harper & Row, 1963).
13. 293 U.S. 388.
14. Schechter v. United States, 295 U.S. 495.
15. 295 U.S. 602. The decision was qualified later in Morgan v. Tennessee Valley Authority, 312 U.S. 701 (1941).
16. 297 U.S. 1.
17. Carter v. Carter Coal Co., 298 U.S. 238.
18. Alpheus T. Mason, The Supreme Court from Taft to Burger (Baton Rouge: Louisiana State UP, 1979), p. 97. This is a revised and enlarged edition of id., The Supreme Court from Taft to Warren (ibid., 1958).
19. W.E. Leuchtenburg, “The Origins of Franklin D. Roosevelt’s ‘Court-Packing’ Plan,” in Philip B. Kurland, ed., The Supreme Court Review, 1966 (Chicago: Chicago UP, 1966), pp. 347-400. See also id., “Franklin D. Roosevelt’s Supreme Court ‘Packing’ Plan,” in Harold M. Hollingsworth and William F. Holmes, eds., Essays on the New Deal (Austin: Texas UP, 1969).
20. C. Herman Pritchett, The Roosevelt Court: A Study in Judicial Politics and Values, 1937-1947 (Chicago: Quadrangle, 1948).
21. 304 U.S. 144.
22. 67 U.S. (2 Black) 635.
23. United States v. Curtiss-Wright Export Corporation, 299 U.S. 304.
24. In United States v. Belmont, 301 U.S. 324 (1937), and United States v. Pink, 315 U.S. 203 (1942), the Court tacitly accepted that executive agreements with foreign governments, entered into by the President alone, could be enforced in the courts as internal law, and hence be legally analogous to treaties.
25. 17 Fed. Cases 144 (No. 9487).
26. Hirabayashi v. United States, 320 U.S. 81 (1943); Korematsu v. United States, 323 U.S. 214 (1944); Ex parte Endo, 323 U.S. 283 (1944). Also in 1944 the Court by a narrow majority reversed a conviction under the Espionage Act of 1917 (Hartzel v. United States, 322 U.S. 680).
27. 341 U.S. 494.
28. 343 U.S. 579 (1952).
29. See Edward S. Corwin, The President: Office and Powers, 1787-1948 (1940; 3rd ed. rev., New York: New York UP, 1948); C.H. Pritchett, “The President and the Supreme Court,” Journal of Politics, 11 (1949), 80-92; and Clinton Rossiter, The Supreme Court and the Commander in Chief (1951; rept., New York: Da Capo Press, 1970).
30. For the background to the case and analysis of the written opinions, see Maeva Marcus, Truman and the Steel Seizure Case: The Limits of Presidential Power (New York: Columbia UP, 1977), pp. 102-227.
31. Schlesinger, Imperial Presidency, p. 143.
32. For the outcome of the steel dispute, see Marcus, pp. 249-56.
33. 347 U.S. 483.
34. Plessy v. Ferguson, 163 U.S. 537.
35. See R.A. Maidment, “Changing Styles in Constitutional Adjudication: The United States Supreme Court and Racial Segregation,” Public Law (1977), 168-86, and id., “Policy in Search of Law: the Warren Court from Brown to Miranda,” Journal of American Studies, 9 (1975), 301-20.
36. See Richard Kluger, Simple Justice (New York: Knopf, 1976, and London: Andre Deutsch, 1977).
37. See Robbins L. Gates, The Making of Massive Resistance (Chapel Hill: North Carolina UP, 1964), and Jack W. Peltason, Fifty-Eight Lonely Men (New York: Harcourt, Brace, 1961).
38. Cooper v. Aaron, 358 U.S. 1, and Griffin v. County School Board of Prince Edward County, 377 U.S. 218.
39. See Baker v. Carr, 369 U.S. 186 (1962).
40. See in particular Yates v. United States, 354 U.S. 298 (1957). In the 1960s the emphasis shifted to the defence of the rights of criminal defendants and religious minorities.
41. M. Shapiro, “The Supreme Court: From Warren to Burger,” in Anthony King, ed., The New American Political System (Washington, D.C.: American Enterprise Institute, 1978), p. 193.
42. For development of these arguments, see Philip B. Kurland, Politics, the Constitution and the Warren Court (Chicago and London: Chicago UP, 1970), pp. 21-50 and 98-169.
43. Richard Y. Funston, Constitutional Counterrevolution? The Warren Court and the Burger Court: Judicial Policy Making in Modern America (Cambridge, Mass.: Schenkman, 1977), pp. 297-325.
44. For a different interpretation, see James F. Simon, In His Own Image: The Supreme Court in Richard Nixon’s America (New York: McKay, 1973).
45. Alexander v. Holmes County Board of Education, 396 U.S. 19, and Swann v. Charlotte-Mecklenburg Board of Education, 402 U.S. 1.
46. Schlesinger, Imperial Presidency, Ch. 8.
47. Ibid., p. 252.
48. See Anthony A. D’Amato and Robert M. O’Neil, The Judiciary and Vietnam (New York: St. Martin’s Press, 1972).
49. New York Times v. United States, 403 U.S. 713 (1971).
50. On impoundment, see Train v. City of New York, 420 U.S. 35 (1975), and City of New York v. Ruckelshaus, 385 F. Supp. 669 (D.D.C. 1973); on the pocket-veto power, see Kennedy v. Sampson, 511 F.2d 430 (D.C. Cir. 1974); on the dismantling of the Office of Economic Opportunity, see Williams v. Phillips, 360 F. Supp. 1363 (D.D.C. 1973).
51. 407 U.S. 297. This decision was reinforced in United States v. Giordano, 416 U.S. 505 (1974).
52. See Raoul Berger, Executive Privilege: A Constitutional Myth (Cambridge, Mass.: Harvard UP, 1974).
53. Senate Select Committee on Presidential Campaign Activities v. Nixon, 366 F. Supp. 51 (D.D.C. 1973).
54. 487 F.2d 700 (D.C. Cir. 1973).
55. 418 U.S. 683 (1974).
J.A. Thompson, Progressivism
BAAS Pamphlet No. 2 (First Published 1979)
ISBN: 0 9504601 1 7
- Introductory
- Progressive Ideology: The Challenge to Laissez-Faire
- Pressures for Reform
i The Humanitarian Impulse
ii The Quest for Efficiency
iii The Upholding of Traditional Values
iv The Role of Business Interests
v The Urban Working Class
- In Conclusion
i Was There A ‘Progressive Movement’?
ii Was There A ‘Progressive Era’?
- Guide to Further Reading
- Notes
British Association for American Studies All rights reserved. No part of this pamphlet may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without permission in writing from the publisher, except by a reviewer who may quote brief passages in a review. The publication of a pamphlet by the British Association for American Studies does not necessarily imply the Association’s official approbation of the opinions expressed therein.
1: Introductory
Perhaps because it extends to little more than two hundred years in all, the history of the United States tends to be segmented into comparatively short periods or ‘eras’, each of which is accorded some distinctive character. Thus the early part of the twentieth century has become known as ‘the Progressive Era’. The description reflects the perception of many contemporaries who saw the period from about 1900 to the First World War as being dominated by a ‘Progressive movement’ that sought to curb the power of large business interests, to purify the political process and make it more responsive to ‘the people’, and to extend the functions of government in order to protect the public interest and relieve social and economic distress.
This movement was seen as having arisen first in the 1890s in some cities. It then moved on to the state level with the election of governors like Robert M. La Follette, whose legislative programme made Wisconsin a model for other reformers to follow. It gained a national voice after Theodore Roosevelt became President in 1901, although little legislation was passed until his second administration (1905-09). When his successor, William Howard Taft, proved less sympathetic to reform, Roosevelt exploited the dissatisfaction of reformers and ‘insurgent’ Republicans in an attempt to gain the party’s nomination for a third term. Defeated by Taft, he still ran in 1912 as the nominee of the new Progressive (or ‘Bull Moose’) Party. The division of the Republicans made possible the election of the Democrats’ reform-minded candidate, Woodrow Wilson, after an election campaign which saw a major debate on the issues of Progressivism. “It was a time of national restlessness and awakening,” wrote Frederic Austin Ogg in 1918, “of sharp reaction against the old order in business, politics, and government which was fastened upon the preoccupied and unsuspecting nation in the great epoch of material prosperity from the late seventies to 1890 . . . . The reaction set in slowly in the first Roosevelt administration; in the second it gathered momentum and achieved important results; under Taft it lagged, at least within government circles; under Wilson it swept on irresistibly, forcing vested interests under rigorous control, pouring light into darkened corners, and opening the way for more direct and effective popular rule.”[1]
The simple outlines of this picture have been much blurred by more recent historiography. Closer study of those actively involved in promoting Progressive reforms at all levels has revealed some ambiguity in their objectives and motives and great diversity in their priorities and programmes. The extent to which many looked backward to the economic conditions and moral values of an earlier age was emphasized by Richard Hofstadter and George Mowry.[2] Other historians, notably Samuel P. Hays, have stressed the influence of interest groups and argued that, despite the democratic rhetoric in which they were advocated, many of the reforms—such as those in the structure of municipal government—were elitist in their consequences.[3] A more sceptical analysis of the creation of economic regulatory agencies—like the Federal Trade Commission and the Federal Reserve Board—has suggested that business ‘interests’, far from being the passive targets of the Progressive movement, were both its principal beneficiaries and its most effective component. So much so, indeed, that Gabriel Kolko has seen the period as representing “the triumph of conservatism”, when the “political capitalism” that characterized modern America took shape.[4]
These various waves of revisionist historiography have raised two broad questions about Progressivism. The first is whether there was anything sufficiently united in its programme, motives and social composition to justify the term ‘Progressive movement’. The second is to what extent the history of the early twentieth century in America was shaped by the activities of such a movement, or even of disparate groups of progressive reformers. In other words, what, if anything, was distinctively progressive about the ‘Progressive Era’? It is upon these broad questions that this pamphlet will focus.
As with other issues in American history (for example, ‘isolationism’), the real historical complexities of Progressivism have been compounded by semantic difficulties. There are no agreed criteria for establishing who ‘the Progressives’ were. Unlike the Populists, they cannot be defined by support for a particular political party. The Progressive Party founded in 1912 was primarily the vehicle for Roosevelt’s independent campaign for the Presidency. Although some who were most active in its organization were dedicated reformers who compared themselves with those who launched the Republican party in the 1850s, the vast majority of those who voted for Roosevelt in 1912 voted for no other candidate of the Progressive party either at that time or any other, and cannot be assumed to have supported the party’s programme. On the other hand, many who never joined the Progressive party, including some Republicans as well as many Democrats, had views that seem unquestionably ‘Progressive’. The matter is further confused by the common use of the two terms ‘Progressive movement’ and ‘Progressive Era’. To some historians, Progressives are those who participated in a movement that had certain definable objectives—and was not necessarily confined to a particular historical period.[5] Others use the term to describe almost everyone who was engaged in political or quasi-political activity in the period 1900-14—at least if they sought any changes in laws or institutions. When viewed in this way, Progressivism no longer seems to describe an identifiable set of political attitudes.
In order to provide a more precise meaning for the term, this pamphlet begins with an account of the ideas of the most prominent and articulate reformers and social critics of the period. It then turns to an analysis of the various social forces that contributed to the political pressure for reform, and considers the relationship between each of these forces and the aspirations of Progressive publicists. The question of how far it makes sense to speak of a ‘Progressive movement’ involves an examination of the relationship of these forces to each other, particularly in the arena of politics. They were not naturally compatible, but an uneasy, and usually temporary, harmony was facilitated by a public mood generally sympathetic to calls for reform. This mood was the most strikingly distinctive characteristic of the ‘Progressive Era’, and its rather superficial and transient quality, together with the diversity of the impulses and objectives involved, does much to account for the very limited extent to which the hopes of Progressive reformers were actually realized during this period.
The Background
At the outset, however, we must recognize that the Americans of this period found themselves in the midst of great changes. Between the 1870s and the 1920s the United States was transformed by the related processes of industrialization, urbanization and immigration. In the thirty years preceding 1900 the American population grew from less than forty million to over seventy-five million. The nation’s production of bituminous coal increased ten times, of crude petroleum twelve times, of steel ingots and castings more than 140 times. The number of Americans living in ‘urban areas’ (i.e. places with a population of over 2,500) rose from ten to thirty million, and these aggregate figures do not reveal the spectacularly rapid rate of growth of some large cities, particularly in the Middle West. Chicago’s population, for example, increased from 503,000 to 1,000,000 in the single decade of the 1880s. But rural America too was growing—if not so rapidly. The number of farms, as well as the number of acres under cultivation, had doubled between 1870 and 1900, and the production of wheat, cotton and corn had increased from two to two-and-a-half times. At the turn of the century, three-fifths of Americans still lived in rural areas and over a third of all those gainfully employed worked on farms.
This material progress was bound up with some developments that seemed disturbing to many Americans. One of these was the rise of ‘the trusts’; this term, originally a technical one, came to be applied to all those large corporations that enjoyed monopolistic or oligopolistic market power, or simply possessed great financial resources. They were usually the product of some form of amalgamation. The turn of the century saw a ‘merger boom’. Whereas from 1887 to 1897 there had been only eighty-six industrial combinations with a total capital of less than $1,500 million, in the five years from 1897 to 1902, 2,653 independent firms in manufacturing and mining disappeared into combinations with a total capital value of $6,320 million. The United States Steel Corporation alone was capitalized at $1,370 million when it was created in 1901 by the investment banker J.P. Morgan. There was widespread concern about the political as well as the economic implications of this process of concentration.
A second source of anxiety was the widening of social divisions. Americans, with the exception of a few Southerners, had traditionally prided themselves on the absence in their country of the sort of class structure that existed in Europe. But there were signs at the turn of the century of the emergence of a self-conscious working-class movement. During the 1890s there had been some bitterly fought industrial conflicts, notably the Homestead steel strike of 1892 and the Pullman railway boycott of 1894. The first durable national organization of trade unions was the American Federation of Labor, founded in 1886. Its membership grew strikingly between 1897 and 1904, from 265,000 to 1,676,000.
If America was not, after all, to be exempt from the same tensions that other industrial societies suffered, she was also to experience some that were less common. These were the result of the scale and character of immigration. Between 1880 and 1914 almost twenty-three million people entered the United States. Fairly constantly between 1890 and 1920, around 15 per cent of the American population were foreign-born, and a further 25 per cent were children of at least one foreign-born parent. Immigration reached a climax in the decade 1905-14, during which over ten million people entered the country. Not only did the newcomers arrive at this time in larger numbers than ever before but they seemed to be more alien than their predecessors. About 70 per cent of them came from Eastern or Southern Europe rather than from Britain, Ireland and Germany, the three countries which until the 1890s had accounted for around 85 per cent of all American immigrants. These ‘new immigrants’ were Catholic, Jewish or Orthodox in religion rather than Protestant.
Almost as disturbing to assumptions of cultural homogeneity, perhaps, was the continuing ethnic consciousness and increasing self-confidence displayed by the descendants of an earlier wave of immigration—the one that had poured in during the late 1840s and early 1850s, principally from Ireland and Germany. By 1900 politicians of both parties in the Middle West had learned through painful experience the importance of not offending the susceptibilities of German-American Catholics and Lutherans, while Irish Catholics, through their control of the Democratic party, dominated the politics of several cities and towns in the North-eastern states.
It was in the cities that the problems of progress were most obvious. The ‘new immigrants’ tended to congregate there. In 1910 the foreign-born and their children comprised more than two-thirds of the population of most major cities in the Northeast and more than three-quarters of that of New York, Boston and Chicago. Many of them lived in appallingly squalid and overcrowded conditions. Basic municipal services could not cope with the growth. As late as 1900 such cities as Baltimore and New Orleans had no sewers at all, while two-thirds of Chicago’s streets were mud. Fire and disease were constant hazards. The formal system of government, reflecting the traditional American distrust of power, provided for a large number of elected officials none of whom enjoyed much authority. In practice power was exercised in most cities by a political machine, usually headed by an identifiable ‘boss’. This system bred corruption, particularly in the grant of franchises to private corporations to provide public services—street railways (trams), gas, water, telephones, etc. These franchises were worth large sums of money, and returns on such investment were generally not taxed.
The power of the trusts, the condition of the cities, and the apparent threats to the homogeneity and cohesion of society—these were some of the more obvious problems generated by the speed and character of American economic development. The responses to them of different groups of Americans did much to shape the issues and alignments of politics in the ‘Progressive Era’.
2. Progressive Ideology: The Challenge to Laissez-Faire
It is in the realm of ideas that the identity of Progressivism is least in doubt. In the early twentieth century a number of writers and publicists devoted considerable critical attention to the consequences and concomitants of the sort of capitalist economic development the United States was experiencing. Though the particular ills diagnosed and remedies prescribed varied from writer to writer, there was sufficient agreement for them to be regarded as exponents of a common body of thought, and their writings provide the best means of defining what Progressivism meant at the time.
Such criticism of contemporary society was, of course, not new. In the later nineteenth century large readerships had been attracted by such different works as Henry George’s heterodox economic tract Progress and Poverty (1879), Edward Bellamy’s Looking Backward (1887), a fictionalized picture of a collectivist utopia, and Henry Demarest Lloyd’s hostile history of the Standard Oil Company, Wealth Against Commonwealth (1894). But the early twentieth century witnessed a great outpouring of social criticism. For about a decade following 1903, several popular magazines published ‘muckraking’ articles in which journalists like Ray Stannard Baker, Charles Edward Russell and Lincoln Steffens made their reputations by exposing examples of political corruption, financial chicanery and economic exploitation. A spate of books by such publicists as Frederic C. Howe and William Allen White argued the case for various kinds of reform, while other authors, notably the Socialists Robert Hunter and John Spargo, produced documented studies of the extent and character of urban poverty. A few years later Herbert Croly, Louis Brandeis, Walter Weyl and Walter Lippmann were among those who sought to provide more comprehensive analyses of what needed to be done if what Croly called “the promise of American life” was to be redeemed. For all their differences, there was general agreement among these writers on the need for some extension of the role of government in regulating economic activity. This challenge to the doctrine of laissez-faire, which had for long enjoyed both intellectual prestige and apparent popular assent, was a defining characteristic of Progressivism.
This sort of intellectual reaction to industrial capitalism was not, of course, an American monopoly. In Europe at this time, not only were various forms of Socialism gaining support in a number of countries but the movement towards collectivism was being promoted by such diverse agencies as the Imperial German government and the British Liberal party. Indeed, one could well argue that what was more peculiarly American was the deep and persistent appeal of economic individualism. In the late nineteenth century this owed most of its intellectual authority to such theories as that form of evolutionary sociology known as Social Darwinism and to the laws of classical economics. It was institutionally buttressed by the narrow view taken by the nation’s judges, particularly those on the Supreme Court, of the powers granted to government in the United States Constitution. But it derived most of its popular support from the related beliefs that in America, unlike the countries of the Old World, opportunity for economic advancement was open to all, and that the degree of an individual’s achievement depended primarily upon his own character, talents and efforts. By combining a faith in the reality of equal opportunity with the conviction that the qualities that led to worldly success were above all such virtues as industry, sobriety, frugality and honesty, this complex of assumptions was able to enlist the two great sources of moral authority in American thought—the liberal democratic tradition and Protestant Christianity.
The attack on laissez-faire proceeded on the same variety of fronts as its defence. Social scientists, notably Lester Ward, questioned the assumptions of Social Darwinism by pointing out that even in nature the ‘survival of the fittest’ applied to species as well as individuals and did not exclude mutual aid, and also by emphasizing the possibility of conscious social action that transcended the purposeless character of biological evolution. Economists like Richard T. Ely, Simon N. Patten and Thorstein Veblen, each in a distinctively individual way, sought to modify the laissez-faire tradition of their discipline by emphasizing the theoretical status and historically relative nature of its ‘laws’, and by suggesting that their applicability to present conditions was limited. The political scientist J. Allen Smith and the historian Charles A. Beard sought to show that the Constitution, far from being the embodiment of timeless political wisdom, had from its inception been designed to protect the vested interests of a wealthy minority.
For most people, however, the moral arguments were the crucial ones. No less than the upholders of laissez-faire, advocates of reform appealed to values derived from Christianity and the American democratic tradition. They maintained that the sort of behaviour encouraged and rewarded by the existing economic system was ruthless, selfish competitiveness of a kind completely antithetical to the teachings of the New Testament. They denied that the vast inequalities of income that were so striking a feature of their society could plausibly be attributed simply to differences in the character and capacities of individuals. “The princely fortunes which have come into existence during the past few years,” wrote Howe, “are not traceable to thrift, intelligence or foresight on the part of their owners any more than the widespread poverty of the masses of the people is due to the lack of these virtues on their part.”[6] By implying that no real equality of opportunity existed in contemporary America, this argument deprived the status quo of its moral justification while at the same time it opened the way for treating poverty as a general social problem rather than a series of individual cases.
‘Equal rights for all and special privileges for none’ was a very traditional American doctrine. The emphasis placed upon the principle of equality of opportunity by Progressives is one reason why some historians have seen their outlook as essentially backward-looking. Richard Hofstadter, for one, argued that Progressivism was in many ways closer to the entrepreneurial liberalism that he identified as Jacksonian Democracy than it was to the later New Deal, which possessed “a social-democratic tinge that had never before been present in American reform movements.” “Progressivism, at its heart, was an effort to realize familiar and traditional ideals under novel circumstances,” he wrote. “The ordinary American’s ideas of what political and economic life ought to be like had long since taken form under the conditions of a preponderantly rural society with a broad diffusion of property and power.”[7]
There is no doubt that Progressive writers were prone to compare the past favourably with the present in some respects. But they would certainly not have wished to forgo economic and technological progress. On the contrary, they were excited by its potential social benefits. “It is the increasing wealth of America,” argued Weyl, “which makes democracy possible and solvent, for democracy, like civilization, costs money.”[8] The trouble, as critics from Henry George to Herbert Croly emphasized, was that at present this increased wealth was distributed so unequally.[9] This desire for a more equitable distribution of income was, indeed, a second defining characteristic of Progressivism. But neither industrialization nor urbanization was to be deplored in itself—even someone as conscious of urban problems as Howe saw the city as “the hope of democracy”.[10] In other words, it was “the broad diffusion of property and power” rather than the “preponderantly rural society” that seemed attractive about the past, and, as their name implied, Progressives were confident that the future could be better still.
The concern to limit economic inequalities was a feature of what might be called the liberal or democratic side of American political debate from the days of Thomas Jefferson to those of Franklin D. Roosevelt. What did change was the belief that this and the other elements of a properly democratic society could be achieved through the free working of a competitive system. This conviction was hard to sustain in the context of a large-scale urban and industrial economy, particularly one dominated by large corporations. The trusts appeared to threaten the assumptions of traditional American liberalism in a number of ways. Their economic power seemed to exempt them from the discipline of the market and leave them in a position to exploit their suppliers, their customers and their employees. Their vast capital resources, and ability if necessary to sustain temporary losses, enabled them to keep new competition from the field and hence to close opportunities to aspiring entrepreneurs. This had disturbing implications not only for social mobility but for political democracy itself. Americans had traditionally believed that popular government required an independent citizenry, and the prospect of a society divided between corporate magnates and hired employees hardly constituted the republican ideal. Beyond this, there was a simple fear of the power of money in politics. “I do not expect to see monopoly restrain itself,” declared Woodrow Wilson in 1912. “If there are men in this country big enough to own the government of the United States, they are going to own it.”[11]
For these reasons, the rise of the trusts constituted a crisis for American liberalism. Progressive writers agreed that the situation called for some action but differed as to precisely what. The fundamental issue was one of diagnosis. Was monopoly or oligopoly the natural result of technological progress and economies of scale? Or were the trusts artificial creations, maintained through essentially unfair competitive methods by those with privileged access to capital and legal protection? Those who took the latter view sought to restore competition through the enforcement and strengthening of the Sherman Anti-Trust Act of 1890, which had outlawed “every contract, combination in the form of trust or otherwise, or conspiracy in restraint of trade.” A number of them would supplement this programme with a reform of the currency and credit system (so as to open the doors of opportunity and weaken the hold of powerful interlocking financial groups), and some went so far as to call for government ownership of natural resources and natural monopolies (including in some cases the railroads) on the grounds that private monopolies in such fields could impair competition in other spheres through discriminatory behaviour. But at the heart of this outlook was a Jeffersonian suspicion of concentrated power, which included a suspicion of over-strong government. This was generally the position that Woodrow Wilson, much influenced by Louis Brandeis, adopted in his 1912 campaign, and it is conveniently described by his campaign slogan that year, “the New Freedom”. In that same election Theodore Roosevelt advocated an alternative approach which he called “the New Nationalism”. Like Croly and Charles Van Hise, Roosevelt argued that attempts to break up the trusts would be both futile and economically damaging; rather, the government should regulate their activities, through a commission or other agency, to prevent them engaging in unfair business practices or exploiting their customers. The New Nationalist variant of Progressivism called for an enhanced role for government on a continuous basis, and was more naturally sympathetic to proposals for social-welfare legislation than the New Freedom approach.
The difference between these two approaches was to be of significance in the history of American liberalism into the New Deal era and beyond.[12] But its divisive effect should not be exaggerated. Only the most theoretically inclined of Progressives cleaved consistently to one or the other position. Many seemed to favour both regulation and anti-trust prosecutions. In reality few were prepared to risk the economic consequences of a thoroughgoing attempt to put the New Freedom into practice, as the course of Wilson’s own administration was to show. On the other hand, not many more were prepared completely to renounce the rhetoric of anti-trust. This ambivalence, and the apparent abstruseness of the arguments in 1912 over the relative merits of ‘regulated competition’ and ‘regulated monopoly’, led several to agree with William Allen White that “between the New Nationalism and the New Freedom was that fantastic imaginary gulf that always has existed between Tweedledum and Tweedledee.”[13]
White’s witticism implied that, whatever their differences on the trust question, Progressives shared a common political philosophy. At the centre of this philosophy, as it was expressed in the writings of Progressive publicists, was a commitment to democracy. The core meaning of democracy was government by the people, and to make this more real and effective Progressives proposed a whole range of constitutional and political reforms—the direct election of United States Senators by the voters (rather than by the state legislatures), primary elections to choose the party candidates in local, state and even presidential contests, the secret ballot, women’s suffrage, the initiative, referendum and recall (devices by which the voters could directly make or repeal laws and eject elected officials at any time), and a constitutional amendment to make it easier to amend the Constitution. These measures were advocated on a variety of grounds. The first was principle—democracy was the most morally legitimate form of government. It was also superior on utilitarian grounds. Though few believed public opinion to be infallible (apart from anything else, it could be ‘corrupted’ by newspapers), Lincoln Steffens spoke for many when he declared that “in the long run the people will go right more surely than any individual or set of individuals.”[14] Moreover, responsibility for decision-making would itself provide a moral education for the citizenry. “The democratic theory,” explained Steffens, “is founded on the expectation that self-government, by its very abuses, will tend gradually to develop in all men such a concern for the common good that human nature will become intelligent and considerate of others.”[15] Progressive writers were generally confident that the transfer of power from the party bosses to ‘the people’ would open the way to the other reforms they sought.
However, some of the political reforms urged by Progressive publicists would have the effect of diminishing, not enlarging, the part played by the people in government. This was true, for example, of ‘the short ballot’—the name given to proposals for reducing the number of officials who were popularly elected. Such proposals were defended on the grounds that ‘the long ballot’ symbolized a spurious democracy, since voters could not realistically be expected to make informed judgements on the qualities and performance of a whole host of minor office-holders and in practice tended simply to support a party ticket. True public accountability would be better secured by giving elected chief executives the authority to run the whole administration, and hence the capacity to implement the programmes they campaigned on. This argument reflected a general respect for the qualities of professional administrators and experts as well as a desire for strong executive leadership. Some historians have seen these traits as reflecting an elitism that discredits the Progressives’ democratic pretensions.[16] It may be observed, though, that neither a faith in bureaucracy and technical expertise nor the hope that charismatic leadership would inspire the people to rise above their petty, selfish concerns and establish social justice has been confined to that generation of American liberals.
Democracy, however, was much more for most Progressive writers than simply a form of government. “A real and not merely a formal democracy does not content itself with the mere right to vote,” declared Weyl, but demands “a new social spirit” of cooperation and altruism.[17] This aspect of Progressivism was frequently expressed in language reminiscent of evangelical revivalism—not least by William Allen White who, despite his nationwide fame, chose always to remain the editor of a small-town Kansas newspaper. “The problem of democracy is at base the problem of individual self-sacrifice coming from individual good will,” he explained. “The struggle between democracy and aristocracy in America is in every man’s heart.” White himself saw Progressivism as a sort of secular Great Awakening. “We were a money-mad nation,” he wrote remorsefully. “In the soul of the people there is a conviction of their past unrighteousness.”[18] White’s style was very much his own, but writers as different as Jane Addams and Herbert Croly also saw democratic ethics as more or less synonymous with Christian love.[19]
This point of view naturally emphasized ‘the common good’ or ‘the general welfare’ rather than individual rights, and it was from this perspective that Progressives approached social questions. It led to a certain ambivalence in their attitudes towards the organized labour movement, despite their commitment to greater economic equality. During strikes the sympathies of most were more likely to be with the workers than the employers, but there was a general sense that trade unions too were selfish, sectional interests. The American Federation of Labor, in particular, was condemned by some for its seeming indifference to the plight of the great mass of workers who were unorganized, its reluctance to engage in radical political action, and the indifference or hostility of its leaders, particularly Samuel Gompers, to social welfare legislation. For these reasons, some Progressive writers even found the quasi-revolutionary Industrial Workers of the World (‘Wobblies’) more sympathetic. For the most part, however, the answer to the problem of labour relations was held to lie in “industrial democracy”—a vague term which could mean anything from the establishment of collective bargaining to some form of workers’ representation or co-partnership.
It was, however, primarily through action by government that Progressives sought to create their social democracy or “cooperative commonwealth”. They advocated a wide range of measures: the abolition of child labour; the regulation of the hours and conditions of work; minimum-wage laws (especially for women); compulsory insurance against accidents, unemployment, sickness and old age; codes of standards for housing; and various improvements in education. By the nature of the federal system, most of these were matters for legislation at the state level—though the campaign to abolish child labour came to focus on the need for national action. Benjamin P. De Witt, the first historian of the “Progressive Movement” as well as a participant in it, emphasized that, although it might be overshadowed by “such preliminary measures as the initiative, referendum, recall, direct primaries and others”, in reality “the social phase of the progressive movement in the state is by far the most important—as much more important than the other phases as the end is more important than the means.”[20] To meet the costs of these measures (including funds for the effective enforcement of restrictive laws), and to attack the problem of maldistribution of wealth from the other end, Progressives advocated the imposition of direct taxes on incomes and inheritances.
In this demand for social-welfare legislation to be financed by increased direct taxation, as in their call for the extension of government control over economic life, Progressive publicists were adopting a political position essentially similar to that of their ‘new liberal’ and even ‘social democrat’ contemporaries in Europe. There were, to be sure, some distinctively American aspects to their thought—the anxiety over the social and political consequences of the rise of big business, the concern with finding mechanisms for making public opinion more widely and immediately effective in politics, and, on the other hand, rather less interest in the potentialities of public ownership. But several of the specific programmes—particularly in such fields as social insurance and municipal ownership of public utilities—were frankly derived from European precedents, and the basic commitment to using the power of government to achieve a less unequal distribution of wealth was the same. This constituted the core of Progressivism.
It would, however, be an odd view of the American political process that attributed its outcomes simply to the activities of social critics and journalists. In order to assess how far, and for what reasons, the aspirations of Progressive publicists were realized in practice, we must analyse further the various social forces that contributed to the pressure for reform.
3. Pressures for Reform
i. The Humanitarian Impulse: The Social Gospel and Social Work
None of the forces that contributed to the pressure for reform was more directly expressive of the democratic and Christian ideals invoked by Progressive publicists than middle-class efforts to improve the conditions of life of the poor, particularly in the cities. Many of those who took part in these efforts were Protestant ministers and laymen, but others lacked a confident belief in God while generally retaining a commitment to Christian ethics. The major expression of this humanitarian impulse was voluntary social work of one kind or another, but it also came to have a not insignificant effect on politics and legislation. A few of those involved in this movement were drawn directly into politics—quite a number of local Socialist candidates at this time were ministers—but far more contributed to the educational and lobbying activities of pressure groups like the National Child Labor Committee, the American Association for Labor Legislation, the Committee of One Hundred on National Health, and so on.
The late nineteenth and early twentieth century saw the rise within American Protestantism of the ‘Social Gospel’. The essence of this was the adoption of a progressive approach to economic and political questions, together with a greater emphasis upon the social role of the church in this world. Most Protestant clergymen in the mid-nineteenth century had accepted the conservative orthodoxy that the prevailing economic system rewarded the morally deserving. By contrast, adherents of the Social Gospel expressed an aversion to the ethics of unrestrained capitalism, particularly in the field of labour relations. There was, however, considerable variation in the degree of their radicalism. Some did not go much beyond appealing to employers to heed ‘the golden rule’, but many supported proposals for social legislation and some of the most famous, including W.D.P. Bliss, George D. Herron and Walter Rauschenbusch, were self-proclaimed Socialists.
The Social Gospel was probably always a minority movement in American Protestantism. Its greatest strength was in those denominations, such as the Unitarians, Episcopalians and Congregationalists, that were theologically liberal and appealed to the urban middle class. It made much less headway among Methodists and Baptists, churches which had a large rural membership and an allegiance to fundamentalist theology. But by the early twentieth century its adherents occupied strategic positions in some of the famous urban pulpits, the religious press, and, perhaps above all, theological seminaries. They were particularly active in interdenominational movements, such as the Federal Council of Churches, which was founded in 1908 and at its first meeting adopted a manifesto, “The Church and Modern Industry”, calling for welfare legislation and the strengthening of trade unions.
One leading historian of the Social Gospel movement attributes its rise, above all, to the disquiet created by the bitter labour conflicts of the late nineteenth century.[21] It was clearly also a product of conditions in the cities, not least the problem of ‘the unchurched masses’. The fact that the urban working class was composed overwhelmingly of infidels and Catholics led the Protestant denominations to intensify their evangelical and philanthropic activities in the 1880s and 1890s through such means as Home Missions, Institutional Churches (where the facilities included club rooms, libraries, gymnasia, baths, etc.) and the Y.M.C.A. Experience in the slums and with welfare work led many young ministers to see the causes of poverty in a new light. The writings of English Christian Socialists such as F.D. Maurice and Charles Kingsley may have had some influence, and so may have the growing appeal of a more ‘modernist’ theology and the more optimistic view of human nature associated with it. However, a concern with the reform of this world was no novelty for American clergymen, as the history of abolitionism and the temperance movement attests, and the rise of the Social Gospel probably owed more to social than to intellectual changes.
The Social Gospel naturally fostered an interest in social work, which at this time also attracted a number of young people without any firm theological commitment. The late nineteenth and early twentieth century was a transitional period in the transmutation of the age-old tradition of charity into modern, professionalized social work, mostly under government auspices. There were two major developments in the later nineteenth century, both of which originated in England. The first was the Charity Organization Society movement which sought to make philanthropy more efficient by co-ordinating the activities of different organizations, avoiding duplication, and, above all, finding out more about the recipients so as to discriminate the worthy cases, who would profit from help, from the unworthy, who would not. This enterprise involved the gathering by volunteer visitors of a great deal of information about individuals, which gradually led them to a more general view of the social causes of poverty. This in turn produced, in the slightly chilling language of ‘scientific’ philanthropy, a shift of emphasis from ‘correction’ to ‘prevention’—that is, to a desire to promote such social reforms as an improvement in housing conditions.
The second major development in social work was to give a much greater impetus to social reform. This was the establishment in the poorer parts of cities of settlement houses where middle-class residents could live amongst, and seek to help, the largely immigrant populations. The original model was Toynbee Hall in the East End of London but many more such settlements were eventually established in the United States than in Britain. By 1910 there were over 400 and, although the majority of these were denominational and not very different in character from Missions or Institutional Churches, the largest and best-known settlements, such as Hull House in Chicago, Henry Street in New York and South End House in Boston, were non-sectarian in character. It was these that made the most significant contributions to social reform.
They did so in a number of ways. The initial involvement was usually prompted by the need to secure some improvement in local municipal services from street-cleaning to the public schools, or some specific reform like the establishment of special courts for juvenile delinquents. This led on to agitation at the state or national level for such measures as the abolition of child labour. Leading settlement workers, like Jane Addams, Lillian Wald and Florence Kelley, became active in a wide variety of reform organizations from the Women’s Trade Union League to the National Association for the Advancement of Colored People. In 1912 Jane Addams and a few others were drawn into a national political campaign when Roosevelt’s Progressive Party adopted an advanced programme of social reform along the lines they had been demanding. The settlements also assisted the cause of reform in less direct ways. Their residents were active in the movement to gather more comprehensive and systematic information about urban conditions. The most ambitious such enterprise was the Pittsburgh Survey, published in six volumes between 1909 and 1914, which drew a good deal upon the efforts of settlement workers. Research of this kind provided ammunition for reformers. Some settlements, like Hull House, helped to foster trade-union activity, especially among women workers. Not least important was their influence upon the individuals who worked or lived in them, even if only briefly. A notably high proportion of those active in reform causes in the first half of the twentieth century had an association at some time in their lives with a social settlement, where they had the opportunity not only to acquire a closer acquaintance with urban poverty but to experience a sense of camaraderie among people trying to do something about it.
The primary reason for the growth of the sort of concern with social conditions exemplified by the Social Gospel and the settlement movement was, of course, those conditions themselves. The impulse to alleviate them owed something to fear—fear of epidemics and fires as well as of social conflict and radicalism. Social workers received many of their funds from wealthy businessmen. But it seems likely that most of the educated young people who constituted the overwhelming majority of settlement-house residents were more idealistic than anxious. Their shock at the poverty they encountered in the slums, and their refusal to accept it as inevitable, were themselves a reflection of the nation’s economic advance. Before the industrial revolution life had been cheap and a fatalistic attitude to suffering natural, Walter Weyl observed, “but to-day our surplus has made us as sensitive to misery, preventable death, sickness, hunger and deprivation as is a photographic plate to light.”[22] In addition, as Jane Addams herself pointed out, the social settlements answered to a “subjective necessity” on the part of many of their residents, particularly the young women college graduates who, lacking a clear social role, felt useless and “shut off from the common labor by which they live and which is a great source of moral and physical health.”[23]
ii. The Quest for Efficiency
‘Efficiency’ was a vogue word in early twentieth-century America. An ‘Efficiency Society’ in New York attracted the support of several worthy public figures, while enterprising promoters set themselves up as efficiency experts and answered readers’ letters in magazines. The high-priest of this movement was undoubtedly Frederick W. Taylor, the pioneer of ‘scientific management’. Taylor had acquired his fame by achieving some spectacular increases in the output of manual workers in manufacturing industry through the use of time-and-motion studies and careful attention to the exact kind of tools best suited to each task. But he found a receptive audience when he wrote in 1911 that his principles “applied with equal force to all social activities; to the management of our homes; the management of our farms; the management of the business of our tradesmen, . . . of our churches, our philanthropic institutions, our universities and our government departments.”[24]
This concern with efficiency can be seen as an aspect of the process of modernization, an adaptation to the demands of large, complex organizations such as the modern corporation or the modern city. It was associated not only with the increasing role of salaried managers and technical experts in business, but with the movement in the older professions towards more exacting standards and greater emphasis upon formal qualifications, and the parallel growth in numbers and influence of professional organizations like the American Medical Association and state and local bar associations. Specialized expertise, preferably formally attested and given greater weight by some form of professional body, could be a source of status and self-respect even to those who did not enjoy the dignity of self-employment. For by far the fastest growing economic group in America at this time was ‘the new middle class’ of salaried employees—their numbers rose from 756,000 to 5,609,000 between 1870 and 1910.[25] Many of these were salesmen and clerks, but the class also included a good number of natural evangelists for the gospel of efficiency, such as engineers, technicians, architects, public-health experts, educational administrators and so on.
The part played by such people in promoting some of the legislative and political reforms of the early twentieth century has been stressed by historians like Samuel P. Hays and Robert H. Wiebe.[26] At the federal level, the Pure Food and Drug Act of 1906, for example, owed a very great deal to the efforts of Dr. Harvey Wiley, Chief Chemist in the Department of Agriculture. Similarly, the attempts by the national government, particularly during Roosevelt’s Presidency, to secure the conservation of natural resources originated in the bureaucracy with scientifically-minded men like the forester Gifford Pinchot and the geologist W J McGee. Their primary object, as Hays has emphasized, was neither the preservation of the natural wilderness nor the democratization of access to its resources but the prevention of waste. Such techniques as fire control and sustained-yield forestry were designed to achieve the most efficient exploitation of the environment, and the conservationists’ approach was more likely to be understood by large corporations than by small farmers.
In a broadly similar fashion, municipal-government reform seems in many cases to have been promoted by business and professional men seeking greater efficiency in the provision of civic services, and a better environment for themselves and their families. This was particularly true of the movement to replace the whole structure of mayor and council by some form of commission or city-manager system of government. This originated in 1900 when a hurricane and tidal wave in Galveston, Texas, overwhelmed not only the city but the capacities of its council. In response to an appeal from leading local property-owners, the state legislature appointed five commissioners to take over the town government. In 1903 the commissioners were made elective, and in this form the system was copied by more than 400 cities before the First World War. The modified version of the scheme by which the commissioners appointed a professional city-manager to run the day-to-day administration was pioneered by Dayton, Ohio, in 1913—also in the wake of a great flood. The city-manager plan had been adopted by more than 130 cities by 1919. These new forms of government were explicitly advocated on the grounds that they would be run on business lines, and in almost every case, as James Weinstein has shown, the campaign for their adoption was led by the local Chamber of Commerce and other organized business groups.[27] The administrations established in this way were not doctrinaire followers of laissez-faire; indeed, sometimes, as at Dayton, they extended the scope of municipal ownership. But their overriding goals were efficiency and economy, and the desire to keep taxes down generally shaped both their own labour relations and their attitude to social reform. Although the adoption of commission and city-manager systems was largely limited to small and medium-sized cities in the Middle West, New England and the Pacific states, municipal reform movements in the larger cities were, according to Hays, similarly dominated by business and professional groups seeking to extend to government “the process of rationalization and systemization inherent in modern science and technology.”[28] This involved efforts to reduce the power of ward politicians, who were often of lower-class origin, in favour of prominent businessmen and professional administrators. Key demands in this respect were for a shift from ward to city-wide election of councils and school boards, and for the removal of local elections from the sphere of party politics.
To Hays, municipal reform, like conservation, reveals the contrast between the rhetoric and the reality of Progressivism. In both cases, he argues, a self-proclaimed fight on behalf of ‘the people’ was actually an attempt to remove the decision-making process from an arena responsive to grass-roots pressure and place it in the hands of an elite.[29] While there is undoubtedly some validity in this perspective, it seems fair to point out that the issues were in some ways more complicated. To Progressives, as we have seen, democracy involved the idea of the primacy of the common good as well as that of government by the people. The localized character of political representation, traditional for legislatures at all levels in the United States, often tended to give more weight to particular interests than to the general welfare. “Conservation,” Gifford Pinchot declared, “means the greatest good to the greatest number for the longest time.”[30] To Pinchot and others, the need to safeguard the interests of future generations and to balance the competing claims on, for example, the use of water for power, navigation, irrigation and recreation, would be best served if policy were determined by qualified and disinterested administrators responsible to a nationally elected executive rather than left either to the market-place or to the jockeying of various lobbies in Congress.
More generally, the gospel of efficiency and Progressive ideology should be seen neither as incompatible nor as identical. The organizing committee of the New York Efficiency Society included prominent Progressive publicists, social workers and politicians. Men like Brandeis, Croly and Lippmann wrote warmly of scientific management in particular as well as of efficiency in general. This reflected both their respect for professional expertise and their faith that increasing productivity would ease the achievement of a more just and harmonious society. Approaching from the other direction, Pinchot naturally adopted the language of Progressivism when in 1908-9 he began to campaign for public support for conservation measures that were being obstructed by Congress. “No other policy now before the American people is so thoroughly democratic in its essence and in its tendencies as the Conservation policy,” he maintained, as he denounced “the uncontrolled monopoly of the natural resources which yield the necessaries of life.”[31] On the other hand, the goal of efficiency appealed to many who in no way shared the Progressive desire for a more equal distribution of wealth. Several of those involved in the municipal-reform movement were no more committed to this objective than were the businessmen who adopted the techniques of scientific management in the hope of making bigger profits. There were both progressive and conservative versions of the gospel of efficiency.
iii. The Upholding of Traditional Values
If the process of ‘modernization’, with its emphasis on the pragmatic virtues of rationalization and its tendency towards bureaucracy, contributed to the pressure for reform in the early twentieth century, so too did an almost antithetical impulse. This was the desire to protect, and if necessary re-capture, what were seen as being the values of an earlier age. This backward-looking element in Progressivism, to which Mowry and Hofstadter have drawn attention, was neither humanitarian nor technocratic in inspiration. Rather it appealed to traditional moral standards that found sanction both in the ethics of Protestantism and the ideal of a virtuous republic. It sought to purge politics of corruption, business of dishonesty, and, if possible, private life of sin, in an effort to revive the ideals of democratic citizenship and combat the ‘materialism’ of recent times.
This was the sort of concern most apparent in the crusades of the first decade of the twentieth century. It was reflected in the early muckraking articles which, according to the editorial in McClure’s which is generally taken to mark the beginning of the movement, constituted “such an arraignment of American character as should make every one of us stop and think.”[32] The same traditional moralism inspired Charles Evans Hughes’ exposure of the malpractices of New York life insurance companies in 1905-6, and the spectacular graft prosecutions which made the reputations of such Progressive political leaders as Joseph W. Folk in Missouri and Hiram W. Johnson in California.
Richard Hofstadter linked this moralistic aspect of Progressivism with the fact that many independent businessmen, lawyers and clergymen were prominent in reform movements. “Progressivism,” he suggested, “was to a very considerable extent led by men who suffered from the events of their time not through a shrinkage in their means but through the changed pattern in the distribution of deference and power.” The dominant classes of an earlier age—“the old gentry, merchants of long standing, the small manufacturers, the established professional men”—were now being overshadowed by “the newly rich, the grandiosely or corruptly rich, the masters of great corporations.”[33]
This “status revolution” thesis of Hofstadter’s has been much criticized, partly perhaps because it has been taken to claim more than he seems to have intended. Several studies have shown that the social characteristics of Progressive leaders were not readily distinguishable from other politicians at the time, and that their ranks included, for example, a fair number of self-made businessmen.[34] Some have questioned whether there was a status revolution at all. It has been pointed out that many well-established mercantile families maintained their fortunes in the industrial revolution, and that few millionaires actually started life in ‘rags’. But Hofstadter did not dispute these facts,[35] and there is much literary evidence to confirm his insight that a sort of snobbish resentment of business magnates helped to create a sympathy for reform on the part of some members of an older elite.
However, as Hofstadter himself pointed out, in the late nineteenth century such feeling had found expression in reform movements (like civil-service reform) which were ‘Mugwump’ rather than Progressive in character. These attracted respectable gentlemen who combined austere disapproval of business crudities and governmental corruption with strict adherence to the principles of laissez-faire conservatism on social and economic questions.
There was, therefore, nothing new in the kind of reform movement which consisted in ‘turning the rascals out’ (and, if possible, putting them in the penitentiary) and installing ‘good’ men in their stead. The tradition stretched back at least as far as the campaign against the Tweed Ring in New York City in the late 1860s, and had been carried through the later nineteenth century by ‘Good Government’ clubs, Municipal Reform Leagues and so on. The common pattern was for such reformers to win elections occasionally, following the revelation of some particularly flagrant scandals, but to hold office only briefly. Neither their attachment to the principles of government economy and appointment on merit, which curtailed their ability to exploit the potentialities of patronage, nor their tendency to enforce ‘blue laws’ regulating drinking, gambling, sports and prostitution, was conducive to sustained electoral success. This tradition was reasonably compatible with the businesslike drive for greater efficiency in municipal government, and some of the reform administrations of the early twentieth century, such as those of James D. Phelan (1897-1902) in San Francisco and Seth Low (1901-03) in New York City, reflected both.
While municipal reformers of this type were almost invariably native-born, middle-class or upper middle-class Protestants, the urban political machines which they opposed were generally controlled by Irish bosses and drew most of their support from lower-class voters of immigrant stock. Hofstadter has provided a vivid picture of the contrast between the “types of political culture” represented by “the Yankee reformer” and “the peasant immigrant”. Whereas for “the Yankee”, democratic government was “an arena for the realization of moral principles of broad application”, the immigrant looked to politics for “concrete and personal gains and … sought these gains through personal relationships.”[36] It was not surprising in these circumstances that some municipal reformers tended simply to attribute the corruption of politics to the influence of “an ignorant proletariat, mostly foreign born”,[37] and that there was some support for proposals to restrict the suffrage by requiring citizenship, educational, or even property qualifications. In practice, outright disfranchisement was largely confined to the Southern states (where, of course, Negroes were the chief victims), but some of the same effect seems to have been achieved by the imposition of requirements for personal registration by voters.[38]
There can be little doubt, in fact, that a good deal of the dynamism behind efforts to maintain traditional standards and mores derived from nativist reactions to the changing composition of the American population. This was most directly apparent in the campaign to restrict immigration. The initiative in this movement came from the Immigration Restriction League, which had been founded in the 1890s by a group of Bostonians from well-to-do, well-established families. They advocated a literacy test, barring immigrants who could not read a short passage in any language. This proposal gained increasing support in Congress until in 1912 and 1914 bills enacting it were passed, only to be vetoed by Presidents Taft and Wilson. The alignment in Congress on this issue was a sectional one, with “the South and Far West almost unanimous for restriction, the urban areas of the North strongly against it, and considerable opposition lingering in the old immigrant districts of the Midwest.”[39]
An almost identical alignment was to emerge in the struggle over the national prohibition of alcohol. Although this was not achieved until 1919 in the immediate aftermath of the First World War, the first decade of the twentieth century saw a considerable extension of local ‘dry’ laws, particularly in the West and South. Of course, neither Prohibition nor immigration restriction involved a conflict between the interests of different geographical regions in the way some disputes over the tariff or internal improvements have done. More detailed analysis of some of the many contests in various states over the drink question in this period clearly indicates that the division on this issue was not so much directly sectional, or even between rural and urban America, as it was one of class and ethno-religious background.[40] Prohibition, as studies of the politics of several Midwestern states in the later nineteenth century have amply demonstrated, was pre-eminently a question that pitted Americans of a ‘pietistic’ religious commitment, whether of old-stock, British or Scandinavian origin, against those of a ‘liturgical’ faith, notably Catholics and German Lutherans.[41]
Both Prohibition and immigration restriction have been seen as Progressive causes.[42] It is true that both represented a departure from the principles of laissez-faire, and each enjoyed the support of a good number of social workers and other reformers. However, these ‘reforms’ also appealed to many whose views on economic issues were far from Progressive. The Anti-Saloon League, founded in 1895, deliberately abandoned the broad reform programme of the Prohibition Party in order to attract more support for the single ‘American Issue’. The chief proponents in Congress of a literacy test for immigrants were Henry Cabot Lodge and his son-in-law Augustus P. Gardner, both conservative Republicans, and John Higham has observed that most of the New England patricians who advocated immigration restriction “worshipped tradition in a deeply conservative spirit.”[43]
Nor would it be right to interpret the whole concern to revive traditional values as no more than a sublimation of nativist prejudice. The ethnic issue was not at all involved in the anti-trust movement, which owed a good deal of its broad appeal to moral objections to the business practices of the tycoons and nostalgia for a competitive order that was thought to have fostered economic opportunity, social mobility and a self-reliant citizenry. Nor were all municipal reformers content to blame corruption upon immigrants. “The ‘foreign element’ excuse is one of the most hypocritical lies that save us from the clear sight of ourselves,” wrote Steffens sternly in The Shame of the Cities.[44] Similarly, the restriction of immigration could be supported on other than racist grounds. The American Federation of Labor came out in favour of a literacy test in 1897, and began to campaign hard for it after 1906, on the grounds that unrestricted entry kept wages low. This argument appealed to some Progressive reformers. On the other hand, a concern to Americanize the immigrants, for a mixture of humanitarian and prudential reasons, was shared by many—notably, the social worker Frances Kellor and the various organizations she animated—who did not join the campaign for restriction. With regard to Prohibition, scientific evidence about the harmful physiological effects of alcohol, and social research into the connections between drinking, crime, vice and poverty, provided arguments for it that had no direct connection with a desire to maintain the hegemony of Puritan values. It was even supported by a few Catholic reformers.[45]
However, there were some, such as the sociologist Edward A. Ross, the settlement leader Robert A. Woods, and the journalist William Allen White, who combined a commitment to Progressive reform with nativist, not to say racist, views. The intellectual link between these attitudes was provided by an ‘Anglo-Saxon’ interpretation of the American democratic tradition. “What is this universal movement in our cities for home rule but the old race call for the rule of the folk?” asked White. “Here in the United States we have two things which have made the Teuton strong in this earth: the home with the mother never out of caste and the rule of the folk by the ‘most ancient ways’—the supremacy of the majority.”[46] But it would seem that even this sort of commitment to democratic values served to preserve one from the extremes of nativist intolerance. When in 1924 the Ku Klux Klan was waxing strong in Kansas, White broke his life-long abstention from direct participation in politics to run as an independent, anti-Klan candidate for governor, because “to make a case against a birthplace, a religion, or a race is wicked, un-American and cowardly.”[47]
iv The role of business interests
The sort of moral crusade exemplified by the Prohibition movement is sometimes interpreted in terms of the desire of a social group, such as old-stock Protestants, to affirm or enhance their standing in the community by having their own values formally legitimated. This is, however, a rather subtle form of self-interest, and indeed of the drive for social status, by comparison with a simple desire to improve one’s material well-being. This latter motive commonly plays a fairly large part in politics, and it would seem from the work of such historians as Wiebe, Hays and Kolko that business interests contributed a good deal to the pressures for reform in the early twentieth century, particularly in regard to economic regulation both at the state and the federal level.
To see the economic self-interest of businessmen as a significant force for reform is, of course, substantially to revise the earliest accounts of ‘the Progressive movement’, which portrayed it as an uprising of ‘the people’ against the entrenched power of business interests. The most thoroughgoing variant of this revisionism is that advanced by such New Left writers as Kolko and James Weinstein, who see the most important reforms of the period as representing a successful effort by big business to utilize the authority of the federal government to stabilize its dominant position. According to Kolko, corporate leaders had two immediate motives in seeking the establishment of federal regulatory agencies, which they confidently expected to be able to control. The first was to establish order within their industries by limiting competition. Kolko argues that many of the large amalgamations created in the late nineteenth and early twentieth centuries were over-capitalized, inefficient and vulnerable to competition. For example, while U.S. Steel had produced 61.6 per cent of the nation’s steel output when it was created in 1901, by 1920 the figure was only 39.9 per cent. In 1899 Standard Oil refined 90 per cent of the nation’s oil; in 1911, when the Company was divided by a Supreme Court order, the proportion was down to 80 per cent. Other trusts, like International Harvester, American Telephone and Telegraph, the Amalgamated Copper Company, and the ‘Big Six’ meat packers, were also suffering a declining share of the market in the early twentieth century. The second motive of corporate leaders in seeking federal regulation was to obviate the threat of state laws that were likely to be both inconveniently varied and more subject to radical, popular pressures; this applied, for example, in the case of the railroads.[48] Beyond these specific objectives, Weinstein in particular argues, businessmen sought “the stabilization, rationalization and continued expansion of the existing political economy, and subsumed under that, the circumscription of the Socialist movement with its ill-formed, but nevertheless dangerous ideas for an alternative form of social organization.” To this end they endorsed and adapted the ideology of Progressivism, seeking a “new corporate order” in which business would be both licensed and regulated by nonpolitical federal agencies, labour would be both represented and disciplined by responsible unions, and social problems would be contained by various welfare measures. This outlook was promoted by the National Civic Federation which brought together such corporate leaders as Elbert Gary and George Perkins, conservative labour leaders like Gompers and John Mitchell, and prominent civic leaders like Seth Low.[49]
There seems little doubt that the traditional view both of businessmen’s attitudes and of their influence stood in need of modification. As at other times in American history, the attitudes of businessmen to reform proposals in the Progressive era were shaped less by a dogmatic allegiance to laissez-faire than by assessments of the effects upon their own interests of particular measures. Nor were they by any means simply the passive objects of legislation. On the contrary, business organizations grew greatly in number, membership and resources in this period, and neither Congress nor state legislatures were by any means indifferent to their representations. However, the Kolko-Weinstein thesis also seems somewhat over-simple. In particular, it would be wrong to assume that opinion was united either among businessmen in general, or even among the leaders of large corporations. On the contrary, as Wiebe has shown, such issues as railroad regulation, banking reform and the tariff revealed several lines of conflict.[50] Some of these were sectional—hostility to Eastern corporations and bankers served as a unifying force in many Western and Southern states. But within regions and states, different groups also had conflicting interests. In Minnesota, for example, the business community of the Twin Cities and agrarian interests in the rest of the state both supported the ‘insurgent’ revolt against the Payne-Aldrich tariff bill of 1909, but the issue of a reciprocity agreement with Canada found them on different sides.[51] Nor was any one set of interests clearly dominant. In the case of railroad regulation, the various federal acts between 1903 and 1910 tended increasingly to favour the shippers as against the carriers, but after 1910 the shippers fell out among themselves and a reaction set in. The Federal Reserve Act of 1913 did not completely please any one of the different banking interests—the Wall Street magnates, the city bankers of the Middle West, or the country bankers.
The most central issue, that of the trusts, was by no means the least divisive. Small businessmen, organized in the National Association of Manufacturers, tended to favour enforcement of the Sherman Anti-Trust Act. The apparent activation of this act by Roosevelt, when he followed his prosecution of the Northern Securities Company in 1902 by setting up a Bureau of Corporations in the new Department of Commerce in 1903, produced divergent reactions on the part of corporate leaders. Standard Oil enlarged its legal department and sought to conceal evidence of its illegal railroad rebates. The Morgan-dominated corporations, U.S. Steel and International Harvester, on the other hand, reached ‘gentlemen’s agreements’ with the Administration, by which they would make confidential information available to the Bureau, on the explicit condition that their trade secrets would be protected, and with the implicit assumption that, if any of their practices were judged to be illegal, they would not be prosecuted for past violations. This approach reflected the views of Morgan executives, like George Perkins, who believed that large corporations would only receive the legitimation they needed if they accepted a degree of public accountability. In 1908 Perkins and other members of the National Civic Federation sponsored the Hepburn amendments to the Sherman Act. These would have allowed the Bureau of Corporations to grant immunity from prosecution after investigating the practices of particular companies, legalized railroad ‘pools’, and exempted labour unions from the Anti-Trust Act. The amendments were opposed by the National Association of Manufacturers and rejected by Congress—sufficient testimony in itself that the views of the National Civic Federation did not have complete political dominance in this period. Those who had backed the Hepburn amendments quietly supported the establishment of the Federal Trade Commission in 1914, but attempts to give the Commission authority to grant immunity from anti-trust prosecutions were again defeated, and, indeed, the Clayton Act of the same year was designed to strengthen and make more precise the anti-trust law. In the event the F.T.C. was to prove generally sympathetic to the interests of large corporations. But this reflected the character of appointments to it more than its terms of reference, and the eventual outcome should not obscure the complex interplay of pressures that lay behind its establishment.
If the importance of divisions within the business community must be recognized and the political dominance of large corporations not be assumed, so too the extent to which reform was the product simply of the pressures of different business interests must not be exaggerated. In the first place, even in the case of federal economic regulation, Congress was subject to other influences. For example, although the large meat-packing companies welcomed federal regulation, both in order to ease the entry of their products into European markets and to maintain the standards of smaller competitors, they objected strongly to certain features of Beveridge’s Meat Inspection bill of 1906. Pressure for such legislation stemmed in large part from the public outcry over conditions in packing plants following publication of Upton Sinclair’s novel The Jungle and the report of two special investigators to the President. The final law did go a long way towards meeting the objections of the packing companies, but not all the way.[52] Secondly, the involvement of organized business interests in the movement for reform was limited in scope. Economic regulation was naturally a prime concern, and they also promoted some types of municipal government reform and workmen’s compensation laws. But, on the whole, they either opposed or were indifferent to most kinds of social-welfare legislation and such political reforms as direct elections.[53]
v The urban working class
Businessmen are not the only people moved by self-interest, and it would seem natural to expect that support for social and political reforms too would be derived from those who would directly benefit from them. In the case of most forms of social legislation, these would be primarily wage-earners and city-dwellers. However, until recently most historians have not seen the urban working class as contributing much to the pressure for reform in the early twentieth century. There were several reasons for this. One was the assumption that the reforms stemmed from the activities of a single Progressive movement. The background of prominent reform leaders and the language in which they advocated change left little doubt that this movement was fundamentally middle class in character. At the same time, the political attitudes and habits of the various ethnic communities that composed the working population of the cities did not seem to offer much prospect of their providing effective support for reform. There was little evidence of class-consciousness. It is true that the Socialist Party of America reached its peak in these years. But at the national level this amounted to no more than the 6 per cent of the popular vote that Eugene Debs obtained in the presidential election of 1912, and in fact the party’s strength was largely limited to a few particular groups—the tenant farmers of the Southern Plains states, especially Oklahoma, the German trade unionists of Milwaukee, the Eastern European Jews in the New York garment industry, and the western lumbermen, metal miners and migrant labourers who constituted most of the membership of the I.W.W. The organized labour movement was weak, and generally cautious and conservative in its approach to politics and reform. In 1910 less than 6 per cent of the total labour force was unionized. The leadership of the A.F.L., notably Gompers, was not only extremely hostile to Socialism but generally reluctant to support any reform proposals other than those which would directly benefit trade unions in their organizational and bargaining activities. Politically, the lower classes in the cities were apparently divided between Republicans and Democrats along the same lines of ethno-cultural cleavage as other elements of the population. More significantly perhaps, they provided the votes for the urban political machines that had developed since the 1880s. These machines were traditionally regarded as the natural adversaries of reform and reformers.
However, the recent work of historians like J. Joseph Huthmacher and John D. Buenker has indicated that politicians associated with urban machines in fact contributed a great deal to the movement for social reform.[54] In the industrial states of the North-east and Middle West, laws regulating the working conditions of women and children, establishing minimum wages, and providing for workmen’s compensation received strong support from legislators representing urban working-class constituencies. Such representatives were also generally sympathetic to efforts by organized labour to secure the legalization of such practices as picketing and boycotting and the limitation of the use of injunctions in industrial disputes—though comparatively few of such measures gained sufficiently widespread backing to be enacted. They also favoured a more progressive taxation system, and helped to secure the ratification of the Sixteenth Amendment to the Constitution, which laid the basis for a federal income tax. Urban machine politicians also commonly called for more effective regulation of public utilities and other businesses in the interests of consumers, and in some cases came out for municipal ownership. These generally Progressive attitudes are linked by Buenker to the rise of “reform machines” in the early twentieth century, as the political bosses responded to the needs of their constituents and learnt from the electoral success of reform mayors such as Hazen Pingree of Detroit, Samuel ‘Golden Rule’ Jones of Toledo, and Tom L. Johnson of Cleveland in the 1890s and early 1900s.[55]
What is, at first glance, more startling than this support for social legislation is the favourable attitude adopted by urban machine politicians towards constitutional reforms often advocated on the grounds that they would reduce the power of political ‘bosses’. For example, the Seventeenth Amendment to the Constitution, which provided for the direct election of United States Senators, received the overwhelming support of urban legislators in the industrial states. This becomes less surprising, however, when one recalls that under the existing system Senators were chosen by the state legislatures, the apportionment of representation for which generally failed to give urban areas a weight corresponding to their population.[56] Since in these states the rural areas were generally Republican and the large cities Democratic, urban legislators also had a party motive for favouring a change that would enhance the influence of their constituents. The same sort of direct self-interest explains why urban legislators sought increased powers for city governments (known at the time as ‘home rule’) and legislative re-apportionment. Interestingly, many machine politicians also came to look favourably upon proposals for direct legislation and primary elections as they realized that these innovations would in practice place a greater premium on the sort of organization that could gather signatures and turn out voters. The same pragmatic attitude, however, tended to make machine politicians more ambivalent about other reform proposals, such as the extension of civil-service rules, corrupt-practices legislation and women’s suffrage.
Indeed, the contribution of the urban working class and its political representatives to reform must not be exaggerated. Buenker himself emphasizes that successful legislative achievement in this period depended upon the support of a coalition of different groups.[57] Nor can we assume that new-stock working-class voters were always Progressive in their attitudes to social and labour questions—this does not seem to have been the case with Italian-Americans in San Francisco, for example.[58] It is true, as Buenker points out, that the Democratic party enjoyed a considerable growth in strength between 1904 and 1916, and that this was largely a result of its gains in the North-eastern industrial states. While it is quite possible that this was due to an association between the Democratic party and Progressive reform, this has not yet been conclusively demonstrated.[59]
4. In Conclusion
i Was There a ‘Progressive Movement’?
The question of what sense, if any, it makes to speak of a ‘Progressive movement’ depends, of course, upon the relationship between the various concerns and constituencies that contributed to the pressure for reform. Clearly, particular issues could produce alliances between different groups. Thus social workers cooperated with urban machine politicians in promoting welfare legislation—a notable example of this being the New York State Factory Investigating Commission, set up after the Triangle Shirtwaist Factory fire of 1911 in which 146 workers, most of them young women, lost their lives. Less comfortably perhaps, old-stock traditionalists and the leaders of organized labour found themselves on the same side in the campaign for immigration restriction. But it needed more than a series of ad hoc coalitions to produce the idea held at the time that reform was the result of a ‘Progressive movement’.
This contemporary feeling was possible because impulses that seem quite distinct on analysis were often mixed up in practice. Thus, the attitudes of many middle-class, old-stock Americans towards immigrant communities involved both humanitarian and authoritarian elements. Advocates of a greater degree of government control over the economy believed this would promote both efficiency and social justice. Businessmen, seeking in their own interest more economical municipal government, would participate in crusades against political corruption with sincere moral indignation. Attacks on the railroads and other large corporations appealed both to the self-interest of shippers and other groups and to the ideological tradition of hostility to ‘monopoly power’.
Indeed, the combination of a variety of different appeals was characteristic of the style of Progressive writers and politicians. Thus Charles McCarthy, head of the Wisconsin Legislative Reference Bureau, could appeal to the gospel of efficiency, traditional American ideals and the enlightened self-interest of businessmen in promoting the cause of social reform:
What is the need of a philosophy or an ‘ism’ when there is obvious wrong to be righted? Whatever has been accomplished in Wisconsin seems to have been based upon this idea of making practice conform to the ideals of justice and right which have been inherited …. If certain social classes are forming among us, can we not destroy them by means of education and through hope and encouragement make every man more efficient so that the doors of opportunity may always be open before him?
If you were responsible for the business of government, would you not apply the common rules of efficiency, Mr. Business Man? Do you not believe that it would pay well to make a heavy investment in hope, health, happiness, and justice?[60]
Yet, despite the broad appeal of such pleas for reform, the diverse elements attracted to Progressivism constituted an unstable amalgam. In some cases at least, Progressivism as a political force at the city and state level seems to have changed, over time, both its character and its electoral base. In Detroit, for instance, Hazen Pingree, a self-made businessman and president of the Republican Michigan Club, was elected mayor in 1890 as a ‘good government’ candidate in the aftermath of grand jury revelations of corruption in municipal contracts. He began by instituting a number of economy measures and securing the indictment of corrupt school board members. However, he became engaged in a long contest with the street railway company and other public utilities for lower rates and better service, and also in an attempt to achieve a more equitable tax system. In addition, he refused to enforce liquor legislation strictly, and in the severe depression of 1893-95 sought energetically to aid the unemployed, most notably through his ‘potato patch’ plan. Pingree’s actions alienated many of his former business sponsors but greatly increased his popularity with foreign-born and working-class voters.[61] In California, two decades later, the Lincoln-Roosevelt League in the Republican party grew out of the Good Government movement in Los Angeles, which was backed by the Chamber of Commerce. The League’s gubernatorial candidate, Hiram Johnson, campaigned in 1910 largely on the single issue of the corrupt influence of the Southern Pacific railroad in the state’s politics. In the Republican primary that year, Johnson’s vote was highest in rural, native-stock, Protestant counties, where Prohibitionist, anti-alien and anti-labour sentiment was also strong. As governor, however, Johnson promoted such measures as workmen’s compensation, an eight-hour day for women, factory inspection and a child-labour law. By 1916, when he contested a Senatorial primary election, Johnson’s strongest support came from the San Francisco Bay area, and particularly from the heavily Catholic and immigrant working-class wards.[62]
In both these cases, reform movements that started as middle-class, business-oriented and moralistic came to be more concerned with social reform and working-class support. It is hard to say how common this development was. Nevertheless, although the ideology of Progressivism, with its insistence that traditional American values demanded a more equitable distribution of wealth, could appeal to these different interests, these examples demonstrate the difficulty of keeping them in harmony at the level of practical politics.
ii Was There a ‘Progressive Era’?
None of the elements that contributed to the pressure for reform in the early twentieth century was confined to that period. Antitrust sentiment could be traced back as far as the anti-monopolyism of the Jacksonian era, and it was to remain strong, particularly in the West and South, into the 1930s and beyond. The Sherman Act itself, of course, had been passed as early as 1890. The movement for social reform, involving both middle-class humanitarians and politicians drawn from the new-stock populations of the cities, developed into the urban liberalism that has been an influential force in American politics through most of this century. The attempt to uphold traditional American values in the face of the challenges presented as a result of urbanization and immigration was to become particularly strenuous in the 1920s, and to persist in some form at least as late as the presidential election of 1964. Similarly, the adaptation to the demands of a complex, industrial society—the acceptance of the need for organization, rationalization, professionalism and bureaucracy—has been a continuous social process from the late nineteenth century to the present day. In other words, these were all long-term developments in American history, while, of course, the political influence of special-interest groups was also not confined to this period.
But what did distinguish the early twentieth century was a public mood generally sympathetic to calls for reform. The language of successful political leaders like Theodore Roosevelt and Woodrow Wilson emphasized the need for action by government if traditional American ideals were to be preserved in the novel circumstances created by economic and social change. Such rhetoric not only helped to create the rather illusory sense of common purpose among reformers with different values and priorities, it also reflected and reinforced a climate of opinion generally receptive to innovation and experiment, one in which proposals for reform might be sympathetically regarded by a much wider circle than their direct sponsors and beneficiaries.[63] This was greatly in contrast with the mood of the 1890s, when the platform of the Populist Party, and even that of William Jennings Bryan in 1896, had seemed threateningly revolutionary to most middle-class Americans, including many who later became Progressives.
However, the period during which the American public seemed broadly sympathetic to proposals for political, economic and social reform was sharply limited in time. There is evidence that by 1914 a reaction had set in.[64] It is true that aspects of America’s experience in the First World War—the extension of government direction over the economy, the increased recognition of organized labour, and the general subordination of individual interests to a collective purpose—encouraged many Progressive reformers to believe that the transformation of American life they hoped for was at hand; but this only made their post-war disillusionment the more painful.[65] The dismantling of the machinery of wartime collectivism was swift and thorough,[66] and during the 1920s American businessmen enjoyed high prestige and the orthodoxies of laissez-faire were accorded renewed respect. In the 1930s, of course, calls for reform were again popular, but this is easily attributed to the Great Depression. The early twentieth century, by contrast, was a period of general prosperity. The question of why at this time there was apparently such widespread discontent and desire for reform, particularly among middle-class Americans, has exercised historians.
In a broad sense, of course, the occurrence of an era of reform is to be explained by the social and economic developments that created the problems which seemed to demand attention; it was a ‘response to industrialism’. But, as Hofstadter pointed out, this does not explain the difference between the 1890s and the 1900s: “indeed, in many ways the problems of American life were actually less acute after 1897.”[67] Some see a natural cycle in the waxing and waning of reform sentiment in the United States. According to Charles B. Forcey, for instance, “each wave of reform has run its course at intervals of twenty years or so since the founding of the republic.”[68] The difficulties with this theory are that it tends to lump together very disparate phenomena—Jacksonian Democracy and the Liberal Republicans of 1872, for example—and that it does not provide much by way of explanation.
Although Hofstadter remarked that the occurrence of “the Progressive revolt” during a period of prosperity presented “a challenge to the historian”,[69] it seems quite likely that the change in public mood in fact followed from the recovery of the economy in the late 1890s. It is perhaps reasonable to assume that appeals to democratic idealism on behalf of reforms that would benefit others were more likely to arouse a positive response when people were feeling comparatively affluent and secure than when they were preoccupied with their personal economic problems and fearful that the social fabric was being torn apart—as many middle-class Americans had been in the mid-1890s. Certainly, some Progressives believed there was a connection between prosperity and public support for reform—as when they blamed the disappointing results of the 1914 congressional elections on the economic recession at the time.[70] The Panic of 1907, according to Wiebe, caused most businessmen to lose sympathy with “all unnecessary agitation”—by which they meant any reforms other than those from which they themselves would directly benefit.[71] In studying nativism, John Higham pointed out that it was aggravated by economic depression, and that prosperity tended to encourage more tolerant and generous attitudes.[72]
However, if Progressivism was fostered by the confidence bred of prosperity, the emphasis placed by so many advocates of reform on the danger of class conflict or revolution if their pleas were not heeded suggests that it also rested on a basis of anxiety. It seems as if the mood on which Progressivism depended for its broad appeal could be eroded by complacency as well as by panic. But in the early twentieth century complacency required either strong nerves or a short memory. For the events of the 1890s—the severe depression of 1893-95, the violent labour conflicts, the Populist revolt and the Bryan campaign—had combined to create what Hofstadter has called a “psychic crisis”. Frederick Jackson Turner had declared in 1893 that the disappearance of “the frontier” of settlement in the latest census marked the close of “the first period of American history”, and it seemed to many that, with the end of “free land”, the United States would no longer enjoy its happy exemption from the ills that beset other societies.[73]
Yet neither the breadth nor the depth of the sympathy for reform in the early twentieth century should be exaggerated. Throughout the period there remained many prepared to defend the status quo. Some occupied powerful positions, for example as Supreme Court justices. It was in 1905, in Lochner v. New York, that the Court struck down a state law limiting to ten the daily working hours for bakers on the ground that it infringed freedom of contract within the meaning of the due process clause of the Fourteenth Amendment. When Theodore Roosevelt made a case for the popular recall of state judicial decisions in a speech at Columbus, Ohio, in February 1912, he greatly strengthened the determination of the ‘Old Guard’ to deny him the Republican presidential nomination that year. The success of their fight for what Taft called “the retention of conservative government and conservative institutions” left them in virtually uncontested control of the party after Roosevelt’s supporters followed him into the ‘Bull Moose’ Progressive party.[74] With the Republican electoral revival from 1914 onwards, this increased the strength of conservative sentiment in Congress.
The power of conservatism in established institutions like the courts and the Republican party partly accounts for the limited nature of the Progressive achievement. Some problems, it is true, were ignored by most Progressive reformers themselves. This was notably the case with the race issue. But even on the issues which seemed central to Progressives, their accomplishments were not impressive. As far as one can tell from imperfect statistics, the distribution of income became more, not less, unequal between 1896 and the First World War.[75] The democratic process may have been revivified in some ways by the reforms of the period, but the proportion of the electorate who turned out to vote was lower than it had been in the later nineteenth century.[76] The welfare and labour laws of the period were generally rudimentary and ineffectively enforced, while those reformers who in 1916 saw compulsory health insurance as “the next great step in social legislation” had a long time to wait.[77] Nor did the America of the 1920s provide much evidence of the taming of corporate power.
These basic facts surely indicate that the early twentieth century was not dominated by those Progressive reformers who wished to direct the United States along a course broadly similar to that of European social democracy. Most Americans were never converted from their basic belief in the economic and moral virtues of free-enterprise capitalism. The social forces that were promoting various reforms were basically diverse and ultimately sought contradictory objectives—as was to be demonstrated in the 1920s and later. In other words, during the ‘Progressive Era’, as often at other times in American history, long-term developments were of more fundamental significance than the distinctive characteristics of a short period.
5. Guide to Further Reading
It is difficult to suggest introductory reading on the Progressive era since the best general accounts of the period are more concerned to provide an interpretative synthesis than a comprehensive narrative. This is true of both Samuel Hays’ Response to Industrialism, 1885-1914 (1957)[3] and Robert Wiebe’s The Search for Order, 1877-1920 (1967).[26] Wiebe’s book in particular is rich in original and suggestive insights, though to my mind the attempt to discern a single, if complex, theme in the multifarious events of the period is somewhat strained. But it would probably be best to read something else first. The volumes in the New American Nation series—Harold U. Faulkner, Politics, Reform and Expansion, 1890-1900 (New York: Harper and Row, 1959; paperback ed., 1963), George E. Mowry, The Era of Theodore Roosevelt and the Birth of Modern America, 1900-1912 (ibid., 1958; 1962), and Arthur S. Link, Woodrow Wilson and the Progressive Era, 1910-1917 (ibid., 1954; 1963)—are now a little dated but they still provide a very thorough account. Those with less time at their disposal should not despise textbooks, of which Arthur S. Link with William B. Catton, American Epoch: A History of the United States since the 1890s (3rd ed., New York: Knopf, 1967), is particularly authoritative on this period. Lewis L. Gould, ed., The Progressive Era (Syracuse, N.Y.: Syracuse UP, 1974), is a useful collection of essays.[78]
On Progressivism itself, Richard Hofstadter’s The Age of Reform (1955)[2] is still essential reading, so much more nuanced, rich and deeply knowledgeable than one would gather from some of the criticisms of it. Hofstadter’s “status revolution” hypothesis is the principal target of David Thelen’s “Social Tensions and the Origins of Progressivism” (1969),[34] which also provides an alternative explanation for the rise of Progressivism based largely on the case of Wisconsin. Peter G. Filene, “An Obituary for ‘The Progressive Movement’,” American Quarterly, 22 (1970), 20-34, provides a useful review of the literature as well as a salutary dose of nominalism. In my view, the best recent attempt to provide an integrated analysis of the whole phenomenon is Otis Graham’s The Great Campaigns (1971).[9]
On the ideology of Progressivism, Eric F. Goldman, Rendezvous with Destiny: A History of Modern American Reform (1952; rept., New York: Vintage-Knopf, 1956), still provides a lively and accessible introduction. The debate between the proponents and the critics of laissez-faire in the later nineteenth century is very fully recounted by Sidney Fine, Laissez Faire and the General Welfare State: A Study of Conflict in American Thought, 1865-1901 (Ann Arbor: Michigan UP, 1956). One aspect of this story is lucidly presented by Richard Hofstadter, Social Darwinism in American Thought (1944; rev. ed., Boston: Beacon Press, 1955). The social thought of the period is approached from a different perspective in Morton G. White, Social Thought in America: The Revolt Against Formalism (1952; rev. ed., Boston: Beacon Press, 1957). David W. Noble, The Paradox of Progressive Thought (Minneapolis: Minnesota UP, 1958), is a difficult book. Charles Forcey’s The Crossroads of Liberalism (1961)[68] is a model study of three Progressive publicists and the early years of the liberal weekly The New Republic. Samuel Haber emphasizes one particular theme in Efficiency and Uplift (1964)[16] and Christopher Lasch two or three in The New Radicalism in America, 1889-1963: The Intellectual as a Social Type (London: Chatto and Windus, 1966). One of the best studies of the diversity and complexity of the political attitudes of Progressives, although its basic methodology is open to some question, is Otis L. Graham Jr., An Encore for Reform: The Old Progressives and the New Deal (New York: Oxford UP, 1967).
The literature on the movement for social reform is relatively uncontentious. Robert H. Bremner, From the Depths: The Discovery of Poverty in the United States (New York: New York UP, 1956), is a lucid and comprehensive account. The Social Gospel movement was thoroughly studied some years ago in Charles H. Hopkins, The Rise of the Social Gospel in American Protestantism, 1865-1915 (New Haven: Yale UP, 1940), Aaron I. Abell, The Urban Impact on American Protestantism, 1865-1900 (Cambridge, Mass.: Harvard UP, 1943), and Henry F. May, Protestant Churches and Industrial America (1949).[21] An illuminating study of a particular reform in a particular city is Roy Lubove, The Progressives and the Slums: Tenement House Reform in New York City, 1890-1917 (Pittsburgh: Pittsburgh UP, 1962). The leading authority on the role of the social settlements is Allen F. Davis, who has contributed both a general account of their more politically significant activities in Spearheads for Reform: The Social Settlements and the Progressive Movement, 1890-1914 (New York: Oxford UP, 1967) and a biography of their most famous leader in American Heroine: The Life and Legend of Jane Addams (New York: Oxford UP, 1973). The broader development of social work is the subject of Roy Lubove, The Professional Altruist: The Emergence of Social Work as a Career, 1880-1930 (Cambridge, Mass.: Harvard UP, 1965).
By comparison, the economic and political reforms of the period have been subject to more sceptical and divergent interpretations in recent years. The New Left view is best expounded in Gabriel Kolko, Triumph of Conservatism (1963)[4] and Railroads and Regulation, 1877-1916 (1965; 2nd ed., New York: Norton, 1970), and in James Weinstein’s The Corporate Ideal in the Liberal State (1968).[49] Kolko’s account of the relationship between corporate leaders and the process of government economic regulation should be compared with that of Robert H. Wiebe, in Businessmen and Reform (1962)[50] and in “The House of Morgan and the Executive,” American Historical Review, 65 (1959), 49-60, and of John Braeman in “The Square Deal in Action” (1964).[52]
The so-called “organizational” interpretation of Progressivism is less implicitly conspiratorial than that of the New Left, but it too emphasizes the elitist aspect of many of the reforms of the time. Samuel P. Hays was the pioneer of this approach and it informs both his monograph, Conservation and the Gospel of Efficiency (1959),[26] and his essays, not only “The Politics of Reform in Municipal Government” (1964)[3] but also “The Social Analysis of American Political History, 1880-1920,” Political Science Quarterly, 80 (1965), 373-94, and “Political Parties and the Community-Society Continuum,” in William N. Chambers and Walter Dean Burnham, eds., The American Party Systems (New York: Oxford UP, 1967). Other accounts of some of the political reforms of the period from this point of view are James Weinstein, “Organized Business and the City Commission and Manager Movements” (1962),[27] and Walter Dean Burnham, Critical Elections and the Mainsprings of American Politics (1970).[38] A broader development of this approach is illustrated by Wiebe’s The Search for Order and articulated by Louis Galambos, “The Emerging Organizational Synthesis in Modern American History,” Business History Review, 44 (1970), 279-90, and Ellis W. Hawley, “The New Deal and Business,” in John Braeman, Robert H. Bremner and David Brody, eds., The New Deal: The National Level (Columbus: Ohio State UP, 1975).
More recently the role of the political representatives of the urban working class in promoting the reforms of the period has been stressed; Joseph Huthmacher’s article on urban liberalism[54] blazed the trail but it is John Buenker who has amassed most of the evidence, now largely collected in his Urban Liberalism and Progressive Reform (1973).[40]
The merits of these various interpretations have to be tested against the complexities of politics at the local, state and federal levels. Good studies of particular cities are Zane Miller, Boss Cox’s Cincinnati: Urban Politics in the Progressive Era (New York: Oxford UP, 1968), James B. Crooks, Politics and Progress: The Rise of Urban Progressivism in Baltimore, 1895-1911 (Baton Rouge: Louisiana State UP, 1968), and Melvin Holli’s Reform in Detroit (1969).[37] Martin J. Schiesl, The Politics of Efficiency: Municipal Administration and Reform in America, 1880-1920 (Berkeley: California UP, 1977), is a recent attempt to provide a synthetic interpretation of municipal reform.
Much of the best literature on Progressivism has been in the form of state studies. George Mowry’s The California Progressives (1951),[2] Richard M. Abrams, Conservatism in a Progressive Era: Massachusetts Politics, 1900-1912 (Cambridge, Mass.: Harvard UP, 1964), and David P. Thelen, The New Citizenship: Origins of Progressivism in Wisconsin, 1885-1900 (Columbia: Missouri UP, 1972), each in their turn gave rise to new interpretations of Progressivism. If less ambitious, Sheldon Hackney, Populism to Progressivism in Alabama (Princeton, N.J.: Princeton UP, 1969), Spencer C. Olin Jr., California’s Prodigal Sons: Hiram Johnson and the Progressives, 1911-1917 (Berkeley: California UP, 1968), Herbert Margulies’ Decline of the Progressive Movement in Wisconsin (1968),[64] and Carl Chrislock’s Progressive Era in Minnesota (1971)[51] are also very illuminating. Those of these studies that deal with the Middle West and the South suggest that there was much less continuity between Populism and Progressivism than might be gathered from Russel B. Nye, Midwestern Progressive Politics: A Historical Study of its Origins and Development, 1870-1958 (East Lansing: Michigan State UP, 1959), and C. Vann Woodward, Origins of the New South, 1877-1913 (Baton Rouge: Louisiana State UP, 1951), but these regional surveys are still useful, particularly in view of the role played by sectional animosities in the political insurgency of the period.
There has been less recent writing on national politics, and much of what there has been has been in the form of biographies, notably William H. Harbaugh, The Life and Times of Theodore Roosevelt (1961; rev. ed., New York: Oxford UP, 1975), David P. Thelen, Robert M. La Follette and the Insurgent Spirit (Boston: Little, Brown, 1976), and of course the early volumes of Arthur S. Link’s multi-volume Wilson (Princeton, N.J.: Princeton UP, 1947). John Morton Blum, The Republican Roosevelt (Cambridge, Mass.: Harvard UP, 1954), and James Holt, Congressional Insurgents and the Party System (1967),[74] are each distinguished by their brevity and tough-mindedness. By contrast with the later nineteenth century, there has been comparatively little analysis of electoral behaviour in the Progressive era. Apart from Michael P. Rogin and John L. Shover, Political Change in California (1970),[58] there is some material in Rogin’s The Intellectuals and McCarthy (1967)[60] and Roger E. Wyman, “Middle-Class Voters and Progressive Reform: The Conflict of Class and Culture,” American Political Science Review, 68 (1974), 488-504. This deficiency will no doubt soon be rectified, as indicated by some of the essays in Joel H. Silbey, Allan G. Bogue, and William H. Flanigan, eds., The History of American Electoral Behavior (Princeton, N.J.: Princeton UP, 1978).
6. Notes
- Frederic Austin Ogg, National Progress, 1907-1917 (New York: Harper and Bros, 1918), pp.xix-xx. Back
- Richard Hofstadter, The Age of Reform: From Bryan to F.D.R. (New York: Knopf-Vintage, 1955); George E. Mowry, The California Progressives (Berkeley: California UP, 1951). Back
- E.g., Samuel P. Hays, The Response to Industrialism, 1885-1914 (Chicago: Chicago UP, 1957), and “The Politics of Reform in Municipal Government in the Progressive Era,” Pacific Northwest Quarterly, 55 (1964), 157-69. Back
- Gabriel Kolko, The Triumph of Conservatism: A Reinterpretation of American History, 1900-1916 (New York: Free Press of Glencoe, 1963). Back
- E.g., Arthur Link: “Generally speaking, progressivism might be defined as the popular effort, which began convulsively in the 1890s and waxed and waned afterward to our own time, to insure the survival of democracy in the United States by the enlargement of governmental power to control and offset the power of private economic groups over the nation’s institutions and life.” Arthur S. Link, “What Happened to the Progressive Movement in the 1920’s?” American Historical Review, 64 (1959), 836. Back
- Frederic C. Howe, Privilege and Democracy (New York: Scribner’s, 1910), p.232. Back
- Hofstadter, pp.308, 215. Back
- Walter E. Weyl, The New Democracy: An Essay on Certain Political and Economic Tendencies in the United States (New York: Macmillan, 1912), p.191. Back
- In 1900 Andrew Carnegie earned 23 million dollars tax free, while girls at the Triangle Shirtwaist factory in New York had to work six days for their five dollars a week; 2% of the population owned 60% of the country’s wealth. See Otis L. Graham, The Great Campaigns: Reform and War in America, 1900-1928 (Englewood Cliffs, N.J.: Prentice-Hall, 1971), p.9. Back
- Frederic C. Howe, The City: The Hope of Democracy (New York: Scribner’s, 1905). Back
- Hofstadter, p.233. Back
- E.g., Ellis W. Hawley, The New Deal and the Problem of Monopoly: A Study in Economic Ambivalence (Princeton, N.J.: Princeton UP, 1966). Back
- William Allen White, Woodrow Wilson: The Man, his Times and his Task (Boston: Houghton Mifflin, 1924), p.264. Back
- Steffens to Tom L. Johnson, 1 Sept. 1909, in Ella Winter and Granville Hicks, eds., The Letters of Lincoln Steffens (New York: Harcourt Brace, 1938), 1, 223. Back
- Lincoln Steffens, Upbuilders (New York: Doubleday Page, 1909), p.278. Back
- E.g., Samuel Haber, Efficiency and Uplift: Scientific Management in the Progressive Era (Chicago: Chicago UP, 1964). Back
- Weyl, p.164. Back
- William Allen White, The Old Order Changeth: A View of American Democracy (New York: Macmillan, 1910), pp.30, 37, 28, 30. Back
- Christopher Lasch, ed., The Social Thought of Jane Addams (Indianapolis: Bobbs-Merrill, 1965), pp.21-22; Herbert Croly, Progressive Democracy (New York: Macmillan, 1914), p.427. Back
- Benjamin P. De Witt, The Progressive Movement: A Non-Partisan, Comprehensive Discussion of Current Tendencies in American Politics (New York: Macmillan, 1915), p.244. Back
- Henry F. May, Protestant Churches and Industrial America (New York: Harper and Row, 1949), p.111. Back
- Weyl, p.197. Back
- Jane Addams, “The Subjective Necessity for Social Settlements,” in Lasch, ed., Social Thought of Jane Addams, p.32. Back
- “The Principles of Scientific Management,” p.8, in Frederick W. Taylor, Scientific Management (rept., New York: Harper and Row, 1964). Back
- Hofstadter, Age of Reform, pp.215-16. Back
- Samuel P. Hays, Conservation and the Gospel of Efficiency (Cambridge, Mass.: Harvard UP, 1959); Robert H. Wiebe, The Search for Order, 1877-1920 (London: Macmillan, 1967). Back
- James Weinstein, “Organized Business and the City Commission and Manager Movements,” Journal of Southern History, 28 (1962), 166-82. Back
- Hays, “Politics of Reform,” pp.157-69. Back
- Hays, Conservation, pp.271-76, and “Politics of Reform,” p.167. Back
- Gifford Pinchot, The Fight for Conservation (New York: Doubleday Page, 1910), p.48. Back
- Ibid., pp.87-88. Back
- McClure’s Magazine, 20 (January 1903), 336. Back
- Hofstadter, pp. 135, 137, 143-64. Back
- Richard B. Sherman, “The Status Revolution and Massachusetts Progressive Leadership,” Political Science Quarterly, 78 (1963), 61-65; Jack Tager, “Progressives, Conservatives and the Theory of the Status Revolution,” Mid-America, 48 (1966), 162-75; William T. Kerr Jr., “The Progressives of Washington, 1910-12,” Pacific Northwest Quarterly, 55 (1964), 16-27; David P. Thelen, “Social Tensions and the Origins of Progressivism,” Journal of American History, 56 (1969), 323-41. Back
- Hofstadter, pp.145, 140. Back
- Ibid., pp.182-85. Back
- Frank G. Goodnow, quoted in Melvin G. Holli, Reform in Detroit: Hazen S. Pingree and Urban Politics (New York: Oxford UP, 1969), p.172. Back
- See Walter Dean Burnham, Critical Elections and the Mainsprings of American Politics (New York: Norton, 1970), Ch.4. Back
- John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (New York: Atheneum, 1963), p.191. According to one authority, more than a fifth of the immigrants who entered between 1900 and 1910 could neither read nor write their native languages. See Richard M. Abrams, ed., Issues of the Populist and Progressive Eras, 1892-1912 (New York: Harper and Row, 1969), p.246. Back
- E.g., John D. Buenker, Urban Liberalism and Progressive Reform (1973; New York: Norton, 1978), pp.186-97. Back
- E.g., Paul J. Kleppner, The Cross of Culture: A Social Analysis of Midwestern Politics, 1850-1900 (New York: Free Press of Glencoe, 1970); Richard Jensen, The Winning of the Midwest: Social and Political Conflict, 1888-1896 (Chicago and London: Chicago UP, 1971). Back
- Arthur S. Link, “What Happened to the Progressive Movement in the 1920’s?” 847-48; James H. Timberlake, Prohibition and the Progressive Movement (Cambridge, Mass.: Harvard UP, 1963), pp.1-3. Back
- Higham, p.139. Back
- Lincoln Steffens, The Shame of the Cities (New York: McClure, Phillips, 1904), pp.2-3. Back
- Timberlake, pp.30-32. Back
- White, The Old Order Changeth, pp.128, 197. Back
- The Autobiography of William Allen White (New York: Macmillan, 1946), p.630. Back
- Kolko, Triumph of Conservatism, pp.1-56, 77-78. Back
- James Weinstein, The Corporate Ideal in the Liberal State, 1900-1918 (Boston: Beacon Press, 1968), pp.ix-x, 253, passim. Back
- Robert H. Wiebe, Businessmen and Reform: A Study of the Progressive Movement (Cambridge, Mass.: Harvard UP, 1962). Back
- Carl H. Chrislock, The Progressive Era in Minnesota, 1899-1918 (St. Paul: Minnesota Historical Society, 1971), pp.26-29, 43-46. Back
- John Braeman, “The Square Deal in Action: A Case Study in the Growth of the National Police Power,” in John Braeman, Robert H. Bremner and Everett Walters, eds., Change and Continuity in Twentieth Century America (Columbus: Ohio State UP, 1964), pp.35-80. Back
- E.g., from 1902 the chief preoccupation of the National Association of Manufacturers was its anti-union Open Shop campaign. Back
- J. Joseph Huthmacher, “Urban Liberalism and the Age of Reform,” Mississippi Valley Historical Review, 49 (1962), 231-41; Buenker, Urban Liberalism and Progressive Reform. Back
- Buenker, pp.27-41. Back
- For example, in the lower house of the Connecticut General Assembly in 1910, some towns with populations of less than a thousand had the same number of representatives as cities of over 100,000. Ibid., pp.13-14. Back
- Ibid., pp.217-21. Back
- Michael Paul Rogin and John L. Shover, Political Change in California: Critical Elections and Social Movements, 1890-1966 (Westport, Conn.: Greenwood, 1970), p.76. Back
- Buenker, pp.222-23. This Democratic revival was noted in 1957 by Hays, who observed that its roots “remain an enigma”. Hays, Response to Industrialism, p.149. Back
- Charles McCarthy, The Wisconsin Idea (New York: Macmillan, 1912), pp.302-3, quoted in Michael Paul Rogin, The Intellectuals and McCarthy: The Radical Specter (Cambridge, Mass.: M.I.T. Press, 1967), pp.194-95. Back
- Holli, Reform in Detroit, passim. Back
- Michael P. Rogin, “Progressivism and the California Electorate,” Journal of American History, 55 (1968), 297-314; Rogin and Shover, Chs.2-3. Back
- For example, at this time “spokesmen for all groups within Minnesota adapted progressive rhetoric to the promotion of their particular interests. Precisely what policies deserved to be called progressive became a moot question, but nearly everyone claimed the label.” Chrislock, p.22. Back
- The elections of 1914 were generally regarded as a disaster not only for the Progressive Party but for the wider reform cause. For a good analysis of the situation in a leading progressive state, see Herbert F. Margulies, The Decline of the Progressive Movement in Wisconsin, 1890-1920 (Madison: State Historical Society of Wisconsin, 1968), pp.124-63. Back
- Allen F. Davis, “Welfare, Reform and World War I,” American Quarterly, 19 (1967), 516-33. Back
- See Burl Noggle, Into the Twenties: The United States from Armistice to Normalcy (Urbana: Illinois UP, 1974), pp.46-83. Back
- Hofstadter, Age of Reform, p.149. Back
- Charles B. Forcey, The Crossroads of Liberalism: Croly, Weyl, Lippman and the Progressive Era, 1900-25 (New York: Oxford UP, 1961), pp.xv-xvi. Back
- Hofstadter, pp.134-35. Back
- W.A. White to Theodore Roosevelt, 24 Nov. 1914 (W.A. White Papers, Library of Congress); Chester Rowell to Theodore Roosevelt, 6 Dec. 1914 (Chester Rowell Papers, University of California, Berkeley). Back
- Wiebe, Businessmen and Reform, pp.70-72. Back
- Higham, Strangers in the Land, e.g., pp.81, 106-07. Back
- Richard Hofstadter, “Cuba, the Philippines, and Manifest Destiny,” in his The Paranoid Style in American Politics and Other Essays (London: Jonathan Cape, 1966), pp.148-50. The importance of the depression of the 1890s in creating that unity among different types of reformers that distinguished the Progressive Era is emphasized by Thelen, “Social Tensions and the Origins of Progressivism,” pp.335-41. Back
- James Holt, Congressional Insurgents and the Party System, 1909-16 (Cambridge, Mass.: Harvard UP, 1967), pp.58-62. Back
- Graham, The Great Campaigns, pp.143-44. Back
- Burnham, Critical Elections, pp.84-90. Back
- Forrest A. Walker, “Compulsory Health Insurance: ‘The Next Great Step in Social Legislation’,” Journal of American History, 56 (1969), 290-3 04. Back
- Postscript: Some good and relatively concise accounts of the period have recently appeared: Richard M. Abrams, The Burdens of Progress, 1900-1929 (Glenview, Ill., 1978), Irwin and Debi Unger, The Vulnerable Years: The United States, 1896-1917 (New York, 1978), and John W. Chambers II, The Tyranny of Change: America in the Progressive Era, 1900-1917 (New York, 1980). See also John A. Gable, The Bull Moose Years: Theodore Roosevelt and the Progressive Party (Port Washington, N.Y., 1978).
Resources for American Studies: Issue 59, 2006
Contents
- Foreword
- Contributors
- The Presidential Library: From Early Development towards a Definition (Part 1)
- The Presidential Library: From Early Development towards a Definition (Part 2)
- Benjamin Franklin and Public History: Restoring Benjamin Franklin House
- Exhibiting Franklin
- Whitman: “A poet given to compulsive self-revision”
- Readers’ Writes: Ch-ch-changes — a Bibliophile’s Path through Higher Education Resources
- Read All About It: Free online Newspaper sources for Nineteenth-Century
- American Sheet Music
- Useful Web Resources
- Oxford Dictionary of National Biography
- American Legislative Intelligence
Foreword
As the cover suggests, we are happy to mark the tercentenary of the birth of one of America’s founding fathers (and printer, book collector and founder of the Library Company), with the inclusion of two articles on Franklin’s life in London and his afterlife in public memory. These articles were presented at the 2006 BAAS conference at a panel partly organised by the Library and Resources Sub-Committee. We are also delighted to include a substantial article on the making of Presidential Libraries.
We are also pleased to report that the journal is now sent annually with the American Studies newsletter. Existing subscribers will continue to receive their copy under separate cover. We have tried to coordinate the mailing list, but there is likely to be some duplication so please let us know if you are receiving two copies. Similarly, if you know of an institution or individual who would like to receive a copy, let us know.
As ever, the journal welcomes submissions. If you would like to review a resource, whether in print or online, or have an article in mind on any aspect of resources for American Studies, then please contact us. Similarly, the committee would be pleased to be informed of any matters relating to library provision for American Studies.
Finally, thank you for your support, and we hope that you find something of value in this year’s journal.
Matthew Shaw
Editor, Resources for American Studies
americas@bl.uk
Contributors
Márcia Balisciano is the Founding Director of Benjamin Franklin House, London.
Dorian Hayes curates the Canadian and North American literature collections at the British Library.
Paul Jenks writes a monthly column for LLRX.com and is an Account Manager for GalleryWatch.
Jonathan Pearson is a lecturer in American history at the University of Durham.
Jean Petrovic is the bibliographic editor, Eccles Centre for American Studies at the British Library.
Lisa Rull is a Learning Support Tutor at the University of Nottingham.
Matthew Shaw is a curator in the Americas Collections at the British Library.
Donald Tait is subject librarian at Glasgow University Library.
Arnie Thomas keeps a close watch on events in the Beltway for GalleryWatch.
The Presidential Library: From Early Development towards a Definition (Part 1)
Jonathan Pearson, University of Durham
The Presidential Library is a unique institution in the United States and has attracted much comment since its inception. As a genre, it is caught at the centre of a network of issues inextricably linked to the evolution of the presidency, memorialisation, commemoration, preservation and American identity itself. Individually and as a group, the Libraries have been seen as contested spaces and reflect both public adoration and criticism of the presidency. They have been called, disparagingly, ‘America’s pyramids’[1], ‘Presidential Temples’[2] or ‘Presidential Palaces’[3], but also ‘Necessary Monuments’[4] or ‘Rewarding Institutions’.[5] Opinion continues to be divided over whether they are ‘Mines or Shrines’,[6] not least because they have become mausolea to their subjects[7] and provide a majority of the evidential base for presidential histories.
The apparent paradox of their popularity is explained by Richard Norton Smith, a former director of four presidential libraries and now the Director of the Abraham Lincoln Presidential Library and Museum: ‘They serve diverse audiences – that’s their glory and their weakness’.[8] Moreover, they are becoming the focus of increasing academic concern, largely with regard to their future.[9] To have any sense of that future, it is essential to consider both what they actually are and how they have derived from the melting pot of late nineteenth- and early twentieth-century American debates over the Presidency, heritage preservation and archival development. It is no coincidence, therefore, that the debate should find itself reopened each time the power and roles of the Presidency have been questioned. Vitriolic Congressional debate greeted the establishment of the first official Presidential Library, that of Franklin Roosevelt, followed by a protracted controversy over access to ‘his’ presidential records, although there was little published or public discontent. There was little negative reaction to the beginnings of the Truman (1957), Eisenhower (1962) and Hoover (1962) Libraries, which, enshrined in the legislation of the 1955 Presidential Libraries Act, demonstrated the development of a Presidential Library System. In fact, the first rumblings of public and private discontent began to emanate when the Imperial Presidents, led by Lyndon Johnson, began to carve their legacies out of stone. Johnson, intending to create the largest Presidential Library in the world, opened his ‘monolith’ in 1971. As one reporter remarked shortly after the opening: ‘It’s as though Lyndon Johnson were trying to pick up those five years in the White House with his bare hands and squeeze them into a shape that will make his history stand back in awe’.[10] The same reporter concluded that she would not ‘care to write a history of the Johnson years in that building, surrounded, overwhelmed by his words chiseled in granite and his deeds recorded in plastic display cases’.[11] Nevertheless, for all the negative associations, the Johnson Library has been ‘widely hailed for making virtually everything [presidential papers] available with heroic speed’.[12] The Library quickly became the most popular of its genre, with approximately 300,000 visitors within the first six months.[13] Between 1980 and 2005, the Johnson Library attracted a total of 8,201,706 visitors, some two million more than the Kennedy Library over the same period,[14] despite Johnson languishing in both public and academic post-presidential opinion polls.[15] From their inception up to 2005, the official Presidential Libraries had received approximately 67 million visitors.[16] To a large extent this interest reflects ‘a fundamental shift in the public’s values and priorities relative to museums,’ which has invoked ‘a change in the public’s perceptions of the role museums can play in their lives’, specifically within the last thirty years.[17] In 2001, Gallup revealed that nearly 53% of Americans visited a museum at least once a year.[18] Additionally, museologist Eilean Hooper-Greenhill has seen the period from the early 1980s to the early 1990s as one of ‘enormous changes in museums and galleries across the world’. These are represented, in her words, by a shift from museums being ‘static storehouses for artefacts’ to ‘learning environments for people’.[19] This would appear to be something of an oversimplification when looking at museums and museum bodies on an individual basis, but it reflected a generic change in the perception of museum roles by both public and professionals.
As a group, Presidential Libraries are unique because of their diverse roles and functions, serving as aging monuments to past presidents but also living memorials, with researchers continuing to flock to the archival deposits: they are simultaneously, to some extent, donor memorials, regional, national and international museums, preservers and conservers of collections, centres of education, visitor attractions, and research institutions. They continue to evolve. Any discussion of Presidential Libraries is immediately beset by the problem of definition, itself plagued by diversity and hence inconsistency. Confusion is immediately provided by the term ‘Library’. The name was originally chosen by the Executive Committee responsible for the papers and collections at what would become the Franklin D. Roosevelt Presidential Library. It was believed that this title would, in Benjamin Hufbauer’s words, be ‘less alien to the public’, given the unique nature of the establishment. In addition, it reflected the precedent set by the Hayes Memorial Library.[20] The institutions do contain libraries of books, often those belonging to their founders, but as Herman Kahn, the second Director of the Roosevelt Library, remarked, they ‘are at least as much archival depositories and museums as they are libraries.’[21] Some have seen the presidential library as an ‘information warehouse’.[22] But at what point do the library and the museum become distinct? The Libraries, at least theoretically, substantiate the exhibits and the potential interpretation. As such, the two elements complement each other. However, in practical terms, the obvious distinction is that while the public is allowed access to both parts of the Library, it predominantly visits the museum, leaving a much smaller group of specialists to use the research capacity of the archive. In fact, it was estimated in 1980 that ‘researchers and scholars [made] up fewer than one percent of the visitors’ to presidential libraries.[23] This gap has narrowed marginally with the increasing release of papers and the tailing off of visitation after the initial enthusiasm associated with the opening of each new site. Nevertheless, the relationship remains and is exacerbated by the fact that while the Libraries were originally founded to house the President’s collection, itself predominantly paper-based archival deposits, the first element of each Presidential Library to be opened, and the principal point of contact between the Library and the public, is still the museum.[24] There is the hope that the regular visits by scholars will maintain a historical balance in the museum displays. As Director of the Johnson Library, Betty Sue Flowers considered, ‘[h]aving historians around keeps us honest’.[25] Yet, in reality, as Kennedy historian Robert Dallek reflected on the Kennedy Library, ‘[t]he images are so powerful, so compelling – it’s hard even for me to be objective’.[26] This has been reinforced by the increasing professionalism of the museums, employing renowned designers such as Ralph Appelbaum Associates, and modern museological practice, to produce compelling displays.
As archives, their existence has also been bound up in the discussion over the preservation and ownership of presidential papers. As can be seen from the mission statements and museums of the Presidential Libraries, there is an apparent dichotomy for each institution between providing a high level of objectivity and maintaining the legacy of the particular President. This has promoted discussion, such as that from Paula Span[27] and Michael Kammen,[28] questioning the efficacy of having several repositories rather than one central presidential ‘archive’, the appropriateness of having a museum attached to the archive, and the extent to which the President’s materials are private or public property.[29] The question of a central repository for Federal records and materials had already aroused concern long before the creation of the Presidential Library system. Much of the material of former Presidents had either been lost or sold into private ownership. George Washington[30] and his seventeen immediate successors had each taken their material with them as private possessions. The heirs of Martin Van Buren, Franklin Pierce, Ulysses S. Grant and Millard Fillmore destroyed parts of theirs. Many of John Tyler’s records were burnt in the fire that swept through Richmond in 1865. Federal troops seized Zachary Taylor’s. Abraham Lincoln’s were kept from the public until 1948, and John Quincy Adams’s were kept behind the largely closed doors of the Massachusetts Historical Society.[31] In fact, the main bulk of the Adams Papers was only donated to the Society by the family in 1956. Additionally, some of Andrew Jackson’s disappeared when his home, the Hermitage, near Nashville, Tennessee, caught fire in 1934.[32] Such was the situation by the end of the nineteenth century that one White House secretary exclaimed, ‘until recently “hardly a scrap of paper was kept here to show what a President did or why he did it”’.[33] This situation was made worse by the fact that, because of the poor way the ‘preserved’ material had been stored, it proved very difficult to gain thorough access.[34] In fact, it was not until 1901 that the Library of Congress published ‘its first description of a manuscript collection,’ although it did focus upon the nation’s first President.[35]
In 1903, the Government moved several of the collections of Presidential Papers to the Library of Congress, promoting a policy of acquisition of such records.[36] Under Theodore Roosevelt’s executive order of March 9, the papers of George Washington, Thomas Jefferson, James Madison and James Monroe, along with those of other important figures, were transferred.[37] It was at this time that published reaction to the dangers faced by the ‘nation’s’ records first began to appear, particularly when it was realised that not even the originals of the Declaration of Independence or of the Constitution were adequately protected.[38] Remonstrations by William Howard Taft, The New York Times and the American Historical Association[39] finally brought action, but even after the Senate had appropriated $2,500,000 for the project, $500,000 of which could be used to begin work immediately,[40] no consensus existed to carry it through to completion.
Additionally, no legislation existed to compel Presidents to hand over their records and, in fact, after 1903, while ‘Theodore Roosevelt, William Howard Taft, and the widow of Woodrow Wilson complied… Herbert Hoover and the widows of Warren Harding and Calvin Coolidge made other plans’.[41] In fact, Warren Harding’s widow burnt some of her husband’s papers, specifically correspondence with ‘Ohio political cronies’.[42] This inconsistency persisted despite repeated calls for a national archives building.[43] However, the ‘first installment of the gift of the Theodore Roosevelt papers’ in 1917 was groundbreaking in that it was the first receipt of presidential papers ‘directly from a former president’.[44] The higher profile of the United States and the Presidency led to a substantial increase in the number of Presidential papers produced during each administration. In addition, Presidents, increasingly aware of their own historical value, both produced and kept more personal material, often to help in the writing of memoirs. Since the turn of the century the Library of Congress had been the ‘“National Library of the United States,”’[45] and its role represented a change in both Governmental and public attitudes towards heritage, as Ray Geselbracht remarked: ‘Americans were becoming protective of their memory’.[46] Significantly, this memory was embodied within the Presidency. By the late 1930s, papers from 22 of the previous 31 Presidents were held in the Library of Congress.[47] However, the Library was in no position to keep collecting and preserving Presidential materials, which were expanding voluminously, approaching two million pages of manuscript,[48] and the National Archives[49] was only ever intended to house federal records from administrative departments. Where were future presidential records to be kept? This question of storage and access was even more pertinent for other forms of material culture associated with the presidency. Possessions and gifts, for instance, which had predominantly been distributed between family and collectors, were now being kept by presidents. At the same time that American attitudes towards protecting presidential papers were changing, Government began to take more of an interest in historic preservation.
In 1916, Colonel Webb Cook Hayes opened the Hayes Memorial Library (later renamed the Rutherford B. Hayes Presidential Center).[50] The Library and Museum were intended to ‘focus on the President and his family’ but also the ‘Gilded Age’, reflecting both the interests of the founder and ‘the American public’.[51] The construction was financed largely by the Hayes family, with a contribution from the State of Ohio, and could be said to be the first Presidential Library in the United States and the only one to represent a nineteenth-century President.[52] However, despite this early lead, subsequent Presidents continued to destroy material or place it outside the general public’s domain, in historical societies and private collections.[53] In addition, even when the National Archives assumed control of the Presidential Libraries, on the gifting of the Roosevelt Library to the Government in 1941, it did not involve itself with overseeing the Hayes Presidential Center, whose associated land and holdings had been deeded over to the State of Ohio by the President’s son. In fact, many Presidential memorials remained under the control of individual states or were being run privately.
A further difficulty in attempting to generically discuss Presidential Libraries is caused by organisational structure. There are currently eleven official Presidential Libraries[54], each overseen by the United States Government via the Office of Presidential Libraries, a division of the National Archives and Records Administration (NARA).[55] As Richard Norton Smith has remarked, ‘[t]hey’re hybrids and I think that’s why people have trouble understanding or justifying them’.[56] However, despite the apparent continuities engendered by the organisational structure, each institution is individually operated, with varying management and interpretation. Of particular significance is that the Libraries are funding their own educational and, hence, outreach programs, as well as their own advancement. As a group, the Libraries tend to experience a minimum of external interference, despite the apparent oversight of NARA and the Office of Presidential Libraries.[57] This is particularly true for the attached museum branches, which remain virtually unmentioned in NARA’s latest strategic plan[58] and experience very little in the way of directive from central authority, relying on the personalities of the staff to direct their running and interpretation. Further complication in attempting to generically analyse the official Presidential Libraries is provided by the fact that the museums of these institutions are not members of the American Association of Museums (AAM) and, as such, are not registered museums fulfilling certain standardised criteria in terms of management and ethics. Despite these variations, the Presidential Libraries have been classified as a system, both by their overseer NARA and more formally by academics as a ‘quiescent policy subsystem’.[59]
Notes
[1] Historian Robert Caro cited in Paula Span, ‘Monumental Ambition’, in The Washington Post Magazine, February 17, 2002. See also Richard J. Cox, ‘America’s pyramids: Presidents and their libraries’, in Government Information Quarterly (Vol.19, 2002), 45-75. One of the first references appeared in Molly Ivins, ‘A Monumental Undertaking’, in The New York Times, August 9, 1971.
[2] Benjamin Hufbauer, Presidential Temples: How Memorials and Libraries Shape Public Memory (University Press of Kansas: Lawrence, Kansas, 2006).
[3] August Gribbin, ‘Presidential Palaces’, in Insight on the News (Vol. 16, No. 13, April 3, 2000).
[4] Cynthia J. Wolff, ‘Necessary Monuments: The Making of the Presidential Library System’, in Government Publications Review (Vol.16, No. 1, January/February 1989), 47-62.
[5] William H. Honan, ‘11 Ridiculed but Rewarding Institutions’, in The New York Times, November 7, 1997.
[6] Robert F. Worth, ‘Presidential Libraries: Mines or Shrines?’, in The New York Times, April 24, 2002.
[7] Of the eleven official Presidential Libraries, seven of the Presidents represented have died and five of those are buried at their Libraries: Hoover, Roosevelt, Truman, Eisenhower and Reagan. Kennedy is buried in Arlington National Cemetery, Arlington, Virginia and Lyndon Johnson is buried at the family cemetery near Stonewall, Texas. In addition, of the two unofficial Presidential Libraries, both Presidents Hayes and Nixon are buried ‘on-site’, while President Ford and George H.W. Bush plan to be interred at their respective facilities: Donald Holloway, Collections Manager, Gerald R. Ford Presidential Museum, in correspondence with the author, August 27, 2002 & Stephannie Oriabure, Archivist, George Bush Presidential Library, in correspondence with the author, August 28, 2002. It is unclear, at this stage, where Presidents Carter or Clinton intend to be laid to rest, although President Carter has already filed a plan for a state funeral.
[8] Norton Smith cited in Worth, ‘Presidential Libraries: Mines or Shrines’.
[9] See for example, Princeton University Center for Arts and Cultural Policy Studies, Museums in Presidential Libraries: A First Report on Policies, Practices and Performance (December 2004), Princeton University Center for Arts and Cultural Policy Studies, Presidential Libraries: A Background Paper on their Museums and their Public Programs (Princeton University, December 2004), or American Association for State and Local History, ‘Presidential Sites and Libraries Conference IV: The American Presidential Community’, June 19-22, 2006.
[10] Ivins, ‘A Monumental Undertaking’.
[11] Ibid.
[12] Worth, ‘Presidential Libraries: Mines or Shrines?’
[13] Gary Cartwright, ‘The L.B.J. Library: The Life and Times of Lyndon Johnson in Eight Full Stories’, in The New York Times, October 17, 1971.
[14] Figures synthesised from Curt Smith, Windows on the White House: The Story of the Presidential Libraries (Diamond Communications Inc.: South Bend, Indiana, 1997), 233, and statistics provided by the Office of Presidential Libraries; see Sharon Fawcett to the author, May 11, 2006. From its opening in 1971 through 2005, the Johnson Library had received 13,276,741 visitors. These were by far the highest visitor figures of all the Presidential Libraries and reflected an almost unbroken lead in yearly attendance.
[15] In 1990 Gallup recorded Johnson’s retrospective approval rating at 40%. This low standing was reflected in academic and public surveys rating the presidents, in which Johnson consistently ranked poorly. For Johnson’s ranking in academic and public polls, including Schlesinger Sr. (1948 & 1962), Murray-Blessing (1982), Schlesinger Jr. (1996) and C-Span (2000), see Bose, Meena, ‘Presidential Ratings: Lessons and Liabilities’, in White House Studies (Winter, 2003). See also Gallup Poll News Service, ‘Greatest U.S. President? Public Names Reagan, Clinton, Lincoln’, February 18, 2005.
[16] Figures synthesised from Smith (1997), op. cit., 233 and Sharon Fawcett to the author, May 11, 2006.
[17] John H. Falk & Lynn D. Dierking, Learning From Museums: Visitor Experiences and the Making of Meaning (Alta Mira: Walnut Creek, Lanham & Oxford, 2000), 2.
[18] Gallup Poll, Question: qn36C, ‘About how many times in the past year, if any, did you do each of the following? How about…Visit a museum?’ December 6, 2001-December 9, 2001.
[19] Eilean Hooper-Greenhill, Museums and their Visitors (Routledge: London & New York, 1996), 1. This reflected Cannon-Brookes’s opinion, first articulated in 1984, that ‘[t]he fundamental role of the museum, of assembling objects and maintaining them within a specific intellectual environment, emphasizes that museums are storehouses of knowledge as well as storehouses of objects’: Peter Cannon-Brookes, ‘The nature of museum collections’, in Thompson, John M.A., ed., Manual of Curatorship: A Guide to Museum Practice (Butterworth Heinemann: London, 1992), 501.
[20] Hufbauer (2001), op. cit., 181.
[21] Herman Kahn, ‘The Presidential Library – A New Institution’, in Special Libraries (Vol.50, March 1959).
[22] ‘On the Record: Clinton Library as Complex as His Presidential Legacy’, in The Washington Times, July 19, 2002.
[23] ‘The Spirit of Presidents Past’, in The New York Times, January 7, 1980.
[24] Further complication is also provided by the fact that the Gerald Ford Presidential Library and Museum, while a single entity online, is actually two separate institutions: the Library is at Ann Arbor, Michigan and the Museum is at Grand Rapids, Michigan.
[25] Worth, ‘Presidential Libraries: Mines or Shrines?’
[26] Span, ‘Monumental Ambition’.
[27] Ibid.
[28] Michael Kammen, Mystic Chords of Memory: The Transformation of Tradition in American Culture (Vintage Books: New York, 1993), 446.
[29] This is even in spite of the fact that the 1978 Presidential Records Act legalised the public ownership of Presidential records.
[30] Washington’s papers were later sold, in 1834, to the State Department for $55,000. In 1848, $20,000 was paid for part of the Jefferson Papers; $65,000 for Madison’s Papers; $20,000 for Monroe’s Papers; $18,000 for Andrew Jackson’s Papers.
[31] Smith, op. cit., 1-2. See also Geselbracht, Raymond H., ‘The Four Eras in the History of Presidential Papers’, in Presidential Papers (Spring, 1983), 37.
[32] Smith, op. cit., 1-2.
[33] Cited in Kammen, op. cit., 613.
[34] Ibid.
[35] John Y. Cole, Jefferson’s Legacy: A Brief History of the Library of Congress, http://www.loc.gov/loc/legacy/colls.html, viewed February 28, 2006.
[36] The building had been opened as a library for the new centre of American government in 1800 and then as an archive in 1897.
[37] Other papers included those of Benjamin Franklin and Alexander Hamilton.
[38] ‘NATION’S RARE DOCUMENTS UNPROTECTED AGAINST FIRE’, in The New York Times, May 28, 1911.
[39] William Howard Taft, 4th Annual Message, December 3, 1912 in Public Papers of the Presidents of the United States (United States Government Printing Office: Washington DC). ‘FIRE-THREATENED ARCHIVES’, in The New York Times, January 21, 1912. This appeal by the newspaper was being repeated in 1919, ‘Cellars and Attics for Archives: These and Rented Non-Fireproof Buildings House Many of the Most Valuable Records in Washington’, in The New York Times, May 4, 1919. The originals of the Declaration of Independence and the Constitution were transferred to the Library of Congress in 1921.
[40] ‘To Protect Government Records’, in The New York Times, January 21, 1923.
[41]Raymond Geselbracht & Timothy Walch, ‘The Presidential Libraries Act After 50 Years’, Prologue (Summer, 2005), 49.
[42] Neil Sheehan, ‘Historians Worried by Cutbacks In Access to Presidential Papers’, in The New York Times, June 13, 1970.
[43] In addition, at the end of 1922 Princeton University decided ‘to petition Congress for an appropriation for the construction of a national archives building in Washington’: The New York Times, December 7, 1922.
[44] Roosevelt to Herbert Putnam, December 5, 1916, Theodore Roosevelt Papers, Library of Congress, http://memory.loc.gov/ammem/trhtml/trfaid.html#II, viewed on March 1, 2006.
[45] Herbert Putnam to Roosevelt, October, 1901, Theodore Roosevelt Papers, Library of Congress, http://memory.loc.gov/ammem/trhtml/trfaid.html#II, viewed March 1, 2006.
[46] Geselbracht (1983), op. cit., 38.
[47] Ibid., 37-38. Currently the Library of Congress holds the records of 23 presidents, see Appendix A. For further discussion of the development of Presidential Papers, see Lloyd, David Demarest, ‘Presidential Papers and How They Grew’, in The Reporter (February 1, 1954), 31-34.
[48] Geselbracht (1983), op. cit., 38.
[49] The National Archives Building had been completed in 1933.
[50] Colonel Webb Cook Hayes was the second son of former President Rutherford B. Hayes.
[51] Roger D. Bridges, ‘Our Purpose and Direction’, in The Hayes Historical Journal: A Journal of the Gilded Age (Vol.10, No.2, Winter 1991), 6-7.
[52] Ibid.; Thomas A. Smith, ‘Creation of the Nation’s First Presidential Library and Museum: A Study in Cooperation’ & Culbertson, Thomas J., ‘The Hayes Presidential Center Library and Archives’, in The Hayes Historical Journal: A Journal of the Gilded Age (Vol.10, No.2, Winter 1991), 12-29 & 40-44.
[53] Smith, op. cit., 2.
[54] See Appendix B.
[55] For a discussion of the development of the National Archives, see McCoy, Donald R., The National Archives: America’s Ministry of Documents 1934-1958 (University of North Carolina Press: Chapel Hill, North Carolina, 1978).
[56] Norton Smith cited in Worth, ‘Presidential Libraries: Mines or Shrines?’
[57] NARA, Presidential Libraries Manual (NARA: Washington DC, 1985).
[58] NARA, Ready Access to Essential Evidence: The Strategic Plan of the National Archives and Records Administration (NARA: Washington DC, 1999; revised 2000 & 2003).
[59] Cochrane, Lyn Scott, The Presidential Library System: A quiescent policy subsystem (Ph.D., Virginia Polytechnic Institute and State University, 1999).
The Presidential Library: From Early Development towards a Definition (Part 2)
Officially, the presidential library system originated in 1941 when Franklin D. Roosevelt (FDR) opened his Presidential Library and Museum at his family home at Hyde Park in New York State. FDR crystallised public identification with the presidency and the nation, signifying for many historians and political commentators the origin of the modern presidency.[60] It was his intention to build a library and museum to ‘house the vast quantity of historical papers, books, and memorabilia he had accumulated during a lifetime of public service and private collecting’.[61] Practically, there was a need to find a repository for the Presidential papers. Roosevelt had generated far more than any other President, a situation exacerbated by his expansion of the Executive Office, and, in combination with his own avid collecting, he had acquired more material than he had personal room to house.[62] This predicament had been worsened by the enormous number of gifts given to the President, significantly greater than any previously. Such was the size of his personal collection that he exclaimed: ‘“Future historians will curse as well as praise me”’.[63] He believed in the importance of preserving the past[64] and expected the library to be a place to which ‘visitors could come and where he could work “preparing the collections during hours when the public was not admitted”’.[65] As such, he wanted to maintain a hands-on approach to the library’s running and, more specifically, access to the presidential papers. This was also indicated by, as Benjamin Hufbauer has observed, the attempt to make his long-time friend and adviser Harry Hopkins the library’s first director. While this was cut off at the pass by the Archivist of the United States, Robert D.W. Connor, FDR continued to try to control the library’s functions.[66] Connor publicly declared that ‘“Franklin D. Roosevelt is the nation’s answer to the historian’s prayer,”’[67] but FDR’s continued control undercut the populist element implicit within the creation of the museum, as well as the patriarchal legitimising of the American public’s ownership of the presidential papers and thus, to some extent, ownership of the Presidency through its symbolism of American tradition and values. Additionally, Connor privately recorded his concern that ‘“[t]he President still thinks of the Library as his property”’.[68] In a memorandum to the Director of the Library, FDR was firm that ‘“[b]efore any of my personal or confidential files are transferred to the Library… I wish to go through them and select those which are never to be made public”’. Provision was also made for censorship after his death, such that a committee consisting of Sam Rosenman, Harry Hopkins and Grace Tully ‘“or the survivors thereof”’ should perform the same function.[69] Despite having gifted his papers to the nation, FDR continued to affect access to them. While this did not cause immediate problems, by 1947 the memorandum had appeared in the press and the Library trustees had withheld documents from an investigating committee.[70] The situation had raised a serious issue with regard to presidential papers that the nation believed had been gifted. The problem appeared to have been laid to rest on July 21, 1947, when a court ruling ‘decided that the late President Roosevelt made a valid and effective gift to the United States Government of all his papers and files, including those in his possession at the time of his death’. Responsibility for controlling access was passed to the United States Archivist.[71] Nevertheless, this was a debate that would continue,[72] and Roosevelt’s papers would not be considered open until 1950,[73] even courting controversy over access again in 1969.[74]
By May 1949, within eight years of opening, Hyde Park had received over one million visitors.[75] Roosevelt’s perceived gift and the creation of the museum reflected a changing America. However, despite the development of cultural responsibility, the iconic nature of the president and the impact of the War, at this point, as one of Truman’s biographers noted, ‘the Franklin D. Roosevelt Library…might have served as a unique depository for the papers of a unique president had Truman not been determined to push ahead with his’.[76] Even as he was packing up to leave the White House, Truman was ‘making plans for his presidential library’.[77] However, he was also adamant that he did not want to start fundraising and putting the mechanics in place to build the Library until he had officially left office.[78] In fact, the two things he wanted most after leaving office were ‘to become a grandfather and to see his library established’.[79] The latter became the ‘focus of his life’. Part of this drive was gained from the ‘disappointment’ derived from writing his memoirs, but beyond this he had a vision for an educational establishment that would benefit the nation.[80] He hoped that the library would become a regional centre for the study of the presidency, a subject on which, he said of himself, he was ‘a nut’.[81] Truman acknowledged that ‘“I got the idea from Franklin Roosevelt, but he died before he could come and do it like this.”’[82] The Truman Library was not going to be the national shrine that FDR had created.[83] Truman ‘refused to allow paintings of himself or postcards with his likeness in the public part of the library’[84] and was emphatic that he did not want to be portrayed in Thomas Hart Benton’s mural in the Library’s lobby,[85] because he did not want the Library to become ‘the Truman Memorial Library’.[86] He also accepted that he could be proud of the Library ‘even though he knew he was its principal exhibit’.[87] A ‘Harry S. Truman Memorial Building’ had been envisioned by local community and municipal leaders as early as 1948/1949, but Truman persisted with his own vision for a more balanced library and museum, one that would give visitors ‘a better understanding of the history and the nature of the presidency and the government of the United States’. He had no intention of creating an ostentatious repository. Truman made sketches for the Library architects, proposing that it look like his grandfather Solomon Young’s house, ‘the big house of his own childhood memories’,[88] and vehemently rejected such ideas as enshrining his future gravesite within a ‘little chapel’.[89]
By July 1950, a non-profit corporation had been established to raise the necessary $1,750,000 for the library project.[90] The popularity of the Presidency was revealed by the individual donations, exceeding 17,000 in number, and by the diversity of the contributors in terms of background, geographical location and political allegiance.[91] Furthermore, the number of visitors to the Truman Library in its first eight years exceeded the one million that had visited the Roosevelt Library during the same number of years from its inception, each paying fifty cents.[92] Given that, by the end of July, Truman’s presidential approval rating had dropped as low as 39%, here was another indication that, whatever the public attitude towards an incumbent’s perceived political successes or failures, there was now a core that believed in the office of the presidency and was prepared to pay to commemorate it; favourability was more important in this process than approval.[93]
In planning the preservation of his papers while President, Truman had supported the 1949 Federal Property and Administrative Services Act, which provided authority to the National Archives ‘to accept and care for papers of present and future Presidents’.[94] In 1950 he also backed the Federal Records Act, further professionalising records management. In this way, all the critical elements of what we have come to expect from Presidential Libraries were coming together: the archives, the historical representation and education, popular support and an associated supporting non-profit foundation. They were formally fused by the passage of the Presidential Libraries Act of 1955, championed by former President Truman.[95] The Joint Resolution authorised the Government to extend the Federal Property and Administrative Services Act to accept, ‘in the name of the United States’, the land and/or buildings to house these collections from the donor. However, this was subject to securing, ‘so far as possible, the right [of Government] to have continuous and permanent possession of such materials’.[96] The 1955 legislation enshrined the purpose and endurance of the Libraries. As Truman had stipulated in his will, reaffirming the 1955 Act, he was bequeathing ‘to the United States of America all of my right, title and interest in, and possession of’ his papers.[97] Construction of the Library began in 1955 and the building was finally dedicated on July 6, 1957, when the presidential materials were ceremonially transferred to the Government with a pomp and circumstance that reflected the VIP guests’ belief in Truman’s vision.
Unlike FDR or, later, Eisenhower, ‘Truman’s image of the Presidency and of himself as President kept job and man distinct’, such that ‘never in his tenure does he seem to have conceived that he fulfilled the Presidency by being Harry Truman’. He lacked the subtleties of other Presidents but there was an honesty that was often ignored.[98] As such, he saw the need for a Presidential Library system and set up the legal parameters within which others could follow, not so much for his own glory but for that of the office. Additionally, Truman intended the Library to bring a federal resource closer to the people of the Mid-West.[99] Yet none of Truman’s actions guaranteed immediate access to his papers. While the same storm that had surrounded Roosevelt’s desire to censor access to his papers did not break, Truman continued to retain control over certain papers, notably those associated with foreign policy.[100] It was not until after his death in 1972, and that of his wife ten years later, that historians began to break into the history of the Truman administration, with notable releases by Robert Ferrell, Alonzo Hamby and David McCullough. Truman alleged that he did ‘not want any competition with his memoirs’, but there is little doubt that he was also concerned about how historians would perceive him. He even went as far as to refuse access to the ‘chief State Department historian’.[101] He need not have worried. Truman is now perceived as a ‘near-great’ by academics and public alike. But this was not Truman’s overriding concern for his Library. For the President, it was about the office. The Truman Library, like the president’s legacy, has matured and provides a largely objective interpretation that reflects the current historiographical position but also Truman’s focus upon education and the presidency as an institution. In developing, in his words, a ‘cultural center’,[102] he had suggested the importance and interconnection of the public, the presidency and the nation’s history.
The timing of the inception of the Presidential Library system predated the clearer development of a national commitment to the preservation and, even, construction of an American memory outlined by, among others, Michael Kammen. Public identification with the president and an evolving interest in the preservation of the nation’s history combined, such that national and Federal interests met in defining the nation through the filter of the Presidency, culminating with the opening of the Truman Library in 1957. From this point there was an increased commitment to national history, reflected in a series of legislative moves. At the same time that the Truman Library was being opened, Congress was debating the passage of an act ‘to organize and microfilm the papers of the Presidents of the United States in the collections of the Library of Congress… as a means of safeguarding them and in order to make them more widely available to researchers and students of American and world history’.[103] The legislation was finally passed on August 16, 1957, and with it an authorisation for $720,000 to facilitate its implementation.[104] The National Historic Preservation Act of 1966, which established the National Register of Historic Places, and the National Museums Act, also of 1966, were also central to this process.[105] Nevertheless, even though the National Museums Act acknowledged Congress’ acceptance that ‘“national recognition is necessary to insure that museum resources for preserving and interpreting the Nation’s heritage may be more fully utilized in the enrichment of public life in the individual community”’, Congressional action was still ad hoc and ‘fiscal conservatives…believed that such work was an “obligation of the local communities and states”’.[106] Further legislation in the 1970s appeared to reinforce Congressional commitment. The 1974 Presidential Recordings and Materials Preservation Act, while applying only to President Nixon’s materials, showed a concern for the possible loss of historical materials, the basis of the memory and historical construct. In the Act, the Federal Government defined which of the President’s materials were to be publicly owned and cared for on behalf of the nation by the National Archives and Records Service (NARS, the forerunner of NARA).[107] This attitude towards ownership went a stage further, again born out of the fears that another President might choose to destroy or refuse access to certain materials. Historians had already become ‘worried that legal and bureaucratic complications surrounding the preservation of Presidential papers [might] be crippling their efforts to analyze the world’s most powerful office’.[108] The Presidential Records Act of 1978 thus extended the 1974 legislation to pertain to any future President, officially affecting those from Reagan onwards. Further legislation does not seem to have helped. The amount of work required to process presidential papers in the wake of the 1978 Act, the Freedom of Information Act (1966-2000) and the subsequent executive orders on presidential records (1989, 1995, 2001 & 2003) has placed greater demands on an overstretched and depleted corps of archivists.[109] In particular, presidential records were now to be released within twelve years, or even faster if processed through a Freedom of Information request. Paradoxically, it is becoming increasingly difficult to make papers accessible to researchers, despite the apparent ability of researchers to request early release of materials.
Further restrictions on release and access have been imposed by President George W. Bush’s Executive Order 13233 of 2001. Whilst being called, somewhat disingenuously, ‘Further Implementation of the Presidential Records Act’, in historian Robert Watson’s words the order ‘permits the sitting president to deny the release of papers of a former president, even if that previous president authorizes the release of his papers’, thus reversing the previous legislation, which permitted ‘only the former president in question to weigh in on the accessibility of his presidential materials’. Additionally, the amendment ‘allows for the release of certain types of presidential papers and documents only if both the former and sitting presidents approve their release,’ making it ‘difficult for the public or scholars to obtain materials and locks away from public inspection potentially important documents, deemed sensitive by the incumbent president’.[110] This criticism was reinforced by Robert Putnam and Robert Spitzer in their ‘American Political Science Association Response to Executive Order 13233’.[111]
The Presidential Libraries continued to appear to some as, at best, anomalous and, at worst, antithetical. This was bound up in the ‘essential contest’, in the United States, shaping ‘the commemoration and interpretation of the past’.[112] The contest was being ‘waged between the advocates of centralized power and those who were unwilling to completely relinquish the autonomy of their small worlds’.[113] Hence the same reservations held by preservationists such as William Sumner Appleton and the New Englanders at the beginning of the Twentieth Century[114] were still prevalent, competing against the attitudes of those such as Theodore Roosevelt. Therefore, in terms of perception, Presidential Libraries appear to fall between two stools. For those advocating centralised control, the Libraries are seen as independent institutions that, because of this limited oversight, develop self-serving donor memorials with overly biased legacies. To those advocating decentralised control, they appear to be part of a Library system, centrally overseen by the Federal Government through NARA and the Office of Presidential Libraries. Other, unofficial Presidential Libraries have avoided some of this controversy by maintaining their autonomy, but this is increasingly difficult financially in a country that has been inundated with new museums.[115] However, the timing of its inception has also meant that the Presidential Library has become inextricably linked with the legacy of the nation, which is synonymous with the legacy of the President. In this way, it constructs and reaffirms an identification with the nation and the President through preserving and conserving the Presidential materials and making them as accessible as possible, within a cultural-historical context, to the widest audience through a variety of means dominated by specialist, educational and touristic programs. The Library is unavoidably political, but its parameters are being defined by sociological, museological and historical writing. When considered within the Library system, alongside the plethora of other multi-functional museums in the United States and, through the globalisation of media and audiences, the rest of the world, the Presidential Library continues pragmatically to reassert itself as a national archive and museum with a significant role to play for its various audiences in the Twenty-first century.
Appendix A
Presidential Papers: Location of the dominant collections
NB: The presidential papers held by the Library of Congress have been microfilmed and are available through many libraries and universities throughout the UK. Additionally, many of the presidential papers are currently being published, bringing together the collections listed below and other previously dispersed collections. As such, it is worth checking the details provided by the Scripps Library of the Miller Center of Public Affairs at http://millercenter.virginia.edu/scripps/reference/papers/
George Washington | Library of Congress
John Adams | Massachusetts Historical Society
Thomas Jefferson | Library of Congress
James Madison | Library of Congress
James Monroe | Library of Congress
John Quincy Adams | Massachusetts Historical Society
Andrew Jackson | Library of Congress
Martin Van Buren | Library of Congress
William H. Harrison | Library of Congress
John Tyler | Library of Congress
James K. Polk | Library of Congress
Zachary Taylor | Library of Congress
Millard Fillmore | Buffalo and Erie County Historical Society
Franklin Pierce | Library of Congress
James Buchanan | Historical Society of Pennsylvania
Abraham Lincoln | Library of Congress
Andrew Johnson | Library of Congress
Ulysses S. Grant | Library of Congress
Rutherford B. Hayes | Rutherford B. Hayes Presidential Center
James A. Garfield | Library of Congress
Chester A. Arthur | Library of Congress
Grover Cleveland | Library of Congress
Benjamin Harrison | Library of Congress
William McKinley | Library of Congress
Theodore Roosevelt | Library of Congress
William H. Taft | Library of Congress
Woodrow Wilson | Library of Congress
Warren Harding | Ohio Historical Society
Calvin Coolidge | Library of Congress
Herbert Hoover | Herbert Hoover Presidential Library
Franklin D. Roosevelt | Franklin D. Roosevelt Presidential Library
Harry S Truman | Harry S Truman Presidential Library
Dwight D. Eisenhower | Dwight D. Eisenhower Presidential Library
John F. Kennedy | John F. Kennedy Presidential Library
Lyndon B. Johnson | Lyndon Johnson Presidential Library
Richard M. Nixon | Nixon Materials Project, College Park, Maryland[116]
Gerald R. Ford | Gerald Ford Presidential Library
James E. Carter | James Carter Presidential Library
Ronald Reagan | Ronald Reagan Presidential Library
George H. W. Bush | George Bush Presidential Library
William J. Clinton | William J. Clinton Presidential Library
Appendix B
Official Presidential Libraries and Dates of Original Dedication
Franklin D. Roosevelt | June 30, 1941
Harry S Truman | July 6, 1957
Dwight D. Eisenhower | May 1, 1962
Herbert Hoover | August 10, 1962
Lyndon B. Johnson | May 22, 1971
John F. Kennedy | October 20, 1979
Gerald R. Ford | April 27, 1981
Jimmy Carter | October 1, 1986
Ronald Reagan | November 4, 1991
George H.W. Bush | November 6, 1997
William J. Clinton | November 18, 2004
[60] See for example: Pfiffner, James, The Modern Presidency (St. Martin’s Press: Boston, 2000), Davis, James W., The American presidency: a new perspective (Harper & Row: New York, 1987), Greenstein, Fred I., The presidential difference: leadership style from FDR to Clinton (Martin Kessler/Free Press: New York & London, 2000), Polsby, Nelson W., ed., The Modern Presidency (Random House: New York, 1973), Kernell, Samuel, Going public: new strategies of presidential leadership (CQ Press: Washington DC, 1986) or Neustadt, Richard E., Presidential Power and the Modern Presidents: The Politics of Leadership from Roosevelt to Reagan (The Free Press: New York, 1990).
[61] The Franklin D. Roosevelt Presidential Library and Museum, ‘The Franklin D. Roosevelt Library’, http://www.fdrlibrary.marist.edu/aboutl2.html.html, viewed March 5, 2006.
[62] Kahn (1959), op. cit.
[63] Ward, Geoffrey, ‘Future historians will curse as well as praise me’, in Smithsonian (December 1989), 62.
[64] See for example, ‘Remarks at the Dedication of the Franklin D. Roosevelt Library at Hyde Park, New York, June 30th, 1941’, in Public Papers of the Presidents of the United States (United States Government Printing Office: Washington, DC).
[65] McCoy (1975), op. cit., 138.
[66] Hufbauer (2001), op. cit., 183.
[67] Connor cited in The Franklin D. Roosevelt Presidential Library and Museum, ‘The Franklin D. Roosevelt Library’, http://www.fdrlibrary.marist.edu/aboutl2.html.html, viewed February 23, 2006.
[68] Connor Journal, June 30, 1941, cited in Hufbauer (2001), op. cit., 183.
[69] ‘FDR to Director of the Franklin D. Roosevelt Library, July 16, 1943’, Truman Papers, Truman Library, White House Central File, Official File, 158-d, Franklin D. Roosevelt Library – Museum.
[70] ‘Roosevelt’s “Personal” File Shut by His Memo in 1943’, in The New York Times, May 22, 1943. See also ‘Truman Will Get Roosevelt Papers’, in The New York Times, May 3, 1943; ‘Roosevelt Estate Bars Hunt in Files’, in The New York Times, May 9, 1943 & ‘Senators Get Part of Roosevelt File’, in The New York Times, May 26, 1943.
[71] ‘Roosevelt Files Are U.S. Property, Not Part of His Estate, Judge Rules’, in The New York Times, July 22, 1947.
[72] See for instance, Nevins, Allan, ‘The President’s Papers – Private or Public?’, in The New York Times, October 19, 1947 & ‘Letters: Presidential Papers’, in The New York Times, September 7, 1969.
[73] Parke, Richard H., ‘Roosevelt Papers Opened To The World at Hyde Park’, in The New York Times, March 18, 1950.
[74] Bates Leonard, et al. ‘Presidential Papers’, in The New York Times, September 7, 1969, Arthur Schlesinger Jr., ‘F.D.R. and Foreign Affairs (continued)’, in The New York Times, October 19, 1969, Carl N. Degler, ‘F.D.R. and Foreign Affairs (continued)’, in The New York Times, November 9, 1969 & Henry Raymont, ‘Secrecy of Documents Irks Historians’, in The New York Times, December 28, 1969.
[75] John E. Booth, ‘Hyde Park Shrine: Millionth Visitor Tallied At the Roosevelt Home’, in The New York Times, June 5, 1949.
[76] Alonzo Hamby, Man of the People: A Life of Harry S. Truman (Oxford University Press: Oxford & New York, 1995), 629.
[77] David McCullough, Truman (Simon & Schuster: New York & London, 1993), 916.
[78] Oral History Interview with Tom L. Evans, Kansas City, Missouri, December 10, 1963, interviewed by J.R. Fuchs, Truman Library, Truman Papers, 608 & 622.
[79] Ibid., 961.
[80] Ibid., 962.
[81] Ferrell, Robert, Harry S. Truman: A Life (University of Missouri Press: Columbia, Missouri, 1994), 389.
[82] Oral History Interview with Milton Perry…10.
[83] Harry Vaughan to Tom Evans, January 18, 1949, Truman Papers, Truman Library, Papers of Tom Evans, Alphabetical File, Washington Correspondence – 1949.
[84] Ferrell, op. cit., 389.
[85] McCullough, op. cit., 968.
[86] Ferrell, op. cit., 389. See also Herscher, Betty, ‘Missouri’s Presidential Library: The Harry S. Truman Library’, in The Missouri Library Association (Vol.21, No.3, September 1960), 82-83.
[87] Ferrell, op. cit., 388.
[88] McCullough, op. cit., 931.
[89] Oral History Interview with Tom L. Evans, Kansas City, Missouri, December 10, 1963, interviewed by J.R. Fuchs, Truman Library, Truman Papers, 656-657.
[90] ‘Press Information: Dedication Ceremony of the Harry S. Truman Library Independence, Missouri’, July 6, 1957, Truman Papers, Truman Library, Vertical File.
[91] McCullough, op. cit., 962.
[92] Visitor statistics provided by Ray Geselbracht in correspondence with the author, April 14, 2006.
[93] Presidential Job Performance, Roper Center for Public Opinion, http://www.ropercenter.uconn.edu/cgi-bin/hsrun.exe/Roperweb/PresJob/PresJob.htx;start=HS_fullresults?pr=Truman, viewed February 26, 2006.
[94] Federal Property and Administrative Services Act 1949, http://www.baas.ac.uk/wp-content/uploads/2010/05/fpasa49.pdf, viewed March 6, 2006.
[95] Public Law 81-754, ‘An act to amend the Federal Property and Administrative Services Act of 1949, and for other purposes’: Truman Papers, Truman Library, Post-Presidential Files, Library-Museum File, Pre-dedication, Box 889, Personnel – applications.
[96] H.J.RES. 330, July 6, 1955: Truman Papers, Truman Library, Harry S Truman Inc. Files, Box 19, Presidential Libraries General.
[97] H. S. Truman ‘Last Will and Testament’, January 7, 1959, Truman Papers, Truman Library, Vertical File.
[98] Neustadt, Richard E., Presidential Power and the Modern Presidents: The Politics of Leadership From Roosevelt to Reagan (Free Press: New York, 1990), 147-148.
[99] New York Post, January 9, 1953.
[100] See for example, Sheehan, ‘Historians Worried by Cutbacks…’
[101] Ibid.
[102] HST to Mr W P Marchman, Secretary, Rutherford B. Hayes and Lucy B. Hayes Foundation, November 7, 1952, Truman Papers, Truman Library, President’s Secretary’s Files, Truman Library Foundation File: Correspondence – General.
[103] Report No.615, July 17, 1957: Truman Papers, Truman Library, Post-Presidential Files, Library-Museum File, Post-Dedication, Box 893, Papers.
[104] Public Law 85-147, August 16, 1957 cited in ‘The Presidential Papers Program of the Library of Congress’, October 1, 1960, Truman Papers, Truman Library, Harry S Truman Inc. Files, Box 20, Papers of Presidents (folder 2).
[105] For further discussion see, Kammen, op. cit., 570.
[106] Ibid., 613-614.
[107] This is still an area of debate, see Ibid., 613.
[108] Sheehan, ‘Historians Worried by Cutbacks…’
[109] Executive Order 12667, January 16, 1989; Executive Order 12958, April 17, 1995; Executive Order 13233, November 1, 2001; Executive Order 13292, March 25, 2003.
[110] Watson, Robert, ‘A Challenge to the Presidential Records Act?’, in White House Studies (Vol.2, No.2, 2002).
[111] Robert D. Putnam & Robert J. Spitzer, ‘American Political Science Association Response to Executive Order 13233’, in Presidential Studies Quarterly (Vol.32, No.1, 2002). See also, Kumar, Martha Joynt, ‘Executive Order 13233 Further Implementation of the Presidential Records Act’, in Presidential Studies Quarterly (Vol.32, No.1, 2002).
[112] John Bodnar, Remaking America: Public Memory, Commemoration and Patriotism in the Twentieth Century (Princeton University Press: Princeton, New Jersey, 1994), 245.
[113] Ibid.
[114] See Kammen, op. cit., 264.
[115] Alexander estimated that, at his time of writing, 3.3 new museums were being created daily: Alexander, op. cit., 5. This pace may have declined, but the United States is currently home to an estimated 17,500 museums (American Association of Museums (AAM), ‘ABCs of Museums’, http://www.aam-us.org/aboutmuseums/abc.cfm#how_many, viewed March 5, 2006), which receive more than 865 million visitors a year (AAM, ‘Working in the Public Interest’, http://www.aam-us.org/aboutmuseums/publicinterest.cfm, viewed March 1, 2006).
[116] Congress passed legislation in January 2004 amending the 1974 Presidential Recordings and Materials Preservation Act and providing for the establishment of a federally-operated Nixon Presidential Library in Yorba Linda at the current site of the private Nixon Library and Birthplace. NARA have recently agreed that if and when the Nixon Foundation raises the money to complete a satisfactory archives building at the Library, the former president’s papers will be transferred from Washington. The provisional timeframe suggests that NARA will accept the Nixon Library as another official Presidential Library in the summer of 2006, after which time it will take at least eighteen months to transfer all materials from the College Park archive.
Benjamin Franklin and Public History: Restoring Benjamin Franklin House
Dr. Márcia Balisciano, Founding Director, Benjamin Franklin House
What is the best way to engage the public in the history of a person, location, time? This is a question that was put to the test at Benjamin Franklin House. This paper reviews the process by which a 1730s building, derelict for over 25 years, and never open to the public, became a new kind of museum.
By some quirk of fate, the only surviving home of Benjamin Franklin, one of America’s most iconic figures, is not in Boston where he was born; not in Philadelphia, his adopted city, where he created civic institutions that have shaped American life and where he made lasting contributions to science; and not in Paris where he served as the first official representative of a fledgling American government, garnering support which helped decide the course of the American Revolution. It is, in fact, in the heart of London, just steps from Trafalgar Square. Benjamin Franklin called a narrow, brick terrace building at 36 Craven Street home for nearly sixteen years between 1757 and 1775.
By 1980 the Georgian building was empty and derelict. In recognition of the importance of the building and its plight, a trust was formed. It was not, however, until the late 1990s that substantial fundraising allowed serious consideration of the building’s future. The starting point for this exploration was Benjamin Franklin.
Franklin’s passionate curiosity, his commitment to furthering public knowledge, and his embrace of technology – such that he longed to know what would be made, centuries on, of the inventions he had pioneered – marked his long life and spurred our quest to make his only surviving residence a tribute to this legacy.
Benjamin Franklin and 36 Craven Street
Leaving behind his wife Deborah, who feared an ocean crossing, and his fourteen-year-old daughter Sally, Franklin departed Philadelphia on 4 April 1757. It was not, however, his first visit to the motherland. In 1724, as a young man of eighteen, Franklin had travelled to London to expand his printing skills, remaining for nearly two years. In his amazing lifetime (1706-1790), Franklin ventured across the Atlantic eight times.
Accompanied by his son William and two black servants, Peter and King, Franklin began his second sojourn in London on 26 July 1757. As Agent of the Province of Pennsylvania at the Court of His Most Serene Majesty, he was charged with convincing the Penn family, Pennsylvania’s proprietary owners, to pay tax in order to alleviate the expense of the French and Indian War, or with persuading George III to bring the colony under Royal dominion, with concomitant financial support. He also served as Postmaster-General of North America.
Perhaps Franklin chose the house for its nearness to the seats of British power; it was additionally close to the Penns’ palatial home at Spring Gardens. And it had much to recommend it: an upright mistress, the widow Margaret Stevenson; her charming daughter Polly (Mary); and a maid, Janey, who hailed from Pennsylvania. Franklin found a surrogate family that did not look all that different from the one he had left behind. Biographer Carl Van Doren said he was less a lodger than the head of a household living in serene comfort and affection.
In London, Franklin continued his innovative work in science. On the banks of the Thames at the bottom of his street he demonstrated his kite and key experiment proving lightning to be an electrical phenomenon (and St. Paul’s Cathedral was the first building in Britain to have a Franklin lightning rod) – a hallmark of the Age of Enlightenment he helped shape. At Craven Street, Franklin worked with chemist Joseph Priestley on oxygen experiments, tested the smoothing properties of oil on water, and investigated the effect of canal depths on the speed of ships. During conservation we discovered the remnants of a Franklin stove he installed in his laboratory at Craven Street.
Franklin returned to Philadelphia in 1762, but his presence in London was soon deemed essential in order to further colonial views before a distracted King and an increasingly antagonistic Parliament. He successfully fought against the punitive Stamp Act, but despite his continued negotiations and pleas, a final break proved inevitable. Franklin’s last days in London were marked by political strife, precipitated by the so-called Hutchinson Affair. Franklin leaked letters to Boston which showed the intention of the American-born Massachusetts Governor, Thomas Hutchinson, to call in British troops should denizens prove unruly. The fiasco led to a duel in Hyde Park between two disgruntled players in the drama – and to Franklin’s public humiliation in 1774 before the Privy Council.
When Franklin learned of the death of his wife Deborah, he knew he finally had to leave, which he did clandestinely in March 1775, with his grandson William Temple Franklin. He spent his final day at Craven Street with his friend the chemist Joseph Priestley, scanning the American newspapers for any snippet to relieve the gloom. Little more than a year later, Franklin, having signed the Declaration of Independence in Philadelphia, was sailing back to Europe, this time to France, to broker the alliance that eventually saved the young United States from ignominious defeat.
Franklin’s contribution in London was significant in many ways. Politically, he survived many years of near impossible controversy, demonstrating tireless skills of patience, perseverance, and compromise. By his pen, his chief weapon, he explained his vision, present and future, for the Colonies and their ongoing relationship with the mother country.
Bringing the Project to Fruition
Craven Street’s townhouses are built on in-filled soil, which does not provide a fully stable foundation. Benjamin Franklin House also suffered from a change to the original mansard roof in 1780 and the addition of the back ‘closet’ storey; over the 270 years since the house was built, the additional weight, shifting subsoil conditions, and damage to the fabric of the building caused delamination of the front brick facade, critical sagging of the spine wall, and overall structural decay. When the Craven family fell on lean times at the turn of the 20th century, they decided to sell the freehold to what became British Rail. British Rail recognised the historic and architectural value of Benjamin Franklin House but did not invest the funding needed to prevent the building from becoming derelict.
Our first step in 1998 was to ensure exterior stabilisation of the building to avoid the collapse of the structure. With planning approval from the relevant statutory bodies and primary funding from the Heritage Lottery Fund (HLF), English Heritage, the Getty Foundation, and the William Hewlett Trust, among others, support beams were reinforced, brickwork was tuck-pointed, repairs were made to the roof, and ties were installed to steady the floors, at a cost of nearly £1 million.
Throughout the project we adhered to the following conservation principles:
1) Minimise the extent of repair work
2) Retain original material wherever possible
3) Use traditional methods and materials wherever possible
4) Provide long-term rather than ad hoc repairs which need early renewal
While this work, completed by 2000, secured the structure of the building, the interior was still derelict. Between 2000 and the start of 2004, all design work was completed for the Historical Experience, Student Science Centre, and Scholarship Centre, along with two floors of interior conservation. Of the approximately £3.3 million total project cost, £1.5 million remained to be raised at the close of 2003; with a goal of opening on the tercentenary of Franklin’s birth in January 2006, timescales were tight. We were extremely fortunate to receive a £1 million grant from the HLF; this served as a catalyst for raising the required balance.[117]
With funding in hand, we tendered for final conservation and all multimedia and electrical services. During the year, ceilings, panelling, fireplaces, and flooring were brought back to their original lustre by primary contractor Wallis under the watchful eye of a project management team that included a member of the Board who is a conservation specialist. Sysco installed the technology (including sound, lighting, switches, video, PC networking) needed for all uses of the House. Heating, cooling, cabling and multimedia requirements were sensitively integrated into the 18th century fabric of the building.
Where We Go from Here
On a remarkable day, Benjamin Franklin’s 300th birthday, Benjamin Franklin House opened to the public for the first time. In a fitting tribute to the Anglo-American Franklin, the Foreign Secretary, Jack Straw, and the American Ambassador, Robert Tuttle, cut a ribbon across the threshold of 36 Craven Street on 17 January 2006 to welcome the public to Franklin’s last remaining home.
The House’s offerings make the best use of the building’s limited space, uncovering the rich yet not widely known story of Franklin’s London years. The Historical Experience employs live interpretation and leading-edge sound, lighting and visual projection to tell that story in Franklin’s own words. The historic spaces serve as the stage for this ‘museum as theatre’, which removes the traditional distance between the visitor and the past and illuminates a unique moment in Anglo-American history: food, health, botany, and daily living in the basement kitchen; public and personal relationships, musical inventions and political tension on the ground floor; scientific work, political triumphs and woes, and a hurried return to America in the face of the looming War of Independence on the first floor. Emmy-award-winning actor Peter Coyote is the voice of Benjamin Franklin and Academy Award-nominated actress Imelda Staunton is Margaret Stevenson, Franklin’s landlady.
The Student Science Centre features hands-on experimentation with scientific discoveries from Franklin’s London years, juxtaposing past and present knowledge and inspiring young people – particularly those from disadvantaged backgrounds – to think and test in the mode of Franklin, translating curiosity and discovery into practical ways of improving life and society. The Centre allows students to re-create diverse and important experiments from Franklin’s sojourn in London and supports elements of the National Curriculum. The emphasis in the Medical History Room is on the medical research work of William Hewson, who ran an operating theatre in Craven Street.[118] Children will carry out experiments with the House’s Education Officer encompassing Franklin’s work on canal depths, electricity and lightning rod design, and the instrument Franklin developed, the glass armonica. Dramatic, interest-catching audio-visual segments support the presentations, extending the lesson and enabling children to explore ‘what if’ questions such as ‘what happens if lightning strikes a building with no lightning conductor?’
Scholarship
The Scholarship Centre is the intellectual hub at the top of the building. It features a full set of the Papers of Benjamin Franklin, prepared by Yale University and purchased with support from the US Embassy in London, together with access to the prototype online Papers catalogued by Yale and the Packard Humanities Institute. We aim to have a scholar in residence by the new academic year in October 2006, focusing on one of the myriad subjects of interest to Franklin, while annual Symposia will use Franklin as a point of departure for contemporary discussions of issues related to his key contributions in science, the arts, diplomacy and letters.
By reaching out to underserved communities we are also using the House and the character of Franklin to promote history and education. Our work with inner-city young people shows that they are familiar with American products and brands, but many do not realise the US and the UK were once joined, and they are unfamiliar with the circumstances that led to the War of Independence between Britain and America. The House, and Franklin, who played an important role in these events, will be a catalyst to help local youth understand more about this pivotal historical period.
The primary focus now is ensuring Benjamin Franklin House is widely experienced by the public. To that end we have tried to capture, engagingly, the London life of one of history’s great figures in a manner that would have appealed to the man himself. For Franklin is a character whose pragmatism, inventiveness, and sense of civic responsibility have much to teach us still.
[117] We received contributions from numerous sources including £150,000 from a single US donor as well as a low, fixed interest loan from the Architectural Heritage Fund.
[118] For instance, a circulation game requires students to follow the route of blood through arteries and veins with butterflies that flutter in the stomach and red lights that flash in the brain of the glass body’s silhouette when all is connected correctly.
Exhibiting Franklin
Before curating a small exhibition for the tercentenary of Franklin’s birth at the British Library, Matthew Shaw wondered how it had been done before
‘The history of our revolution,’ John Adams wrote to Benjamin Rush in 1790, ‘will be one continued lye from one end to the other. The essence of the whole will be that Dr Franklins electrical Rod smote the earth, and out sprung General Washington… thenceforward these two conduced all the policy, negotiation, legislation, and War’.[119] Adams might have been happy to know of the success of David McCullough’s John Adams (2001), a work that has sold in great numbers, but his fear of ‘one continued lye’ remains a potent one for the public history of the American nation.[120] While a focus on ‘Great Men’ has often obscured the broader historical picture, the elevation of the leaders of 1776 to the status of Founding Fathers has at least kept historical writing on the shelves of Borders and Barnes and Noble. As Colin Kidd noted in a recent review of another of McCullough’s works, ‘the founders in their periwigs, breeches and frockcoats hold a secure place in the popular iconography of American freedom, alongside comic-book heroes in capes and tights.’[121] Such historical superheroes are developed largely outside the pages of historical texts, but find their way into the heart of popular culture, forming the backbone of what might be termed the American national myth. The collection of stories that America tells about itself, about how it came to be, and what America means, in Carla Mulford’s words, ‘takes shape outside the parameters of controlled discourse and is highly and haphazardly impressionistic’.[122] Movies, half-remembered school lessons, postage stamps, novels, plays, songs… all these help to shape how a nation is imagined or remembered.[123] The following pages look at Franklin’s place in this story, a place not always as prominent as Adams feared, and concentrate on the role of the public exhibition, concluding with some remarks on my own experience of curating an exhibition in his tercentennial year.
National Myth
The Revolution and the founding fathers were commemorated in a variety of ways from the time of the birth of the nation. For example, Simon Newman has examined the importance of festivals and communal feasts, celebrating American (and, for that matter, French) victories, the declaration of independence and other key events. While celebrating the new nation, such democratic events also reflected and enunciated social divisions, of class, occupation and gender.[124] Thanksgiving sermons provided another forum for commemoration, celebration and education. Print culture offered a further outlet for such expression, and men such as Mason Locke Weems found that bringing news of the ‘nation’ to a culture-starved rural population could be a profitable endeavour, publishing selections from Franklin’s writings, including the Way to Wealth, as well as Choice Anecdotes in the early nineteenth century. The Founding Fathers also impressed themselves on material culture, on shop-fronts, advertising hoardings, and so forth, as well as through the dominance of Washington and Franklin on a variety of medals, stamps, tokens and bank notes.[125] Commemoration also fulfilled a social function, serving as a motif for fraternal meals, meetings and ceremonies of all kinds.[126] Streets, towns, buildings or parks also served as geographical memorials (including the fourteenth state, which was supposed to be named after Franklin), often providing a figure to represent all Americans in the face of ethnic immigrant diversity.[127]
Michael Kammen has detailed the printed path that the image of the founding fathers followed. He suggests that Franklin (and Hamilton) rose in prominence, when compared to George Washington, in the Gilded Age. The New York Times singled Franklin out for particular praise in its centennial issue, and John Bach McMaster ‘declared him to be the greatest American of the Revolutionary Era’. As many have commented, Franklin’s promotion of thrift and hard work made him a role model to a bourgeois, capitalist society. McMaster’s publication of Franklin’s letters in 1887 raised his stock further, as did a series of popular works and essays in journals such as the Century Magazine (1898). Franklin’s example of rising from obscurity to greatness also figured strongly in the juvenile literature of the latter quarter of the nineteenth century.[128]
As the nineteenth century ended, works such as Sydney George Fisher’s The True Benjamin Franklin (1899) sought to strip away the myths and to offer a more human and accurate portrait, a reaction against the sanitised and popularised version of Franklin as the embodiment of the American Dream. The new science and profession of history also set to work on the founding fathers, beginning a series of studies on aspects of their lives or biographies of the whole, such as Carl Van Doren’s Pulitzer-prize-winning biography of 1938. The image of Franklin has shifted from the envy of Adams, to the semi-obscurity of the early nineteenth century, to celebration and censure as the supposed father of American business, to reassessment according to the lights of professional historians, to an avuncular figure of popular culture, all kites, keys and kisses for the ladies, to today’s more rounded picture.
The Loan Exhibition of Frankliniana, Grand Lodge, Philadelphia (1906)
In 1906 – or rather 5906 according to the Masonic Calendar – the Right Worshipful Grand Lodge at Philadelphia arranged for an ‘exhibition of […] relics’ of Franklin, the fourth Grand Master in Pennsylvania and ‘an enthusiastic freemason’.[129] It was the bicentenary of Franklin’s birth. The Lodge printed two thousand copies of a 351-page Memorial Volume, complete with several facsimiles and plates. The Lodge had close connections with the American Philosophical Society, which organised the ‘Franklin Bi-Centenary’, welcomed delegates from lodges across the world, held a dinner and conferred an honorary degree (in absentia) on ‘Brother Edward VII, King of Great Britain and Ireland and Emperor of India…’[130] Talks were given on ‘Franklin – the lesson his life teaches’, Franklin and the University of Pennsylvania, and Franklin as a Freemason. The President of the Society, R.W. George W. Kendrick, also presided at a large memorial service at the tomb of Franklin.[131] A medal was struck, with federal government funds, and a Calendar of the Franklin papers at the American Philosophical Society and University of Pennsylvania was produced.[132] Lectures and sermons emphasised Franklin’s moral character.
Clearly, the exhibition and its associated activities were major events in Philadelphia. Organised by the Grand Lodge, the American Philosophical Society, and the Historical Society of Pennsylvania, the exhibition was visited by 43,287 people, and the catalogue included a vast number of paintings, objects, books and letters relating to Franklin.[133] Interestingly, out of the 487 exhibits, some 250 were engraved portraits. As such, the exhibition straddled the era of the cabinet of curiosities and that of the museum, with an emphasis on the visual image of the man and an almost totemic significance placed on objects associated with him (which were often held in private hands). In its organisation and focus, the exhibition demonstrated the national influence held by the Masons, and also the range of groups, such as the American Philosophical Society and the University, which helped to constitute civil society. It tied Franklin to Philadelphia, and linked both the man and the city to the wider nation. Finally, the commemoration was not a single, scholarly or leisure event, but was part sermon, part public ritual, with a ceremony at Franklin’s tomb, much like the popular feasts and celebrations of the revolutionary era. The Philadelphia commemoration represented the last of the ‘living’ rituals connected to Franklin, in which the exhibit was part of a wider commemoration practice.
Metropolitan Museum of Art (1936) and the 250th Anniversary
The visual depiction of Franklin lay at the heart of the next major exhibition, ‘Benjamin Franklin and his Circle’, which took place at the Metropolitan Museum of Art from May to September 1936.[134] This show included 356 items: mostly portraits, but also some furniture, busts and decorative objects. The Metropolitan exhibition aimed to show the many friends, acquaintances, correspondents and enemies of the ‘extraordinarily versatile genius’. It also served a political purpose, emphasising the ‘most prominent links between the three nations involved in the American Revolution’, namely Britain, France and the US: the White House lent a portrait, as did the Musée Carnavalet, Paris. The exhibition also attempted to show the cultured side of Franklin, and by association, America: the catalogue is prefaced with a solitary essay, ‘His Interest in the Arts’.[135] Reviews of the exhibition, such as that by Elizabeth Luther Cary in the New York Times, suggested that such a collection of portraits could reveal ‘Franklin’s elusive personality.’ It was the age of Freud, of ‘personality’ rather than just character, but also one that sought to demonstrate the civility and achievement of the American nation.
Restraint and respectability marked another anniversary twenty years later. The commemoration of Franklin’s birth in 1956 was largely a sober affair, marked by academic papers, musical performances, and many allusions to the Yale University Franklin Papers project.[136] The American Philosophical Society again helped to co-ordinate a range of meetings and exhibitions, including an exhibition of Franklin portraits at Philosophical Hall, which ‘emphasizes quality rather than quantity, because so far as possible the major portraits are from life.’ They were seen as ‘not only works of art but historical documents of primary significance’.[137] William Lingelbach, the Librarian of the American Philosophical Society, also provided a historical overview of Franklin commemorations, noting how he appeared to fall from favour in the early nineteenth century and how his birthdays had been less celebrated in the past. Lingelbach placed special importance on the discovery of Franklin’s papers by Henry Stevens and the publishing of Franklin’s works and biographies, and noted the importance of public commemorations such as street names and statues. The American Philosophical Society had also benefited from a number of purchases and donations, such as the Craven Street Gazette; scholarly production and the creation of special collections matched the massive growth in US libraries in the post-war world.
The World of Franklin and Jefferson (1976)
While the previous two exhibitions were relatively scholarly and objective, the bicentennial of independence served as the opportunity for a more emotive exhibition. The Cold War, Watergate, Vietnam and the oil crisis served as a backdrop for an extensive exhibition designed by Charles and Ray Eames. In 1971, the United States Information Service’s Paris Office proposed a Jefferson exhibition as a counterpoint to the Soviets’ Lenin exhibition. The proposal then became a bicentennial exhibition, and when the Eameses were offered the contract, Charles Eames suggested that Franklin should ‘share the billing’. The exhibition, the most complex project undertaken by the firm and running to 40,000 words, travelled to Paris, Warsaw, Mexico City and the British Museum in London, where 117,000 people visited. A $550,000 grant from IBM to the Metropolitan Museum of Art in New York allowed the exhibition to travel in the US.[138] The exhibition, which included copies of the Atrium, Pavilion and Rotunda from Jefferson’s University of Virginia as well as a stuffed bison from the Field Museum in Chicago, was accompanied by a book and three films, one narrated by Orson Welles. As one critic has noted, ‘projects such as these elevated the Eameses to the status of US ambassadors overseas and cultural interpreters of the meaning of America at home’.
The exhibition was based around four themes: ‘Architects of Independence’, ‘Contrast and Continuity’, ‘Three Documents’ and ‘Jefferson and the West’. It drew on the Eameses’ interest in ‘information overload’, with cases full of, as one visitor reported, ‘too many things – a mish-mash.’ The show was soon dubbed the ‘Bison-tennial’. The critics, led by Hilton Kramer of the New York Times, were not impressed. In a review entitled ‘What Is This Stuff Doing at the Met?’, Kramer complained that the show offered ‘no inducements to thought’ and was a ‘public relations’ exercise, ‘a contemptible way to make use of works of art, and […] doubly offensive to see it done in one of our greatest art museums’.[139] Everything about the photographs in the show was ‘phony’, showing a world ‘at once cozy and glamorous… all emotions are either noble or picturesque’. The whole show was designed to ‘sell’ something, in the manner of a corporate IBM showroom, in an atmosphere of ‘immaculate neutrality and benign instruction’.
The real and the recreated were conflated, and propaganda posed as objectivity, all in an attempt to ‘commemorate something more than national splendour’, in the words of a USIS memo, and, as the Eameses hoped, to rediscover a collective Americanness based on common interest and reason – to reconstitute the nation in the bicentennial. But by simply ignoring contemporary crises and rejecting, in one commentator’s words, the counterculture’s ‘emphasis on the self’, the exhibition ensured that the Eames desire for the government to be seen as a rational ‘corporate enterprise’, with a forum to report to the people, ‘took on a different resonance.’[140] While Kramer’s critique represented something of a metropolitan elite’s view, and the visitor books also record many patriotic appreciations of the exhibition, the focus on Westward Expansion, which Eames saw as a positive, democratic force, was also unpalatable in the aftermath of Vietnam.[141] Official government- or corporate-influenced representations of the past were a difficult sell in the 1970s.
Experience: Benjamin Franklin 300 (2005)
In contrast, the current Benjamin Franklin exhibition has been greeted with widespread and deserved praise. For example, the New York Times suggested that ‘if Franklin were to mount a museum exhibition about himself, it might very well resemble—in its variety, intelligence and pleasures—“Benjamin Franklin: In Search of a Better World”’.[142] A major exhibition that will tour several cities in the States before arriving in Paris in December 2007, ‘In Search of a Better World’ covers nearly all aspects of his life: ‘Character Matters’, ‘Printer’, ‘Civic Visions’, ‘Useful Knowledge’, ‘World Statesman’ and ‘Seeing Franklin’; the final section offers the chance to reflect on Franklin’s reputation and his relevance today.
‘In Search of a Better World’ also employed a well-qualified and talented team, who have worked to promote a range of events under the umbrella of ‘Benjamin Franklin 300’. Reflecting a greater understanding of the range of Franklin’s exploits, plus a cautious didacticism, the current exhibitions are more measured and self-aware than those of the past, with educational material asking students to design their own exhibition, bringing to mind the choices that they must make.[143] The exhibition, which combines a wealth of original artefacts as well as displays, artwork and interactives, is aimed at a number of audiences, including the ‘general public’, whatever that may be, and children. The result, some reviewers have noted, is displays that sometimes strain too hard for attention, veering into the ‘gimmicky’.[144] However, in the PSP age, it seems to me that exhibitions have little choice but to compete on these terms if they are to reach a wide audience.[145] More interestingly, perhaps, the exhibition demands that visitors engage not just intellectually, but through sensation and emotion; contemporary art, installation, sculpture and interaction allow the modern visitor to consume Franklin in a variety of ways. And like any commemoration, it celebrates far more than it criticizes.
Seen in historical context, the current exhibition seems to combine elements of its predecessors. There is not just an exhibition, but a series of events, such as lectures and children’s activities, which promote Philadelphia as well as Franklin. There is also a scholarly aspect, with an excellent catalogue and a database of Frankliniana. But unlike the Masonic celebration, which reflected the dense social fabric of civil society, there is more the air of the lobby group, seeking special attention. Unlike the Eameses’ celebration, which was closely associated with the government, here the connection is more discreet. Franklin – and history – is today more at home in the domain of entertainment and leisure than in the civic space.
Britain
Mounting an exhibition in Britain is a problem of a different order. In contrast with America, where such a thing as Franklin fatigue could be diagnosed, Franklin’s reputation on this side of the Atlantic is vaguer and murkier. The 150th anniversary of his death was marked by Northampton, which celebrated the famous descendant of Northamptonshire stock; and St Bartholomew’s Church in the City of London made some moves to mark Franklin’s association with Palmer’s print shop in Bartholomew Close for his bicentenary in 1906.[146] Such efforts had little success, and in 1939 the preface to a biography noted that Franklin ‘is practically unknown to the reading public in England’.[147] When the Benjamin Franklin House opened as a centre for the British Society for International Understanding in 1947, the National Archives reveal that the Foreign and Home Offices did not feel it appropriate for the Prime Minister to attend; a minister was proposed instead. The Americans managed only Mrs Douglas, the wife of the American Ambassador.[148] In 1951, an appeal was made for a portrait of Franklin to hang in the Craven Street parlour during the Festival of Britain. The appeal appears to have been unsuccessful.[149]
More recently, London’s Franklin House has reopened (17 January 2006; see the accompanying article in this journal), offering education and the chance to re-imagine and even feel emotionally something of Franklin’s life in London. An exhibition to mark his birth was held by the Royal College of Surgeons, and the US Embassy has begun a scholarship in his name. The British Library hosted a five-case exhibition, illustrating Franklin’s relationship with print, Britain, politics and science, and his reputation. We were able to display several autograph manuscripts held by the Library, as well as items printed by Franklin during his first visit, and a unique copy of the New-England Courant, annotated by Franklin and showing the first of his Silence Dogood letters. In terms of interpretation, it became apparent during the planning stages that the main aim was simply to make Franklin better known to a British audience, incorporating the latest historical research in the accompanying texts as far as possible and showing how Franklin was a man among many. Rather than engaging with a national myth, mounting the exhibition was an exercise in narrative – using the objects available to tell a story or to expand horizons – and in judging what an audience visiting the British Library would appreciate being told. Indeed, once the deadline approached, thorny interpretative issues could be settled by the limited word length of labels. Putting a single figure at the centre of an exhibition, unless the curator really sticks the boot in, inevitably creates an exercise in veneration; in this case, I shall just have to hope that Franklin (and Adams) doesn’t mind too much.
[119] J. Adams to Benjamin Rush, 4 April 1790, quoted in Carla Mulford, ‘Figuring Benjamin Franklin in American Cultural Memory’, The New England Quarterly, vol. 72, (1999), 415-443, quotation, 415. On Franklin and Adams, see Richard D. Miles, ‘The American Image of Benjamin Franklin’, American Quarterly, vol. 9 (1957), 117-143.
[120] McCullough’s biography of John Adams, ‘a federalist president who failed to secure re-election’, has sold 2 million copies since 2001 (Colin Kidd, ‘Damnable Defeat’, London Review of Books, 17 November 2005). See also Simon Newman, Parades and the Politics of the Street: Festive Culture in the Early American Republic (Philadelphia, 1997), xi.
[121] Kidd, ‘Damnable Defeat’. See also C. Bradley Thompson, ‘Great and Small’, Times Literary Supplement, 9 December 2005: ‘If academic historians are writing the histories of ordinary people doing ordinary things (which, it turns out, ordinary people find tediously ordinary), David McCullough, Stanley Weintraub and the non-professional historians are writing books that examine ordinary (and great) people doing great things – and that makes all the difference.’
[122] Mulford, ‘Figuring’, 416.
[123] Cultural historians have offered several analyses of the American ‘national myth’, notably Michael Kammen, in A Season of Youth (1978) and Mystic Chords of Memory (1991). Bernard Bailyn and Jack Greene have also, and in different ways, examined the creation of the American nation: Jack Greene, The Intellectual Construction of America (Chapel Hill & London, 1993).
[124] Newman, Parades.
[125] Mulford, ‘Figuring’, 427-34.
[126] Annual Dinner of the Typothetæ of New York in Honor of the Birthday of Benjamin Franklin at Hotel Brunswick, Saturday, January 17, 1891 ([New York], [1891]), British Library 11903.d.31(9).
[127] Mulford, ‘Figuring’; Lingelbach, ‘Benjamin Franklin’, 361.
[128] Michael Kammen, A Season of Youth: the American Revolution and the Historical Imagination (New York, 1978), 63-5. Kammen suggests that ‘By the early 1900s the American Revolution in national tradition had been trivialized – and to a large degree, de-revolutionized’. The transition from adult to juvenile participation at festivals and commemorative events is commonly seen by anthropologists as a sign of just such trivialisation.
[129] Proceedings of the Right Worshipful Grand Lodge… at its Celebration of the Bi-Centenary of the Birth of Right Worshipful Past Grand Master Brother Benjamin Franklin (Philadelphia, 1906).
[130] Proceedings, 13.
[131] Proceedings, 179.
[132] Lingelbach, ‘Franklin’, 365.
[133] Compiled by Julius F. Sachse, for the Committee on Library.
[134] Joseph Downs, ‘Benjamin Franklin and his Circle’, in The Metropolitan Museum of Art Bulletin, Vol. 31, No. 5 (May 1936), 97-104.
[135] R.T. H. Halsey, ‘Benjamin Franklin: His Interest in the Arts’, in Benjamin Franklin and his Circle. A catalogue of an exhibition, New York, 1936.
[136] William E. Lingelbach, ‘Benjamin Franklin and the American Philosophical Society in 1956’, Proceedings of the American Philosophical Society, vol. 100 (1956), 354-368.
[137] Lingelbach, ‘Benjamin Franklin’, 354-5.
[138] Hélène Lipstadt, ‘“Natural Overlap”: Charles and Ray Eames and the Federal Government’, in The Work of Charles and Ray Eames: a legacy of invention (New York, 1997), 166-68.
[139] Hilton Kramer, New York Times, 14 March 1976, D29.
[140] Lipstadt, ‘Natural Overlap’, 166, 170.
[141] The Eameses’ exhibitions, and in particular their films, A Communication Primer, Powers of Ten and The World of Franklin and Jefferson, have been seen as ‘pre-digital precedents’ of the Internet age: Philip C. Repp, ‘Three Information Design Lessons: selected films by Charles and Ray Eames’, LOOP: AIGA Journal of Interaction Design Education (2001:2).
[142] Edward Rothstein, ‘Knowing a Man (Ben Franklin), but Not Melons’, New York Times, 19 December 2005.
[143] One of the educational tools provided by BF300 is a kit for school children to design a museum exhibition, asking them to consider the problems of interpretation: ‘Lesson 8: Designing Benjamin Franklin: In Search of a Better World’, <http://www.baas.ac.uk/wp-content/uploads/2010/05/BF300Plans_High8.pdf> (Accessed 18 April 2006).
[144] Rothstein, ‘Knowing a Man’.
[145] Newspaper and online reviews have also treated the exhibition as one of a range of competing attractions for leisure dollars: ‘Let me say right away, the exhibition is amazing and should be seen by everyone who has any chance to see it. That old Ben was quite a guy and more importantly he not only played a big role in the 18th century, but he plays a big role in today’s world and the little thing we call the United States of America’, <http://philadelphia.about.com/od/calendarofevents/fr/franklin_world.htm> (Accessed 18 April 2006).
[146] The Times, April 1940.
[147] Evarts S. Scudder, Benjamin Franklin (London, 1939).
[148] The Times, 21 Jun 1947. Several ‘relics’ were presented to the house and a Benjamin Franklin Fund appeal begun in 1948.
[149] Lord Duncannon, Letters to the Editor, The Times, 30 March 1951; John Underwood, Letters to the Editor, The Times, 4 April 1951; Willard Connelly, Letters to the Editor, The Times, 3 May 1951.
Whitman: “A poet given to compulsive self-revision”
Dorian Hayes reflects on Walt Whitman, hypertextuality and the 1855 edition of Leaves of Grass[150]
[Walt Whitman was]…“a poet given to compulsive self-revision”. Whitman’s work “is better understood in terms of process rather than product, fluidity rather than stability”—a style accommodated… far better by hypertext than by traditionally static printed texts.[151]
A fascinating project is currently under way at the University of Iowa’s Department of English which prompts questions of how poetry should and will be read and interpreted, now and into the future. Indeed, the Walt Whitman Hypertext Archive throws into question precisely what a “finished” text—even one as supposedly canonical as Leaves of Grass—might look like. This new technological initiative also complicates the value and place of a printed artefact such as the prized first edition of Whitman’s Leaves of Grass, of which only a few hundred were published in 1855, and of which the British Library holds one.
This article sketches out the enormous scope and gradual accretion of Whitman’s work, from notebook scribblings, through all six different and entirely distinct editions of Leaves of Grass, and related writings. I will argue that, taken together, Whitman’s work amounts to a uniquely “hypertextual” oeuvre. There will follow a close textual analysis of some key passages from the first (1855) edition of Leaves, and a consideration of the effects of reading this already multivalent, hybrid text in a “virtual environment”. Without drawing any specific conclusions, I shall then seek to bring together some of the above observations, and suggest areas of literary reception and interpretation which become more pertinent in this context.
“Democratic Vistas”: The Hypertextual Walt Whitman
As numerous critics have pointed out, Whitman’s work is uniquely suited to a modern, “hypertextual” re-presentation. As suggested above, the poet was “given to compulsive self-revision” throughout his literary career. From his earliest notebooks begun in the 1850s, through the life-changing ravages of the Civil War and the bitter Reconstruction years, all the way forward to the sixth, so-called “death-bed” edition published just before his actual death in 1892, Leaves of Grass expanded from a slim 95-page volume to a vast and sprawling 450-page opus. Aside from Leaves of Grass and the later poems of Passage to India (1871), Whitman was also a prodigious prose-writer (principal works include Democratic Vistas [1870] and Specimen Days [1882]), correspondent, and sketch-writer. In addition to the sheer extent of his output, Whitman’s frequent prolixity, habitual revisions and re-orderings of major works, editorial rigour, and grammatical complexity all render the poet’s oeuvre something of an archivist’s nightmare—his collected published works run to 22 volumes alone, to say nothing of the ever-expanding quantity of manuscript material unearthed year on year.
Describing their original intentions in creating the Whitman Hypertext Archive, academic Kenneth Price asserts that:
Whitman’s writings defy the constraints of the book. Documents associated with a Whitman poem might well include an initial prose jotting containing a key image or idea; trial lines in a notebook; a published version appearing in a periodical; corrected page proofs; and various printed versions of the poem appearing in books, including (but not limited to) the six distinct editions of Leaves of Grass.
Price goes on to argue that “the fixed forms of print are cumbersome and inadequate for capturing Whitman’s numerous and complex revisions”. Meanwhile, Price’s collaborator Ed Folsom, of the University of Iowa, suggests that “the form and structure of hypertext are particularly appropriate and useful for studying Whitman”:
We finally have a technology that can capture Whitman’s incessant alterations of his poetry…. Archives are filled with copies of his printed texts on which he has added handwritten alterations. Working through these documents becomes an exercise in hypertext. You see a poem changing, word by word, line by line, edition by edition.[152]
With these stirring words, Folsom and Price echo the promotional drive of other pioneering digitisation projects, including some produced (or co-produced) by the British Library, such as the extensive “Turning the Pages” initiative and the International Dunhuang Project. So, how do hypertexts of this kind work in practice? How does the experience compare with the seemingly anachronistic one of reading the “original” paper texts? Can the two media be used together—indeed, does this act of juxtaposition serve to enrich the process?
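A purely illustrative aside may make the first of these questions concrete. What follows does not describe the Archive’s actual implementation; it is a minimal sketch, in Python, of the kind of machine-assisted collation that hypertext makes routine, using only the standard library’s difflib module and two variant readings quoted later in this article.

import difflib

# Two readings of the same passage from Leaves of Grass: the wording
# printed in the British Library's 1855 copy, and the amended wording
# quoted later in this article.
reading_1855 = [
    "The past is the push of you and me and all precisely the same,",
    "And the night is for you and me and all,",
]
reading_amended = [
    "The past is the push of you and me and all precisely the same,",
    "And the day and night are for you and me and all,",
]

# unified_diff prefixes lines dropped from the first text with '-' and
# lines added in the second with '+', leaving shared lines as context.
for line in difflib.unified_diff(reading_1855, reading_amended,
                                 fromfile="1855 (BL copy)",
                                 tofile="amended reading",
                                 lineterm=""):
    print(line)

Scaled up across six editions and countless manuscripts, it is this sort of alignment, computed once and presented interactively, that underlies the side-by-side views described below.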
“When I read the book”: Re-reading Leaves of Grass (1855)
The American poets are to enclose old and new for America is the race of races. Of them a bard is to be commensurate with a people…. The greatest poet has less a marked style and is more the channel of thoughts and things without increase or diminution, and is the free channel of himself.[153]
Such was Whitman’s injunction to himself and his fellow “American poets” of the antebellum years in the Preface to the 1855 edition of Leaves of Grass. True to his word, Whitman’s corpus contains some of the freest and most inventive, ingenious poetry of its time, and a flowing, long-lined style that has influenced practically every development in American literature over the last 150 years. From Whitman a direct line of thought and articulation can be traced through the writings of Hart Crane and e.e. cummings, the free verse of Beat poets like Ginsberg, Ferlinghetti, and O’Hara, and the more recent prose poetry of Raymond Carver and August Kleinzahler.
In many ways, the opening of Leaves of Grass, like the Preface quoted above, reads—and, just as importantly, looks—like a declaration of literary independence:
I celebrate myself,
And what I assume you shall assume,
For every atom belonging to me as good as belongs to you.
I loafe and invite my soul,
I lean and loafe at my ease… observing a spear of summer grass.[154]
From the very opening lines, the speaking persona of Leaves makes a radical assertion of his unique identity. As Ivan Marki puts it, “that identity, rather than any argument, is the true significance of the volume; that is what it means”:
The topics and themes taken up by the poems are components of the speaker’s personality, and the order in which they are arranged does not so much advance propositions leading toward a reasoned conclusion as it discloses the dynamism through which that personality is constituted.[155]
Opening the slightly tatty, stained and foxed British Library copy of Leaves, smelling its history and sensing the many hands through which it has passed, one nevertheless finds it difficult to re-capture the sheer impact and thrill that this lazy, “loafing” statement of intent must have had in the volatile city of New York into which it was launched in 1855. Harder still to quantify is the effect that such a work—with its embattled belief in the power and potential of the American future—might have had in the heady atmosphere of literary London.
It was precisely this act of historical imagining that was required when Professor Folsom invited the British Library to participate in a census of 1855 editions of the text. According to Whitman scholar William White, 795 copies of the first Leaves were produced in the print-shop of James and Thomas Rome in Brooklyn, New York. Of these, a very small number were distributed in the UK by William Horsell of London, and far fewer have survived into the 21st century[156]. As the census questionnaire made clear, there were numerous differences among these 795 copies, from small typographical or grammatical changes to substantial extra materials tipped- or pasted-in to the inner covers. For obvious reasons, the most notable versions—and therefore among the most valuable works in the entire American canon—are those few which include a copy of the glowing review by Ralph Waldo Emerson, marked “Copy for the convenience of private reading only.” (Unfortunately, the BL copy is not one of these.) Nevertheless, it is easy to see even from a cursory reading how the effect of lines like “The past is the push of you and me and all precisely the same, / And the night is for you and me and all” (as printed in the BL version) is amplified when the second line was amended to “And the day and night are for you and me and all”. It should be pointed out that the poet was also an unusually attentive and rigorous proof-reader who was involved at all stages in the preparation of the text. Indeed, for all his undoubted linguistic invention and free-wheeling exuberance, it is clear even from this tiny instance of close reading that Whitman was in firm control of his material and of his poetic effects.
The Whitman we see here, tweaking and tinkering with his verse, re-thinking and re-writing, as in innumerable other variations and alterations, is in effect a junior apprentice to the ambitious architect who eventually overhauled and augmented Leaves beyond recognition a further five times throughout his life. Indeed, as Ivan Marki notes, the 1855 edition disappeared from view almost as soon as it was published, the poet having expanded the conception of his “experiment” within months of its publication. For this reason, the 1855 version came to be described by Malcolm Cowley as “the buried masterpiece of American writing”[157]. This is perhaps why the text has acquired the status of a fetish object over the years, and why handling the volume in its (variant) original form, reading again the heavy, bold type, and marvelling at the flagrant insouciance of the engraving of Whitman on the frontispiece, remains an incomparably tactile experience.
“I sing the body electric”: Whitman in Cyberspace
If poring over the 1855 edition of Leaves of Grass at the BL, with its curious stippled cover pattern and ornate, floral lettering, is a distinctly tactile, “analogue” experience, its organic design reflecting the work’s linguistic fecundity, then reading Whitman in cyberspace is stimulating in entirely different ways. Certainly, the Whitman Hypertext Archive is a fertile virtual environment. The product of over ten years’ development, and of academic careers in American literature that date back before that, the detailed work of Folsom and Price deliberately exploits the “extensibility” of the Internet. One of the major innovations of the site is that all six successive editions of Whitman’s Leaves of Grass can be consulted in parallel. This means that the evolution of a seminal poem like “Song of Myself” from originating sketch to various published incarnations can be tracked and evaluated by readers at the click of a mouse, and viewed side by side.
Likewise, the ever-emerging wealth of Whitman manuscript material coexists with published texts on the site, both in the form of facsimiles and transcripts. Among the most valuable manuscript materials are facsimiles of Whitman’s extensive notebooks, produced in the 1850s and 1860s. Long thought to have been lost for good, these priceless records have now been recovered and digitised in full by the Library of Congress[158]. Following their recovery by the Library in 1995, the next decade saw the preservation and digitisation of the entire collection, culminating in a grand exhibition in 2005 to coincide with the 150th anniversary of the publication of the 1855 edition of Leaves. As is increasingly the case at major institutions like the Library of Congress, this “artefactual” show was swiftly complemented by an online exhibition, which will remain part of the growing online Whitman heritage now available for students and researchers[159].
Elsewhere in the Whitman Hypertext Archive, scanned and transcribed versions of a huge number of known manuscripts and sketches can now be accessed thanks to the active participation of partner institutions like University of Nebraska-Lincoln Libraries, Boston University, Duke University, University of Texas, Beinecke Rare Book and Manuscript Library at Yale University, Princeton University Library, University of Virginia, and Library of Congress. Nor are primary literary documents the only resources available. In fact, the Archive compiles a huge wealth of secondary material including at least 100 scanned, annotated images of the poet, and—in a nod to the increasingly popular “Wiki” form of web community-building—an expanding, fully interactive body of critical and bibliographical commentary on Whitman, his work and legacy. A quick search of the Whitman Hypertext Archive’s bibliography (now the official Whitman bibliography) yields over 200 citations of articles published by scholars in 2005 alone. It is fair to say that, with the 150th anniversary just passed, critical interest in Whitman’s work has probably never been greater, and with it the need for a compendious repository to draw together the vast output of the poet and his scholars.
“The panorama of the sea”: Exploring Whitman Online
So is there, somewhere in this sea of text, the possibility of a greater appreciation of Whitman’s life and work? Undoubtedly, there is great pedagogical value in being able to quantify at a glance the difference between the raw fluency of the 1855 edition of Leaves, with all its ellipses and blind alleys and perverse primal power, and the calm, benign inclusiveness and controlled passion of the 1891 re-write. Add in the further ingredient of relevant supporting manuscripts and critical apparatus, all available for free and with minimum download time, and the Whitman Hypertext environment really does seem to provide the tools with which to ‘capture Whitman’s incessant alterations of his poetry’.[160] Certainly, the alterations, their effects line-by-line, and cumulatively on the total architecture of his work, are clear at every stage.
This means that the reader can, for instance, plot the evolution of Whitman’s characteristically emphatic, sensual language describing both hetero- and homosexual love, from the rather cryptic allusions of the 1855 edition through to the bold declarations of the Children of Adam and Calamus sequences[161] in the 1860 revision. In contrast to the growing radicalism of his personal, sexual politics, Whitman’s 1867 overhaul of the collection, produced at a time when America was emerging from the Civil War and seeking to heal some of the racial divisions it had heightened, might be seen to retreat somewhat from the 1855 version of ‘Song of Myself’. In the 1855 text, one of the many voices ventriloquised by the narrating persona is that of a righteously indignant African-American slave, while in the later incarnation the empathic anger is slightly dispersed and the effect displaced. This may be seen to reflect the contemporaneous social and political inclination towards healing and reconciliation as opposed to blame and recrimination.
That said, ‘The City Dead-House’, a new poem included in the 1867 version, is arguably a precursor of the ‘protest songs’ of political outrage and solidarity popular during the Depression of the 1930s and the renewed militancy of the 1960s. One of the powerful themes to emerge from reading Whitman’s work in the Hypertext Archive—probably in part the result of the editorial apparatus which surrounds it—is the poet’s intense and increasingly sophisticated engagement with the social and political forces that engulfed him. From the impassioned utopianism of new poems of this period like ‘Aboard at a Ship’s Helm’ and the Songs Before Parting cluster emerges a vision of the ‘ship of democracy’, with the poet-seer at the helm, which radically politicises the humanist individualism of that youthful declaration of independence in 1855. Thus, while early (pre-War) versions of Leaves seem indebted to rather diffuse and abstract ideals of ‘America’, ‘solidarity’ and ‘democracy’, it is clear from his constant re-ordering and re-writing of the material throughout the 1860s and 1870s, as well as from contemporary notebooks and jottings, that his understanding of such concepts was brought into sharp relief by the trauma of the Civil War.
“Roaming in thought”: Final Reflections
But does a recognition of this kind bring us any closer to Whitman himself, and to the intent behind his habitual revisions? Certainly, debate around the value of authorial intentions refuses to subside. The theoretical gauntlet was thrown down by the “New Critics” W.K. Wimsatt and Monroe Beardsley when they declared in The Verbal Icon (1954) that the intentions of an author for his work were neither available nor useful to the critic. In place of this “intentional fallacy”, they argued for a rigorous form of close-reading which would direct attention to the text alone. It is undeniably true that the enormous flexibility and intuitive user-interface of the Whitman Hypertext Archive render the breadth, scope and context of the poet’s life-work more accessible than ever before. The precision of his craft, the density of his experiment, and the urgency of the social and political forces affecting it all emerge clearly in this light. As for the question of intent, this arguably remains as challenging and cryptic as the expression on the poet’s face as he brazenly stares out from the frontispiece of the 1855 Leaves.
As Ed Folsom points out in the quotation at the head of this paper, Whitman’s work is undoubtedly “better understood in terms of process rather than product, fluidity rather than stability”. This paper has sought to highlight some aspects of that “process”. Needless to say, the paper copy of the 1855 edition of Leaves is only one arbitrary point of entry into the vast panorama of Whitman-related material. To suggest that even this text, taken in isolation, represents a fixed, stable statement of intent, and to set it against the supposed breadth and fluidity of the hypertext environment, would be to set up a false opposition. In some ways the two media do function differently, their contrasting properties reminiscent of Marshall McLuhan’s celebrated distinction between “hot” and “cool” media: the “hot”, immersive medium of the printed text contrasting with the “cool”, participatory, interactive experience of the Hypertext Archive. However, as I hope to have shown, reading Whitman on the page is, in and of itself, an elusive and mysterious process, and there is in fact considerable slippage in and around even the 1855 edition of Leaves. Ultimately, and for all the emphatic self-declaration and unflinching depiction of the working body and mind of the speaker, it is this sense of mercurial mystery and wonder that Whitman in hard copy shares with his hypertextual successor.
Further Reading
Whitman, Walt. Leaves of Grass. Brooklyn, NY: Rome, 1855.
Whitman, Walt. Complete Poetry and Prose. Library of America Series. New York: Literary Classics of the United States, 1982.
Whitman, Walt. Whitman’s Manuscripts: Leaves of Grass (1860), A Parallel Text. Ed. Fredson Bowers. Chicago: University of Chicago Press, 1955.
Whitman, Walt. The Walt Whitman Archive. Vol. 1: Whitman Manuscripts at the Library of Congress; Vol. 2: Whitman Manuscripts at Duke University and the Humanities Research Center at the University of Texas. Ed. Joel Myerson. New York and London: Garland, 1993.
White, William. “The First (1855) Leaves of Grass: How Many Copies?”. Papers of the Bibliographical Society of America 57 (1963), 352-354.
Walt Whitman Hypertext Archive. University of Iowa. (http://www.whitmanarchive.org/archivephp/criticism/criticismframeset.php?id=44).
“Poet at Work: Walt Whitman Notebooks, 1850-1860”. Exhibition of Recovered Notebooks from the Thomas Biggs Harned Walt Whitman Collection. Library of Congress (http://memory.loc.gov/ammem/collections/whitman/index.html).
[150] An extended version of this article will be available in the electronic British Library Journal, http://www.bl.uk/collections/eblj/2006/articles2006.html.
[151] Ed Folsom, quoted in ‘The Walt Whitman Project’, Obermann Centre for Advanced Studies, University of Iowa (http://www.uiowa.edu/%7Eobermann/projects/whitman.html).
[152] Kenneth Price, ‘Dollars and Sense in Collaborative Digital Scholarship: The Example of Walt Whitman Hypertext Archive’ (2001), Walt Whitman Hypertext Archive (http://www.whitmanarchive.org/introduction/).
[153] Walt Whitman, Preface, Leaves of Grass (Brooklyn, NY: Rome, 1855), iv, vii.
[154] Whitman, Leaves of Grass, 1-5.
[155] Ivan Marki, ‘Leaves of Grass, 1855 edition’, Walt Whitman Hypertext Archive, http://www.whitmanarchive.org/archivephp/criticism/criticismframeset.php?id=44.
[156] White, ‘The First (1855) Leaves of Grass: How Many Copies?’, Papers of the Bibliographical Society of America 57 (1963), 352-354. From the date-stamps on the frontispiece, it seems that the BL copy was either purchased by one of the British Museum’s American agents in the mid-19th Century, or was one of the American editions distributed by Horsell; see Folsom and Price, ‘British Editions of Leaves of Grass’, http://www.whitmanarchive.org/works/british/intro.html.
[157] Cowley quoted in Marki, ‘Leaves of Grass, 1855 edition’.
[158] The Library of Congress is one of the partner institutions in the Whitman Hypertext initiative.
[159] See http://memory.loc.gov/ammem/collections/whitman/index.html. The Whitman Hypertext Archive links directly to the full online facsimiles of the Whitman notebooks at Library of Congress.
[160] Ed Folsom, quoted in ‘The Walt Whitman Project’, http://www.uiowa.edu/%7Eobermann/projects/whitman.html.
[161] The small, thematically-related cycles of poems scattered throughout Leaves are known, in Whitman circles, as ‘clusters’.
Readers’ Writes: Ch-ch-changes — a Bibliophile’s Path through Higher Education Resources
Lisa Rull, University of Nottingham, reflects
Whether you are an undergraduate with an enquiring mind, a postgraduate researcher keen to identify your research field, or an academic pursuing that next article or book topic, you are unlikely to own all the sources you need to consult. Consequently, libraries and resources are a key requirement, particularly for those of us in the humanities, the historically much-mocked academic faculty believed to be cheaply sustained by the provision of pencils, a library and access to archival documents.
Recently, both the nature and the accessibility of our resources have started changing. Technology and practical issues of space now drive many transformations, and there is an undeniable economic dimension to such changes for the caretakers of those resources. Here, I refer to the arrival of the digital age, the diminishment of paper-based archives and the impact of adopting previous technological advances in resource storage.[162] Whilst technology may not be cheap, physical space is always expensive, especially as student numbers and the demand for access grow; besides, the conservation and management of archival material also incur specific physical and human costs.
Yet both users’ experiences and the profile of those users are diversifying. Full-time higher degree study increased massively in the ten years between 1994/5 and 2004/5, and both the academic backgrounds and the economic circumstances of those students are correspondingly different.[163] We must therefore acknowledge that physical and technological expansion in the availability of resources simultaneously reflects, influences, and yet can also fail to address, changes in those who use resources. In practice, the evolution of resource provision, strategies for reading and researching, and changes in the student body are often playing catch-up with each other. This has a practical impact on our scholarship and on our changing relationship with books and resources.
Speaking personally, I am an avowed bibliophile, but despite jointly owning around 4000 books (and more than an alphabet’s worth of folders containing photocopies), over the years my relationship with archival institutions has altered. With great nervousness, I must confess to a problematic and nuanced relationship with libraries. Undoubtedly, I love their contents; I also love to make friends with their staff (note to all students and researchers: never underestimate the usefulness of befriending library employees and those who manage on-line access to resources). But a circuitous route into and through higher education has produced a rather strained relationship with these repositories of knowledge. Mature students often study part-time, at a distance, and nearly always have additional commitments. It is thus hardly surprising that the way I accessed resources never quite matched the expectations of institutional resource provision.
Initially I was an Open University student in full-time office employment, and my first HE experience provided no library beyond the excellent OU materials delivered by Royal Mail. Occasional forays to the Angel Row Central Library kept me going, but I hated the noise, the lights and the treatment other people gave to precious shared resources. Only partly satiated by scouring second-hand bookstores, I desperately wanted to read more. Addicted to the thrill of studying, I became a young, if technically mature (I was nearly 26), full-time undergraduate at a post-1992 Midlands university. Suddenly, I had to travel to access resources that were erratically located across several library sites, the physical legacy of the institution’s historical and geographic expansion. I became an increasingly cranky library user, ever aware of the physical damage that carrying library books caused my back and posture.
I became more selective in my reading: photocopying extracts from magazines and journals, printing off pages from eye-numbing microfilmed newspapers, and befriending those who could provide access to obscure archives (gallery curators). Rationing my budget enabled me to buy texts as well, giving ample scope to my love of second-hand bookstores. Necessity also alerted me to my ‘resource butterfly’ mentality: I wanted to have everything to hand, where I could land on it as required during the writing process. Whilst I appreciated visiting other institutions (especially the Arts Floor of Birmingham’s city-based Central Reference Library), such visits could become cumbersome journeys, requiring the transportation of large amounts of existing materials and the spreading of them across huge tables whilst there. Working from home kept the cackling noise of younger undergraduates amongst the library shelves at bay. I was accumulating my own personal ‘library’ and becoming adept at filleting both the bibliographies and the narratives of existing texts for my information and sources: I wanted to do it in my own space.
After graduation, teaching and art gallery work provided a meagre income, part of which financed part-time Master’s study at a redbrick university that required a weekly 300-mile round trip for one day’s teaching. At the time online materials were still virtually non-existent (no pun intended). Moreover, echoing the historical model of HE to which this institution aspired, most of the texts I needed were only available in a small – if beautifully stuffed to the rafters – specialist departmental reference library, whose opening hours presupposed an archetypal student profile that I could not match. I still treasure the postcard a fellow student from my course once sent me of a medieval chained library, with the comment that its predecessor could be found at our beloved institution. This was 1995-97, long before flexible loans and online resources. All I could do was retreat to Birmingham’s now seemingly paradisal, spacious provision, and work copious underpaid hours as a temporary lecturer in order to access alternative libraries close to home.[164] Circumstances and provision thus forced me to be imaginative in choosing, locating and accessing sources.
Officially, my PhD studies at the University of Nottingham began in January 2000, but by then I had been ‘working’ on the American art collector Peggy Guggenheim for six years and had correspondingly accumulated an annotated bibliography of sources that already ran to some 15 closely typed pages. When full-time AHRB funding came later that year, I wondered how my methods would change. There would be a shared carrel space in the University’s Humanities Graduate Centre, a location dominated by students from my PhD department of American and Canadian Studies. Extensive shelving was also available, and each carrel had a networked computer (at last, online resources, even if not everything could be accessed off-campus…). All this was situated within walking distance of the central Hallward Library, and with the possibility of ordering up to 40 inter-library loans per year (a revelatory concept, and I soon got used to the delays). I looked at those around me and wondered if I would start using libraries differently, more traditionally, with notebook and pencil in hand, visiting the hallowed glories of the British Library or the Smithsonian’s Archives of American Art.
In fact, little changed (although I did become adept at another non-traditional archival skill: the searching of on-line materials… and shouting at convoluted web-based interfaces). I still grumbled at stuff always being ‘in the other place’ wherever I tried to work, and my ageing back meant I had to resist the desire/need to haul large folders to and from home and university (still by public transport). The Smithsonian and Guggenheim Museum’s own archives were grand trips, but my particular research project rendered the British Library chiefly a supplier of inter-library loans. (Perhaps recognising my enthusiasm, people gave me access to their own extensive notes, photocopies and even books. Or maybe they were desperate for storage space once their own projects were complete…)
So ultimately, my PhD research proceeded in much the same way as previous study: slightly out of step with technology and addicted to my bibliophilic accumulating tendencies. But I never underestimated the power of libraries and resources at my disposal locally, nationally, internationally and virtually, however unconventional my points of access were. Search tools, catalogues and the availability of both physical and virtual archives have dramatically improved, and as long as both are maintained, eventually the mismatch between researchers and the location of their materials will be bridged. And if you do end up falling over your own bibliographic accumulations, just keep an eye out for the next generation of scholars: one will almost certainly give your archive a home.
[162] Nicholson Baker, Double Fold (New York: Random House, 2001): an out-and-out polemic but, having seen first-hand the impact of previous efforts to reduce library storage via the adoption of technology, I’m utterly sympathetic.
[163] HESA statistics available at http://www.hesa.ac.uk.
[164] Another bad pun: Birmingham’s current Central Reference Library is located just off Paradise Circus.
Read All About It: Free Online Newspaper Sources for Nineteenth-Century America
Donald Tait, Glasgow University Library, provides an above-the-fold guide to online resources
The nineteenth century saw America’s greatest period of economic growth, exploration and expansion, as well as some of its most significant and dramatic events: the movement West, the Civil War, slavery, immigration and urbanisation. The same period also saw the rise of the popular press in America, and historians and other scholars now recognise the value of newspapers in providing a first-hand, detailed insight into all aspects of national life. Traditionally, access to this wealth of information was difficult: the newspapers themselves were fragile, or else were held on microfilm with poor indexes and often poor reproduction. The situation changed with the advent of the Web and the development of digitisation and search software, allowing researchers to pose new questions of these newspaper collections, which Commager saw as “the raw material of history”.
Commercial organisations have been quick to respond with a range of very impressive products. Readex now have significant collections, e.g. their ‘Early American Newspapers Series 1 (1690-1876)’, which is in the process of being supplemented by Series 2 and 3. Proquest offer their ‘Historical Newspapers Programme’, an ongoing project that digitises key newspapers dating from the eighteenth century to the present day, including The New York Times, The Wall Street Journal and The Washington Post. Thomson Gale have the ‘19th Century US Newspapers Digital Archive’, which has digitised content from the microfilm holdings of a wide range of newspapers chosen with a view to providing the greatest value to researchers and a detailed insight into all aspects of national life. However, such treasures come at a price, and not all libraries are able to meet it.
There are now a number of freely accessible resources, and what follows is a discussion of just a selection of them. This is not an exhaustive list, but rather an attempt to give a flavour of what is out there and to indicate how useful or otherwise I found each to be. (When using these sites it is worth noting that in many cases, due to the way a site is built, using your browser’s [Back] button will not have the desired effect. Response times can also be slow.)
Brooklyn Daily Eagle (1841-1902)
http://www.brooklynpubliclibrary.org/eagle/
Coverage and Usefulness: spanning sixty-one years, the Brooklyn Daily Eagle covers a seminal period in the history of America. As well as reporting what was happening in Brooklyn/New York, it also discusses national events, making this an excellent primary source for a range of topics in American history, e.g. the Civil War, immigration and race relations. Approximately 147,000 pages of newspaper have been digitised, and both articles and advertisements are available. You can access the text either by date of issue or by keyword searching.
Colorado’s Historic Newspaper Collection
http://www.cdpheritage.org/ and click on link [Search Newspaper collections]
Coverage and Usefulness: covers the early years of over 90 newspapers, including the ‘Rocky Mountain News’, the ‘Boulder Camera’ and the ‘Colorado Chieftain’. There are over 150,000 pages covering 1859-1924, representing 30 Colorado cities and 20 counties. Researchers can leaf through an issue page by page; search the database by topic; look at an individual article by itself or as part of the full page; search through articles, graphics, letters to the editor and ads; and search a single newspaper, a group of papers or all papers. The site is being kept up to date, and the helpdesk are very responsive to queries.
Georgia Historic Newspapers
http://dlg.galileo.usg.edu/MediaTypes/Newspapers.html
Coverage and Usefulness: offers searchable issues of three important historic Georgia newspapers (the Cherokee Phoenix, the Dublin Post and the Colored Tribune), taken from the microfilm holdings of the Georgia Newspaper Project. Currently, your search terms are not highlighted in the results pages, nor can you retrieve them using the search tool in Adobe Acrobat, which makes the resource far less easy to use than it could be: a fair amount of determination is required to scan through a page of text looking for particular words. This reflects the fact that it was one of the early attempts at digitisation, completed at a time when there were few models to draw upon and little sense of how users might use such a resource. Georgia Historic Newspapers are exploring methods and sources of funding for a new digitisation project which would deliver additional functionality and content, although as yet there is no firm timeline for when this will happen.
Harpweekly
http://advertising.harpweek.com/
Coverage and Usefulness: allows you to look at a small collection of advertisements from Harper’s Weekly, one of the leading illustrated American periodicals of the nineteenth century. You can also register for free online access to the index of all the advertisements from 1857-1872.
Historic Missouri Newspapers Project
http://digital.library.umsystem.edu/ and click on link [Missouri Historic Newspapers]
Coverage and Usefulness: has digitised 14 newspapers from Missouri, with varying degrees of chronological coverage. Titles include the St Louis Christian Advocate 1857-1879; the Phelps County New Era 1875-1880; and the Liberty Weekly Tribune 1846-1883. You can browse or search, but be aware that pages can be very slow to load and can be of poor quality. However, there is interesting material here, e.g. a search in the periodical The Far West (1836) turns up advertisements for the sale of slaves.
Historical New York Times
http://www.nyt.ulib.org/
Coverage and Usefulness: coverage of the Civil War period from 1860-1866. Regrettably, there is no search facility and the text can be awkward to read. This particular project was halted by the New York Times themselves, so there is no possibility of the site being improved. Nonetheless, for anyone concerned with the Civil War, it remains worth a look.
Historybuff.com
http://historybuff.com/ and click on the link marked [Online Newspaper Archives]
Coverage and Usefulness: offers a small collection of articles from a range of nineteenth-century publications, organised chronologically by decade. There is no searching and the actual number of available articles is quite small, so its usefulness is limited. Nevertheless, there is some interesting material, e.g. the New York Times coverage of the battle of Gettysburg. Another section of the site (under Primary Source Material) has coverage from the Columbia Centinel and the Massachusetts Federalist of the Louisiana Purchase.
Pennsylvania Civil War Newspapers
http://www.libraries.psu.edu/digital/newspapers/civilwar/
Coverage and Usefulness: offers access to 17 periodical titles from Pennsylvania which, although collectively called the “Civil War Newspapers”, in some cases go back to 1855 and in others run up to 1874. The site offers keyword searching; Boolean logic; and search limiting, e.g. to articles, pictures or adverts, although your initial search terms are not highlighted in the retrieved document. You can also browse by title to see what issues are available.
Utah digital newspapers
http://www.lib.utah.edu/digital/unews/index.html
Coverage and Usefulness: this site has digitised around 40 Utah newspapers covering both the nineteenth and twentieth centuries; there is, e.g., a complete run of the Salt Lake Tribune for the nineteenth century, so this is potentially very useful. Search results are listed in alphabetical order by newspaper title, and within that by date: you cannot change this. Again, search terms are not highlighted in the retrieved documents.
Forthcoming
Wyoming Newspaper Project
http://www.wyonewspapers.org/
Coverage and Usefulness: this site aims to “make newspapers printed in Wyoming between 1849 and 1922 accessible in an easily searchable format”. As yet, there is no indication as to when this will be available.
Further Sources of Information
There is a useful directory, “US News Archives on the Web”, at http://www.ibiblio.org/slanews/internet/archives.html, which details those American newspapers with an online presence and, for each title, gives the chronological coverage and the cost (where this applies) of access. The University of Washington have an online guide to Digital Newspaper Projects and Resources at http://lib.washington.edu/mcnews/digital_projects.html, which was used in the preparation of this article. There is another useful compilation from the British Columbia Digital Library at http://bcdlib.tc.ca/links-subjects-newspapers.html.
Conclusion
As you would expect, the resources mentioned above do not in general offer the same functionality, scope, ease of use, and support as the commercial offerings from Readex, Proquest, and Thomson. They are perhaps best seen as “cheap and cheerful”, though not necessarily “cheap and nasty”. In general it is worth the effort of seeking out and exploring them, as they represent a vital collection of primary source material for all civic, political, social, and cultural events in American life in the nineteenth century. As Arthur Miller once observed, “A good newspaper is a nation talking to itself”; you can now freely eavesdrop on some of these conversations.
American Sheet Music
Jean Petrovic uncovers a visual, as well as a musical, resource
Sheet music was first published in America in 1788, and its publication was firmly established by the mid-1790s. In its early days it generally carried no type of illustration. Instead, the bars of music – and words, if it was a vocal piece – were simply cut on a metal plate and struck off, usually in quite limited numbers. The addition of any type of embellishment would seriously add to the cost and was, by and large, avoided.
With the development of lithography in the 1820s and the low cost and ease of reproduction that this process involved, a whole new world opened up for those who published music. Now, they could simply contact a lithographer, who would in turn contact an artist, and in next to no time an illustration was ready for the publisher’s use. Since an attractive cover would allow a publisher to add between ten and twenty-five cents to the cost of each new publication, it made good sense to use them.
The next half-century was a boom-time for such publishers, as the piano soared in popularity throughout the United States. By the 1860s some 110 American manufacturers were building twenty-five thousand pianos annually.[165] In addition, there were many hundreds of thousands of pianos already in homes across the land. Indeed, in his article “Publishing and Printing of Music” in the New Grove Dictionary of American Music, D.W. Krummel refers to this period as the “age of parlor music”.[166]
As the function of American music changed during this time from the sacred to the secular and from the timeless to the timely, so composers and songwriters began finding inspiration in every aspect of American life:
Wars: always a staple source of inspiration, and the Civil War, the Spanish-American War, and American involvement in World War I were no exception
Social movements: the abolition, temperance, women’s suffrage (pro- and anti-), and settlement house movements all had songs to commemorate their cause
Politics: every presidential candidate had his own campaign songs and marches
Inventions: trains, telephones, bicycles, and automobiles, all of these and so many more are celebrated in music and song
Discoveries: the discovery of oil prompted numerous ‘Petroleum Polkas’, for example, and the gold rushes in California and the Klondike did likewise
Public Institutions: the Sanitary Fairs of numerous municipalities caused local composers to put pen to paper
Disasters: not surprisingly, the sinking of the Titanic prompted an immediate response
Legislation: the passage of the Eighteenth Amendment provoked an outpouring of emotion and several ‘Prohibition Blues’
In responding to the nation’s seemingly insatiable thirst for music, the publishers of the latest topical songs, marches, waltzes, and schottisches realised that to maximise their profits they should invest in decent artists for the music’s title-pages. James McNeill Whistler and Winslow Homer were just two of the artists who tried their hand at this relatively lucrative type of illustration.
In time, the engagement of so many competent artists in this field ensured that the covers of American sheet music developed into an independent and compelling pictorial form. Indeed, just like American literature, which initially lacked an authentic voice, sheet music illustration slowly but surely emerged from the shadow of its British counterpart to become a truly indigenous art form.
In addition, perhaps more than anyone else at the time, these artists provided the American public with its images of everyday news, both local and national. They recorded, stylishly, the nation’s progress, and their work was used to decorate the walls of living rooms across the country.
In the early twentieth century the publication of sheet music continued apace, centred on the area of Manhattan known as “Tin Pan Alley”. In fact, sheet music was so popular at this time that it was also issued as a supplement to newspapers.
Unfortunately for us today, the frequent handling of sheet music, and the way in which it gradually fell out of fashion, means that only a small fraction of the original output now survives in libraries or in private collections.
The British Library’s Music Collections contains a reasonable number of these works, and the Library’s Eccles Centre for American Studies is currently creating an online exhibition of sheet music covers. It will be mounted on the Library’s website (www.bl.uk) by the autumn of 2006.
[165] Lester S. Levy, Picture the Songs: Lithographs from the Sheet Music of Nineteenth-Century America (Baltimore: Johns Hopkins University Press, 1976), 1.
[166] D.W. Krummel, ‘Publishing and Printing of Music’, in New Grove Dictionary of American Music, eds. H. Wiley Hitchcock and Stanley Sadie (London: Macmillan, 1986), 653.
Useful Web Resources
The Papers of Benjamin Franklin
Sponsored by The American Philosophical Society and Yale University. Digital Edition by The Packard Humanities Institute
http://www.franklinpapers.org/franklin/
UrbanDictionary.Com
Claims to define 400,000 slang words. Entries are posted by users, and visitors vote on their accuracy. Images and videos can be added to definitions, making for a controversial, uncensored site.
Urban Dictionary (Andrews McMeel, 2005): 2,000 of Urban Dictionary’s funniest, smartest definitions; a catalog of popular culture that users helped to write. ISBN 0740751433.
Jonathan Hope, ‘Wimping It (review)’, Times Literary Supplement, 30 June 2006.
http://www.urbandictionary.com
Oxford Dictionary of National Biography
Matthew Shaw looks online
The Oxford Dictionary of National Biography (2005) must be considered the publishing event of the last decade, indeed the scholarly event of recent years. It will aid, and perhaps change, work in the humanities as much as the digital publication of resources such as Eighteenth Century Collections Online or the Readex Archive of Americana, offering a vast and growing panoply of lives, searchable in a variety of ways, such as by occupation, gender or geography. One expects to see the march of prosopography through scholarly journals, the bloom of footnotes adding biographical detail to actors in historical and literary articles and, possibly, a postmodern interest in biography to match the experiments in narrative undertaken by historians such as Simon Schama (Dead Certainties). The range of lives, the inclusions and exclusions, the innovations (such as mythical figures and the biographies of groups), the complexity of the project and the quality of the publishing by Oxford University Press have all been commented upon by other reviews, as have debates over individual interpretations and the accuracy of modern scholarship. What has been little noted, however, is the usefulness of the ODNB for American Studies, particularly for students of the Colonial and Revolutionary periods, who have over 700 lives to play with.
The editors have included not only those who were born in Britain or her colonies, but those who have ‘shaped British history worldwide’, from the Greek geographer Pytheas to the assassin Udham Singh (d. 1940). Britain’s relationship with America has served Americanists well. Indeed, as Lawrence Goldman notes in his valuable online essay ‘America in the Oxford DNB’, ‘no set of interrelations is more notable and consistent in the Oxford DNB than those linking Britain and America’. John Smith, the founder of Virginia; William Bradford, the governor of Plymouth; the dissident prophet Anne Hutchinson; George Washington; Alexander Hamilton; Allan Pinkerton; Goldwin Smith, the first professor of history at Cornell; Sam Wanamaker; Paul Robeson; and Stanley Kubrick are just some of the Americans, or Britons who settled there, who are covered. The importance of the US for British lives, such as Charles Dickens, Charlie Chaplin, John Lennon or Sir Anthony Eden, is given full weight. Users of the online version can do full-text searches for the US, or search for places such as New York or Chicago in the advanced search fields. Undergraduates, postgraduates and their lecturers alike will undoubtedly find the ODNB a valuable resource to confirm a fact, discover the latest scholarly thinking, begin a bibliography or make startling connections. Best of all, these resources can be accessed, for free, by an estimated 48 million UK residents thanks to an MLA agreement.
http://www.oxforddnb.com
http://www.mla.org.uk
American Legislative Intelligence
Arnie Thomas and Paul Jenks of GalleryWatch outline the legislative process and the difficulties of tracking the information flows that it creates
American Congressional research and monitoring has seen a major transformation over the past two decades: from personal relationships on Capitol Hill and volumes of printed books to online searching, notification, and access to documents that were previously among the most sought-after in Washington, DC. This availability of information amounts to a revolution in access to the U.S. legislature and its process.
Online monitoring revolutionized not only the process of following legislation but the process of legislating itself, enabling people and organizations outside the grounds of the Capitol, or without personal relationships within it, to follow the activities of Congress. A vast increase in advocacy organisations pressing their issues before Congress has followed. A quick look at the number and type of organisations lobbying Congress in the 1970s compared with the same list today would clearly illustrate this point.
Bills and Amendments
The objective of a piece of legislation introduced in Congress is to get passed or, at the very least, to move an issue further in the public policy debate. Bills and Committee reports are not poetry or fine literary essays. They are ostensibly legal documents and they need help to move along in the legislative maze.
A Member or Senator who introduces a bill knows full well that in the interpersonal and partisan political world of Capitol Hill the legal text of the bill will not speak for itself. Some poetry, perhaps some science, some hype, and definitely some political cover will be needed for other Members to sign on and support the bill when and if it comes to a vote.
The “bill” is the primary vehicle for legislation, and as a result a large part of Washington, DC will “track” the progress of bills in considerable detail. A bill will have an effect on someone, if not everyone, and those people will want to know its status as it moves. Many organisations in Washington, DC exist just to watch bills. Some people monitor specific actions, such as when someone attaches their name to a bill as a “cosponsor”: a new cosponsor could mean that someone was an effective lobbyist. Many others watch the language of the bill: what words have been changed, which clauses deleted, which sections added. A bill tracking mechanism will tell you of any new development on the bill: hearings scheduled in committee, committee passage, floor action, amendments, speeches and votes.
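For the curious, what follows is a minimal sketch of the language-watching just described: given two versions of a bill’s text, it surfaces what was deleted and what was added. The bill text here is invented for illustration, and a real tracker would pull successive printings from an official source such as http://thomas.loc.gov/ or from a commercial service; only the diffing idea matters.

    import difflib

    # Two invented printings of a bill, standing in for the
    # "as introduced" and "as reported" versions of a real one.
    old_version = """SEC. 2. AUTHORIZATION.
    There are authorized to be appropriated $5,000,000 for fiscal year 2006.
    SEC. 3. REPORTS.
    The Secretary shall report annually to Congress."""

    new_version = """SEC. 2. AUTHORIZATION.
    There are authorized to be appropriated $2,000,000 for fiscal year 2006.
    SEC. 3. REPORTS.
    The Secretary shall report quarterly to Congress.
    SEC. 4. SUNSET.
    This Act shall cease to have effect on September 30, 2008."""

    # unified_diff prefixes deleted lines with "-" and inserted lines with "+",
    # enough to spot a changed figure, a struck clause or a new section.
    for line in difflib.unified_diff(old_version.splitlines(),
                                     new_version.splitlines(),
                                     fromfile="as introduced",
                                     tofile="as reported",
                                     lineterm=""):
        print(line)

Run against these two versions, the $3,000,000 cut and the new sunset section leap out at once; the same handful of lines, applied to successive official printings, is the skeleton of the word-watching that whole offices in Washington, DC are paid to do.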
Tracking a bill requires some knowledge of the different types of bills – simple resolutions, concurrent and joint resolutions, and bills proper. You should also be aware of the various arcane procedures controlling the flow of the legislative process, the perennial favourite being the Senate’s cloture rules. Without denigrating all the procedures (any of which could tie up or end consideration of a bill), the critical step in the process is amendments. They are the bane of the legislative monitor. No serious piece of legislation beyond the “Karl Malden Post Office” re-naming bill (S 1755) makes it through the process without being amended. Following these amendments will test all your skills, and the capabilities of any computer and any online legislative monitoring service.
Amendments are certainly not new, and the more you get into following a bill, the more you appreciate the role and impact of an amendment. They sneak in and destroy your precious bill or gut its impact. Congress can play pretty loose with amendments: this year I have seen amendments to bills that don’t exist! Amendments can be offered on the floor or in committee. Floor amendments are easy: they are usually printed in the Congressional Record. Of course, second-degree amendments made on the floor at 11:00pm will have to wait for their official printing. Committee amendments do not even have the certainty of the Congressional Record, because they are generally not printed officially until long after a committee’s markup has been held. You need a good pair of shoes, a good Rolodex, or a good monitoring service for committee amendments.
An amendment can be a little one-line change, or it can substitute the entire text of the bill with a new bill, thus hi-jacking it for other purposes. A good mechanism is needed for keeping tabs on amendments: doing it yourself can be time-consuming, and a good service will save time and effort (a sketch of such a mechanism follows below). However you do it, amendments are the part of the bill tracking process that you do not want to ignore, for it may not be a bill that interests you; it may be an amendment.
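In the same hedged spirit, here is the simplest possible sketch of such a mechanism: remember which amendment numbers have already been seen for a bill, and report only the new arrivals on each poll. The fetch_amendments() function and its sample data are hypothetical stand-ins for whatever feed or monitoring service actually supplies the information.

    from typing import Dict, List, Set

    def fetch_amendments(bill: str) -> List[Dict[str, str]]:
        # Hypothetical stand-in for a real data source: a genuine service
        # would query a feed for the amendments currently filed against the bill.
        return [
            {"id": "S.AMDT.101", "sponsor": "Senator A",
             "purpose": "Strike section 4"},
            {"id": "S.AMDT.107", "sponsor": "Senator B",
             "purpose": "Substitute the entire text"},
        ]

    def report_new_amendments(bill: str, seen: Set[str]) -> None:
        # Report any amendment not seen on a previous poll, then remember it.
        for amdt in fetch_amendments(bill):
            if amdt["id"] not in seen:
                print(f"{bill}: new amendment {amdt['id']} "
                      f"({amdt['sponsor']}): {amdt['purpose']}")
                seen.add(amdt["id"])

    seen: Set[str] = set()
    report_new_amendments("S. 1042", seen)  # first poll: both amendments reported
    report_new_amendments("S. 1042", seen)  # second poll: nothing new, silence

The point of the sketch is the seen-set rather than the data: however the amendments arrive, it is the difference between this poll and the last one that tells you something has happened to your bill.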
In the whole grand scheme of legislative bodies, Congress is not that bad. About 6% of all legislation introduced in Congress actually passes; compared to many U.S. state legislatures, that is an impressive number. The percentage is in fact much better than it used to be, though this is probably due to rule changes that lifted the old limit on the number of cosponsors a bill could have (while the limit existed, many different versions of the same bill were introduced).
Sometimes bills are introduced knowing full well they have no chance, but a certain constituent or constituency needs the reassurance of a bill in Congress. There is a whole category of legislation called ‘Private Measures’ that are purely that, usually adjusting such things as the citizenship status of a specific individual. (Private Measures are becoming less common in the modern era.)
A stand-alone bill that passes Congress in most cases is purely ceremonial, technical or private in nature. The majority of bills pass by voice vote, often by unanimous consent. Some bills garner at least some controversy and some bills are rejected, though not many.
Congressional Floor Debate and Hearings
Congressional floor debate and speeches lie at the heart of the legislative process. It is a hallmark of a democracy to have the elected representatives give voice to the opinions of the voters, or at least of the voters who voted for them. No one would deny that floor debate is important, but it frequently gets put on the back burner while matters deemed more urgent sizzle on the stove.
There is a drawback to US Congressional debate. In some respects it is a very cheap show, since few votes are ever changed by a wonderful speech; the action, the give and take, occurs elsewhere: on talk shows, in back corridors, in newspapers and blogs. No one, to my knowledge, has been defeated at re-election for being a terrible floor speaker. While in the House of Commons, for example, questions and debate can unhinge a government, debate on the floor of Congress rarely makes a difference. Debate away from the floor, in the cloakroom, in committee or elsewhere outside the public view, matters more in Washington, DC.
Most committees and subcommittees in Congress hold hearings on the issues or bills before their committee. The objective is to hear a variety of viewpoints on a particular topic, hopefully assisting Congress in their legislating. A hearing is distinctly different from a committee markup or business session, which is very influential and important. The public hearing is the markup session’s overly dramatic, less influential sibling, whose general purpose is informational.
I categorize hearings into three types: informational, oversight, and confirmation. Informational hearings on a specific topic or bill are hearings where anyone Congress chooses to invite can opine and answer questions on the topic. Witnesses can range from retired seniors unable to get prescription coverage under Medicare to baseball players testifying about steroid use. Advocacy groups, corporations, unions, local government officials, governors, even small children, testify on any topic imaginable before Congress. While this type of hearing is interesting to those who testify and maybe to the media eager to show pain and suffering, the most the hearing can do is to escalate an issue higher on the priority list. Therefore, the more heart-rending and dramatic the testimony, the better.
Another hearing type is the oversight or investigative hearing. An oversight hearing’s purpose is to keep an eye on the Executive Branch and it is one of the most important, and frequently abused or ignored, functions of Congress. Every single Cabinet Secretary will testify at least once a year before some committee in Congress. Undersecretaries and deputies who testify need to have clear and effective testimony skills on their resumes if they want to continue in the job. Usually, oversight hearings are mundane affairs reviewing that year’s budget priorities or are updates on specific projects. When things go wrong, hurricanes strike or war takes a grim turn, you can always count on Congress to hold a hearing. An agency chief, unfortunately on the job during a disaster, can only dread the committee investigation hearing that will undoubtedly follow. No one is immune; generals, admirals and titans of industry and labour have been brought down to human level at a Congressional hearing.
The third type of hearing, the confirmation hearing, is unique to the Senate. Ordinarily, this is the most boring of all hearings, unless it involves a nomination to the Supreme Court, in which case it is the most dramatic of them all. The Senate is charged, of course, with approving the nomination of high-level executive department chiefs, military officers, ambassadors, and judges. Except for most military commissions, every one of these usually involves testimony and questions from the corresponding Senate committee or subcommittee.
Despite some highly visible hearings, most are quite technical and dry. Comments made in a committee hearing rarely change the debate and usually are a formality. Comments made on the floor of either Chamber, usually equally bland, actually have more weight. When a court reviews the legislative intent of a particular piece of legislation, the committee hearing is last on the list of official importance. It is first on the list, however, for most drama fans and probably the only Congressional event most people have ever seen.
Last year Congress had an interesting experience at a Congressional hearing, when House of Commons Member George Galloway testified before a subcommittee of the Senate Committee on Homeland Security and Governmental Affairs. The affair offered an example of the marked contrast between the British and American styles of debate. Galloway is the product of a much more rhetorically combative legislature than the Subcommittee Chairman, Senator Norm Coleman, and the level of debate at the hearing was definitely one-sided: Galloway’s rough-and-tumble background allowed him to keep Coleman off balance. Senator Coleman, on the other hand, was not interested in the banter; he was just interested in getting specific Galloway responses into the record.
Additional Primary Source Materials
Additional legislative materials are more readily available than before, and they provide a different perspective on legislation. A bill that in the official documents appears to be languishing in Committee may actually be moving along in other ways: Congressional letters, agency reports, committee analysis, draft amendments and press releases may all signal life in an issue that appears stalled. These additional materials have made the real difference in monitoring Congress, and have also changed how Congress itself operates.
Much has been written about the divided state of the American electorate, driven by sharp partisan bickering and attacks. I have heard a number of explanations, ranging from the increasing number of media outlets and bloggers to the ineffective role of political parties. From my perspective, all of them may be a cause, but I would also add the ever-increasing volume of additional primary source Congressional documents.
Partisan bickering and invective is not new. Media interference in the shaping of the political world is not new. Congressional documents have been produced for as long as Congress has existed, and government documents go back to Babylon. What is new is the technology, which has pried loose many Congressional documents from their dusty files and has created documents that never existed before. A Congressional staffer during the Carter administration would have typed something up on his or her trusty IBM typewriter, then gone to some common location to make photocopies; a staffer during the Nixon administration could have made a copy somewhere, but probably used carbon paper. That was not a world for distributing documents broadly, let alone creating new ones.
American history is full of purloined letters and documents making their way into the newspapers or the public domain. Congress has been producing reports since at least 1789, but access to them has always been limited to small groups even within Congress itself, and to a few well-placed Congressional reporters. Only occasionally did they filter out, amid much controversy and publicity. The historical rarity and secrecy of these documents, now increasingly available, is part of their allure today.
You now see several hundred seemingly private letters from Members of Congress every month. You can read an equal number of reports. You may see many bills in their draft form, continuously, every day. You can read Congressional staff analysis; staffers even summarize bills and committee reports. You can read the Congressional Research Service’s reports as if they were box scores in the morning newspaper. Twenty years ago I was amazed that I could read a bill online; today you can follow the details of decision-making in the U.S. Congress while you lie on a beach in Miami. Now think about the implications of this sea-change.
Civil liberties groups, librarians and “open government” advocates are rejoicing: everything is in the open, open for debate and critique. Elected representatives are watched, corrected, chastised, and supported continuously. The smoke-filled rooms are now smoke-free. Cronyism, corruption and back-room deals are exposed to the sunlight of public inspection. Mainline media groups are no longer the arbiters of news; anyone, whether on Pennsylvania Avenue or in Boise, can see what is really happening. The American Republic is turning into one giant New England town meeting. Everyone knows the other’s little secrets; little is private, and all is seemingly known.
The other side of this coin is gridlock. Political and ideological positions are monitored, and anyone, no matter how well or ill-informed, plays a role. The curmudgeons of yesterday are now the arbiters of today’s debate. Decision-making in a closed-document world depended strongly on interpersonal relationships, trade-offs and even persuasion. I watched a group of Senators debating John Bolton’s nomination as UN Ambassador react in amazement when debate in committee actually changed a few Senators’ minds. That does not happen much nowadays.
The American Republic is becoming more democratic, perhaps more truly democratic, and the critique of democracy going back to Plato is coming closer to relevance than it has ever been. Representative democracy in Washington is all about trading, one vote on one topic in exchange for support on another; the individual representatives determine the priorities. With the new openness, these priorities are more closely and much more quickly examined. Documents, drafts and reports put the representative in the middle of a whirlwind. The individual citizen, represented on particular issues by any of several thousand groups, is now the representative. It used to be that we hoped our elected representatives used wisdom and judgment when reading the documents. Will the new citizen representatives do the same?
Primary source documents provide a view of the legislative process to an audience for whom they were originally not intended. As a result, the documents themselves have changed. In some cases, the new recipients have a better perspective than the old guard in Washington, DC, while in other cases they mobilize forces that can tie the Republic into knots. The American Republic has been transformed and monitoring what is happening in Congress is an effective way of measuring policy making in Washington, DC.
This article is based on a talk by Arnie Thomas of GalleryWatch at the British Library, February 2006.
GalleryWatch, located in Washington DC, is an online service providing advanced legislative intelligence. In the United Kingdom, GalleryWatch is available from Books Express, the exclusive reseller of GalleryWatch services: www.books-express.co.uk
See also http://thomas.loc.gov/
Resources for American Studies: Issue 58, 2005
Contents
- Welcome
- Books to Watch Girls By, Paul Woolf
- The Marischal Museum and North America: connections and collections, Neil G. W. Curtis
- US Government Publications: an untapped resource, Gill Ridgley
- American Journals and Magazines in the Arts and Humanities at the British Library, Katherine Baxter
- Best Zine Ever!, Matthew Shaw
- ‘Newly Discovered Documents’ at The Gilder Lehrman Institute of American History, Jayne Hoare
- American Studies Resources Centre, Liverpool John Moores University, Ian Ralston
- Second Air Division USAAF Memorial Library, Alexis K. Ciurczak
- John F. Kennedy-Institute Library Profile, Benjamin Blinten
- Roosevelt Study Center, Middelburg, Netherlands
- Lincolniana at the John Hay Library, Brown University
- Forthcoming Conferences
- Web Sites of Interest
Welcome
Since this journal is available on the BAAS website, the revivified Resources for American Studies can welcome not only regular subscribers but also idle surfers who have typed “girls”, “books”, “American Studies”, “Lincoln” or “Little Magazines” into Google: they’re all here. Welcome back also to those who have received the newsletter or journal of the BLARS Library Sub-Committee over the past years. Our apologies for this brief caesura: the claims notes that came in to the office gave cheer, both because of the efficiency and rigour of your serials claims systems and because it’s nice to be missed.
Although the name has changed, and there is a slight shift in tone, the aims and purpose of the newsletter have not. We still aim to provide a place for Americanists of all stripes to find common cause, and to promote American Studies and the collections upon which scholarly endeavours are based, whether in libraries or museums. We welcome contributions from all corners: in this issue, for instance, we have a personal essay by a postgraduate, which offers a perspective from the reader’s point of view.
Finally, an apology. The contents, perhaps necessarily, have a British Library bias. This simply reflects the way the articles fell, as well as the editor’s other position as curator in the Americas section of the British Library. Please help us to make amends by promoting your own holdings. We do, however, have guides to three overseas collections, and two resources within the UK.
Books to Watch Girls By
Paul Woolf, University of Birmingham
It is now eight years since I completed my undergraduate degree in English and Related Literature at the University of York. After five years away from higher education, working for a company that makes television documentaries, I returned to full-time study in 2002. I am currently in the second year of writing a PhD thesis about depictions of Anglo-American marriages in nineteenth-century fiction at the University of Birmingham.
When I was asked to write this article, about my personal experience of libraries in the UK, I found myself thinking mostly about the differences between the way that I used York’s library, and the way as a postgraduate I now use libraries, Birmingham’s as well as various others.
The very first thing that occurred to me was a fact with which I surprised myself: compared to the number of hours per week I used to spend studying in York’s JB Morrell Library, I now spend very little time actually inside Birmingham’s Main University Library. Whereas I might pass an entire working day in the JB Morrell, a typical visit to the Main University Library is a grab-and-dash affair. I go in, seek out and claim the books I need, and get out. Usually, I don’t dwell.
This is, I should stress, no reflection of any dislike of the Main University Library. It’s not perfect, but it’s by no means an unpleasant or ineffectual place. It’s user-friendly and the staff members are always welcoming. No, my initial explanation for spending more time in the library as an undergraduate was, I confess, to do with the overactivity of my late-adolescent hormones. The JB Morrell – specifically, the fourth floor – was a great place to meet girls.(1) Now older, and in a long-term relationship, I evidently do not feel the same need to take advantage of the opportunity for flirtation that university libraries offer.
This explanation soon gave way, though, to a connected, but wider one. Going to the library as an undergraduate was, at least for me, a social activity as much as an academic exercise. It was a place not only to read books and plan essays (and meet girls), but also to see friends, chat, go for a coffee, and make arrangements for the evening’s leisure. The library was, in some ways, just another of the many public spaces on the University campus where one could interact with fellow students.
If my undergraduate study and library use were public things, however, postgraduate work is a much more private affair. It requires longer periods of concentration, and being surrounded in a library by undergraduates doing exactly what I used to do – chatting with friends and making arrangements for the evening’s leisure – is an unwelcome distraction. I prefer, as I said, to go in, get my books, and go home to read them. The greater loan provision available to postgraduates – I am allowed to take out more books and keep them for longer than undergraduates can – no doubt further encourages me to treat the Main University Library as a place from which to borrow books, rather than a place in which to sit down and work.
This brings me on to another point of comparison between my undergraduate and postgraduate experiences. In the period between graduating from York in 1997 and beginning my Master’s degree at Birmingham in 2002, academia went online. During those five years, I had experienced the e-revolution in the professional workplace. When I joined it, the company for which I worked had just two computers with internet access in its main office and most communication was conducted by phone or post. By the time I left, email and internet use were so integral to the firm’s day-to-day life that a few hours during which the office server was temporarily inoperative were a few hours of almost absolute downtime.
Nonetheless, on returning to university, I was still unprepared for the extent to which one could use the internet as an academic research tool. I would have felt overwhelmed by the sheer amount of academic stuff available online had it not been for a practical, sensible compulsory research skills course for postgraduates that was run by the Main University Library’s specialist librarian for my department (American and Canadian Studies).
The School of Historical Studies at Birmingham, of which the Department of American and Canadian Studies is a part, has a computer room set aside for postgraduates. I spend much more time in this room working online than I do in the Main University Library. However, I utilise the room in much the same way as the Library: it is a place in which to assemble the materials I need to be able to work at home. (I do have a computer at home but cannot access from it all of the catalogues and databases available on PCs connected to the campus network.) From downloading journal articles on JSTOR to buying books from Amazon and Abebooks, from consulting web encyclopaedias and OED.com to searching online library catalogues and posting requests for information on H-Net’s mailing lists, I am almost as dependent on the internet now as I was in my previous job.
Amazon’s ‘marketplace’ for second-hand books and Abebooks have, in particular, been invaluable. A quick glance around my bookshelves reveals that I have purchased at least two dozen books through these two websites during the eighteen months of my PhD. Many of these are nineteenth-century novels held by only a few university libraries in the UK. I have never attempted an Inter-Library Loan, having been discouraged from doing so by one of my Master’s degree tutors because of the waiting times involved, and it usually works out that it would be substantially more expensive to travel to another university library than to buy the book. So, I prefer to purchase. In any case, these are novels that I want to read several times and consult periodically, so having the books permanently within reach makes sense.
I acknowledge I am fortunate in that, AHRB-funded and having saved some money from my five years of full-time employment, I can afford to buy most of the books I want. I am also financially able to make trips every few months to London to use The University of London’s Senate House library and the British Library. I do this when I have accumulated a list of books that I do not think are worth buying, but that I do need to read or photocopy from. A Young Person’s Railcard, for which anyone in full-time education can apply, enables me to travel to London from Birmingham at peak-time, but for off-peak fares. This means that I can get to the British Library when it opens and spend the whole day there.
I think, in fact, that I have a crush on the British Library. I cherish days passed in the stillness of its reading rooms. I have a feeling when I am there that I am enjoying something of a guilty pleasure. Even the occasional non-appearance of books carefully ordered in advance and the wallet-emptyingly high cost of photocopying at the British Library cannot deter me from visiting.
There is an added bonus to days spent in London. Many friends from my undergraduate time in York now live in the capital, and British Library trips provide an opportunity to meet up with them in the evenings. One friend, with whom I used to sit on the fourth floor of the JB Morrell Library, is convinced that I only use the British Library because it’s also a good place to meet girls. He refuses to accept my protestations that these days it’s only the books in which I’m interested.
1. Of course, when I say “meet girls,” what I really mean is “see girls I’d like to meet, and chicken out of talking to them.”
The Marischal Museum and North America: connections and collections, Neil G. W. Curtis
Neil G. W. Curtis of the Marischal Museum, University of Aberdeen
In 1824 Professor William Knight of Marischal College wrote in his catalogue of the College’s Museum that the collection included an “Indian Pouch, Indian knife, Belt of Wampum, Eight various Girdles, Belts &c used by the N. American Indians, Cloak, ornamented with Beads” and noted that “One of the Girdles and Garters were presented by Mr Ogilvie of Barras”. He also listed a number of other items with a Cherokee or Choctaw provenance, implying that much of this collection originated in the South-East in the era before the Indian Removal Act of 1830. Unfortunately, not enough is known about Ogilvie to explain how he came by this early collection, or the nature of this contact between Scotland and North America. Nonetheless, this material, together with a small group of other items, is an important record of indigenous life before Western impact.
A more expected link lies in the people from Scotland who joined the Hudson’s Bay Company. Foremost among those represented in Marischal Museum is William Mitchell (born in Aberdeen in 1802), who from the age of thirty was a sailor and trader with the Company. Becoming a master mariner in 1851, he commanded a number of their ships on the West Coast, including the Cadboro, Una, Recovery and Beaver. He is recorded as being a ‘generous, good-hearted sailor who utterly despised anything small or mean’. In 1852, when in command of the Brigantine Una, he took some gold miners to Haida Gwaii (formerly the Queen Charlotte Islands). There a harbour was named after him, but the expedition broke up in the face of native opposition. On his death in Victoria, Vancouver Island in 1876, his collection was bequeathed to the University.
Alongside items from northern coastal British Columbia, including a Chilkat blanket and Tsimshian masks, is a fine collection of Haida argillite carvings including model totem poles and panel pipes. Among them is one that depicts a paddle steamer with a beaver figurehead in the prow, presumably the Beaver, the first steamship on the American West Coast. Others depict people in European dress in combination with traditional Haida motifs. A wonderful mingling of traditions, the panel pipes are a particularly evocative reminder of the complex relationships between native people and European traders and settlers.
Transatlantic Connections
These links are also remembered in a tag attached to a small comb donated in 1929 saying ‘Esquimaux comb from Dr. Rae, Hudson’s Bay Co.’ John Rae, from Orphir in Orkney, was another, but much more famous, mid-nineteenth-century Scottish adventurer, renowned for having a greater understanding of native ways of life than most other Europeans. That these links between Scotland and Canada have continued to the present was shown by the donation of an Inuit parka by a student whose late husband, Graham Noble of Fraserburgh, had collected it while working as a storeman with the Hudson’s Bay Company in 1969-71.
Marischal Museum therefore now contains the third-largest ethnographic collection in Scotland, with a particular strength in North American material (almost 2,000 items). The Arctic collections are perhaps the most important aspect of this; they are certainly the largest. At their core is a donation by Sir William MacGregor (Governor of Newfoundland in the early years of the twentieth century), which includes archaeological material from Labrador as well as nineteenth-century ethnographic items. The most famous single item in the collection is undoubtedly a Greenlandic kayak with hunting equipment which arrived in Aberdeen in the early eighteenth century. In the 1820s catalogue it is described as ‘Eskimaux canoe in which a native of that country was driven ashore near Belhelvie, about the beginning of the eighteenth century, and died soon after landing’. The first record of this kayak is in a diary written by a Rev. Francis Gastrell of Stratford-upon-Avon who visited Aberdeen in 1760. He wrote that,
In the Church which is not used (there being a kirk for their way of worship) was a Canoo about seven yards long by two feet wide which about thirty two years since was driven into the Don with a man in it who was all over hairy and spoke a language which no person there could interpret. He lived but three days, tho’ all possible care was taken to recover him.
The distance from Greenland to Scotland is about 1,200 miles, but this could be broken into shorter stages by landing in Iceland, the Faroe Islands, Shetland and Orkney. This would be needed to prevent the kayaks becoming waterlogged and to get drinking water. Even so, it is difficult to believe in such a long journey on rough seas, particularly with the difficulties of navigating out of sight of land. There are two theories about how the Inuit could have reached the North Sea with their kayaks. The first suggests that they were kidnapped by whalers and brought to Europe as curiosities, but then managed to escape or were freed by their captors. An alternative is that the kayakers took advantage of the colder weather of the ‘Little Ice Age’ of about 1300 to 1850, when ice floes would have drifted much farther south than today and would have offered extra places on which to rest and collect fresh water. The contemporary resonances of the association with climate change have led to this theory becoming more popular, while the possibility of an indigenous North American man exploring Europe also has an appeal.
Along with another kayak ‘with paddles, darts and other implements; presented, 1800, by Captain William Gibbon, Aberdeen’, there is a third kayak in Aberdeen, in the buildings of the University’s Medical School. This may be the one in which Eenoolooapik, an Inuit visitor to Aberdeen in 1839, demonstrated his kayaking skills in the River Dee to an admiring crowd. Eenoolooapik was brought to Aberdeen by Captain Penny of the whaling ship Neptune. Sadly, when he returned to Labrador the following year, he died of tuberculosis.
Repatriation
That the links between the museum and North America are not solely those of colonial collectors was highlighted in 2003 with the repatriation of a split-horn head-dress to the Horn Society of the Kainai (Blood Tribe). The head-dress was donated to the museum in 1934 by a Mrs Bruce Miller, about whom little is known except that her family owned an Aberdeen chemical factory. It is likely that she visited the Blackfoot reservation in Montana, USA in the 1920s, collecting the head-dress, a decorated buckskin shirt, moccasins and some other items. She did not record any tribal names or other details, so for many years the head-dress was merely catalogued as a ‘war bonnet’. This reflects European attitudes towards Native American people and an ignorance of the fact that the head-dress was part of a sacred bundle. Sacred bundles are treated as living beings, cared for like a child by people to whom they are ceremonially transferred.
Twenty-first-century contact between Marischal Museum and the Kainai began when an Aberdeen graduate who was working with them realised that their missing head-dress reminded her of one in Aberdeen. On 13 November 2002 a delegation from the Horn Society visited Aberdeen to see if this head-dress was the final sacred bundle for which they had been searching. The group who came to Aberdeen consisted of an elder, Charlie Crow Chief, and his wife, together with current members of the Horn Society: Randy Bottle, Karen Bottle and Duane. They were welcomed to the University by the Principal and museum staff, after which they smudged and prayed before identifying the head-dress. We then spent the following couple of days discussing how their request would be dealt with, looking at other probable Blackfoot objects in the collection and writing a joint press release.
Preparing for the possibility of a repatriation request, the University had approved a procedure that establishes an expert panel to assess a request for repatriation against six criteria (see http://www.abdn.ac.uk/marischalmuseum/collections/policy). The panel included curators from the National Museums of Scotland and the Glenbow Museum, the latter nominated by the Horn Society, as well as representatives of the University. In addition to considering written statements from Marischal Museum and the Horn Society, the panel heard from Randy Bottle and Frank Weasel Head, an elder, whose explanations of the importance of the head-dress impressed everyone. It was striking that, although they believed it likely that the buckskin shirt had been worn by the last keeper of the head-dress, it was merely a shirt and not part of the sacred bundle, so they did not ask for it to be repatriated. Issues such as photography and the making of a replica were also discussed. They explained that there could only be four head-dresses (rather like North, South, East and West), so making a replica would be impossible, while the photography of sacred objects would be seen as disrespectful. They did, however, accept that the museum should have photographs for its archive and for use in exhibitions and lectures. The panel’s recommendation in favour of repatriation was approved by the University Court in May 2003. On 7 July 2004, at a public ceremony in the museum, ownership of the head-dress was transferred and a Memorandum of Understanding was signed to outline the conditions of the repatriation (including a promise of objects to be given to the museum) and to help us to work together in the future. At a private ceremony afterwards, the head-dress was taken into the care of the Horn Society members.
A few months later, a temporary exhibition, ‘Going home: museums and repatriation’, told the story of the repatriation of the head-dress and raised some of the issues behind other demands, displaying the copy of the Lakota Ghost Dance shirt on loan from Glasgow. My favourite visitor comments were ‘all of humanity is connected to each other’ and ‘so glad to see this as a discussion – I knew very little about procedures and cases of repatriation.’ Exhibiting the absence of an object can clearly make as much of an impact as displaying the object itself.
The links established by the repatriation have continued, including an invitation to the Sundance with my family in summer 2004. This visit was far from being just an opportunity to see an exotic ceremony (1). Rather, it was about friendship, generosity and understanding, and it gave me a deeper appreciation of the importance of the head-dress and why the repatriation mattered, as well as a richer perspective on the historical contact between the North-East and the native people of North America. Marischal Museum has now played a part in this: the current keeper noted that the head-dress is ‘in good shape’ and the link between Aberdeen and the Kainai will continue to develop. We now look forward to welcoming friends to Aberdeen again, not just to receive objects for the museum collection, but also to learn from each other about shared concerns such as land rights, the role of traditional languages in schools and how to work with the oil industry for social benefit.
The collections now in the care of Marischal Museum thus document some of the many changing connections between the people of the North-East of Scotland and the native people of North America. From the use of glass beads in the wampum belts donated by William Ogilvie and the argillite panel pipes to the repatriation request from the Kainai, many of these contacts are the product of creative actions by native people faced with the impact of European culture. The museum’s reach has recently been greatly expanded with the development of online resources funded by the Joint Information Systems Committee (JISC), which were launched in September 2004. They include a virtual version of the displays of Marischal Museum with images of all display cases and objects, the texts of all captions and QuickTime panoramas of the museum and conservation lab (see http://www.abdn.ac.uk/virtualmuseum). This is underlain by a database of some 3,000 items in the collection (http://www.abdn.ac.uk/museumsearch) with links to archival evidence, associated objects and records of changing interpretations over the last two centuries. While only a small proportion of the items illustrated derive from North America, they all show the connections mediated by people from North-East Scotland that link together many parts of the world. While some of these contacts have been unequal and destructive, some have been much more creative. The challenge for the museum is now to use these collections to increase mutual understanding of our entangled connections, including the creative, the destructive, the unequal, the painful and the challenging.
Marischal Museum contacts and website
Marischal Museum
University of Aberdeen
Marischal College
Broad Street
Aberdeen AB10 1YS
Tel: (general enquiries) +44 (0) 1224 274301
Tel: (conservation laboratory) +44 (0) 1224 274300
Fax: +44 (0) 1224 274302
Email: museum@abdn.ac.uk
http://www.abdn.ac.uk/diss/historic/museum/index.shtml
1. N. Curtis, ‘Going home: from Aberdeen to Standoff’, British Archaeology, 82, May/June 2005, pp. 40-43
US Government Publications: an untapped resource, Gill Ridgley
Gill Ridgley provides a guide to the giant US Congressional Serial Set
The British Library has excellent holdings of US government publications – a legacy of over two centuries of assiduous collecting from North America. In the Library’s previous incarnation as the British Museum Library it amassed a treasure house of important material: the federal collection has publications going back to the late eighteenth century, while the state collections include seventeenth-century material from the New England states, legislative material such as the Journals of the House of Assembly of the State of New York (starting with the first session on 10 September 1777) and many of the publications of the Confederate states. Inevitably, gaps exist, but the collection measures up very well to those of the principal US libraries.
The US official holdings consist of all the main ‘sets’: The American State Papers 1789-1838 (actually published 1832-1861), The United States Congressional Serial Set (1817 to date), much of the material not contained within the Serial Set such as Congressional Journals and Proceedings, Hearings and Committee prints, and most of the current output of the US Government Printing Office.
Researching official publications is necessarily a complex process, but matters have improved substantially with the advent of the Internet and the growth of online resources generally. In common with many publishers of official publications, the US government now makes most of its current material available on the Government Printing Office website at http://www.gpoaccess.gov/index.html, while commercial publishers such as Newsbank International, with its digital Archive of Americana, are creating exciting full-text databases which exploit the rich resources of official publications and which, most importantly, provide numerous access points. The US Congressional Serial Set has lent itself admirably to projects like this.
The Serial Set is an immense, and underused, resource. Like the British Parliamentary Papers, to which it can in part be compared, it forms an unparalleled repository of information of all kinds. Organised into a retrospective, and ongoing, series in 1895 for the GPO by Dr John G. Ames, the Set – which contains documents dating back to 1817 – forms the official bound archive of Congressional publications (currently numbering over 14,000 volumes). It consists of Congressional documents of all kinds, including administrative reports and other internal papers; Congressional reports on public and private legislation; Presidential messages (though not proclamations); treaty materials; reports from Congressionally commissioned or conducted investigations; and the annual reports of a number of non-governmental organisations such as the National Society of the Daughters of the American Revolution and the Boy and Girl Scouts of America. Confusingly, what has been included has changed over time: at certain periods the Census Statistical Abstract and the House and Senate Journals of Proceedings were numbered within the set, as were certain Executive branch publications. These have subsequently been excluded. For those intrigued by the long saga of the compilation of the Serial Set, an interesting account of the history of its printing is available online at http://www.access.gpo.gov/su_docs/fdlp/history/sset. Also of interest on the Library of Congress website is a fascinating glimpse into some of the legislative resources of the Serial Set in its American Memory pages at http://memory.loc.gov/ammem/amlaw/lawhome.html.
Apart from documents relating to the administrative and legislative functions of Congress, the Set contains a number of very detailed historical documents, the results of far-reaching deliberations of committees of inquiry. As an example, the Report of the Joint Committee on the Conduct of the War in 1863 provides an extensive – and perhaps lesser known – investigation, with testimony, into the conduct of the Battle of Bull Run. Other examples include the Commission on National Aid to Vocational Education, 1914, and the rather better known Iran-Contra investigations of 1987.
The Set is full of unexpected gems, making it a resource not only for its natural constituency of social scientists and historians but also for genealogists. Not only are there statistics, maps and illustrations in abundance, there are also numerous lists of names: of people registering patents, of army and navy pensioners, of prisoners such as those found in the Report of the Marshal of the District of Columbia, which provides details of the names, ages and offences of those held in Washington County jail on 9 December 1861. The commercially produced digital archives are particularly useful in this regard: the Newsbank set has a special ‘lists’ menu and the texts themselves are searchable.
For earlier Congressional documents, researchers should consult the American State Papers set, which covers the period from 1789 to 1838 and which was published from 1832 to 1861. As well as this resource, the Library also holds 57 rare Congressional documents published between 1792 and 1817 which are not enumerated as part of the set. They are all individually catalogued.
There is no denying that the Serial Set is bibliographically complex and therefore daunting to first time users. In addition, apart from the few archival sets in the United States, very few libraries have complete runs of the set in numerical order. This is true to a certain extent of the British Library collection, despite its relative comprehensiveness. It acquired many of the Serial Set publications as they were issued, before they were designated with special numeration by the GPO, and some titles are therefore scattered within the catalogue. Fortunately, they are readily findable with a title search.
On the whole, the most useful index to the Serial Set is the CIS US Serial Set Index, 1789-1969, but there are many other published indexes covering particular periods, such as the Tables of and annotated index to the Congressional series of United States public documents (1817-1893), the Numerical lists and schedule of volumes of the reports and documents (1933-1980) and the GPO Monthly catalogs. Online indexes such as the GPO website have added a further dimension, while the full-text commercial databases have the bonus of many additional access points.
To complement the Serial Set, the BL holds virtually complete runs of the Journals, Records and Proceedings of Congress and a good collection of Congressional Hearings (particularly after the 1950s when these publications were included in the official exchange/depository arrangements for the first time). Departmental publications, most of which are no longer part of the Serial Set, are collected where possible, while major subject collections such as the official history of the American Civil War, published in 128 volumes by the War Records Office in 1880-1902, have always been actively sought. The library also holds archive collections on microform such as the Presidential papers series and the State constitutional conventions.
Librarians frequently complain that not enough use is made of official publications. Difficulty in negotiating catalogues and indexes is certainly one reason; the other is that researchers are frequently unaware of the depth and breadth of the information they contain. The advent of new and improving online resources will help, but librarians must also recognise a responsibility to ‘advertise’ their collections and to demonstrate to potential users the riches they undoubtedly contain.
Government Printing Office:
http://www.gpoaccess.gov/index.html
An account of the history of the Serial Set’s printing:
http://www.access.gpo.gov/su_docs/fdlp/history/sset
American Memory:
http://memory.loc.gov/ammem/amlaw/lawhome.html
American Journals and Magazines in the Arts and Humanities at the British Library, Katherine Baxter
Katherine Baxter, writing as Curator, U.S. Collections, British Library
The British Library is home to a vast and varied collection of American journals and magazines in the arts and humanities. The holdings range from microfilm of John H. Payne’s The Thespian Mirror, originally published between December 1805 and May 1806, to The Last Supplement to the Whole Earth Catalog, edited by Ken Kesey and Paul Krassner in 1971.
Early Magazines
The history of periodical publishing in the U.S. is rich and well documented. The beginning of the last century ushered in a spate of books examining the publishing history of the eighteenth and nineteenth centuries. Lyon N. Richardson’s A History of Early American Magazines 1741-1789, 1931 [British Library shelfmark 20017.b.20], already draws on several earlier publications in the field. Focusing on the early years of American periodical publication, Richardson identifies characteristics that, by and large, define the field for the following hundred years. Written by and for the educated classes, these periodicals were predominantly eclectic in content but typically didactic in nature. Material from Europe, which included contemporary works and “classics”, was juxtaposed with home-grown American content such as poetry, science, politics, rhetoric, social comment, essays and occasional narrative. Periodicals were vehicles for intellectual communication and extended discussion of the issues of the day. Neither this, nor the education of the authors, guaranteed the quality or the success of a publication, and many only appeared in one or two editions. Perhaps this did not overly matter: the authors were typically highly educated amateurs for whom the production of periodicals was not a career per se but rather a pleasurable pursuit.
The end of the nineteenth century saw significant developments in every area of the publishing business, leading to a considerable professionalisation of the industry. Syndication became prevalent, distributing articles simultaneously across the U.S. and Europe. Copyright agreements with the U.K., originally intended to protect British authors from piratical reprints, now served U.S. authors sending their work to the U.K. Magazines such as McClure’s [p.p.6383.ae] began to employ staff writers who would extensively research articles for illustrated serialisation. No longer offering just the personal views and opinions of an educated but amateur elite, magazines now proffered apparently independent journalism, based on well-researched “facts” eloquently presented.
Little Magazines
The flipside of this increasing professionalism was an increasing sense of homogenisation that sparked a backlash in the form of what are now commonly termed “the little magazines”. These, in many ways, harked back to the earlier era of American periodicals in their tendency towards coterie, their intellectual pretensions, and their frequently short lifespans. Unlike earlier journals, little magazines saw many women play prominent editorial roles, a development explored in detail by Jayne E. Marek, Women Editing Modernism: “Little” Magazines and Literary History, 1995 [YA.1997.a.13656].
Since those who began to champion the modernist cause (the raw material of which appeared in these irregular and provocative little magazines) were frequently academics, it comes as no surprise to find that the next trend in publications came from the universities and colleges. With a certain inevitability, these periodicals whose cause was initially revolt and innovation became in time the bastions of received wisdom: bastions against which new writers in turn arrayed their artillery in another wave of little magazining. This later spate grew out of the beat and hippy movements. Drawing eclectically, as much from transatlantic Dadaism as from regionalised folk traditions, these magazines presented political and social radicalism in aesthetic forms. Like the modernist little magazines before them, and the late eighteenth-century periodicals before those, these publications appeared for a coterie audience and were frequently short-lived.
Contemporary Magazines
The development of both academic and independent strands in periodical publication has been symbiotically supported in the past thirty years by the proliferation of provision for “Creative Writing”, whether in the academic setting of the university or the communal setting of urban and rural writing groups. Moreover, such proliferation has led to increased hybridity in content and editorial policy. Today, the internet and the accessibility of blogging provide the possibility of even greater hybridity and have given a new lease of life to the American traditions of miscellanies, satire, and communal production.
The British Library
The British Library collections represent an extensive cross section of America’s periodical output in original copy and microfilm. They are augmented by a wide range of auxiliary material such as bibliographies and critical histories.
Of particular use is American periodicals, 1741-1900: an index to the microfilm collections, edited by Jean Hoornstra and Trudy Heath, 1979 [RAM 094.30973]. This volume provides a comprehensive introduction to the microfilm contents, through four separate indexes for periodical title, subject, editor and reel number. The microfilms themselves can also be consulted at the British Library [Series 1 mic.a.130-162; Series 2 mic.a.163-416, mic.a.3621-4212, mic.b.604/846-1966; Series 3 mic.b.606/1-771].
Twentieth-century collections are supported by predominantly thematic bibliographies, such as Nancy K. Humphrey’s American Women’s Magazines: An Annotated Historical Guide, 1989 [2725.e.1297] and Walter Goldwater’s Radical Periodicals in America 1890-1950, 1966 [2764.m.29]. The Greenwood Press series, Historical Guides to the World’s Periodicals and Newspapers, devotes the majority of its output to American publications, presenting descriptive and bibliographical information about significant publications within thematically arranged volumes, for example, American Indian and Alaska native newspapers and periodicals, 1971-1985, 1986 [2725.d.373] and Corporate magazines of the United States, 1992 [YC.1992.b.4355].
For contemporary material the CLMP (Council of Literary Magazines and Presses) Directory of Literary Magazines and Presses is invaluable and is updated each year. It lists individual presses and periodicals alphabetically, along with information on editorial policy and contact details. The listing is also available through CLMP’s website, http://www.clmp.org. A further useful website belongs to Small Press Traffic, http://www.sptraffic.org, whose links page is rich and extensive. Between them, CLMP and Small Press Traffic cover much of contemporary American web and print periodical output from the hyper-productive centres of New York and San Francisco, respectively. Both websites direct readers to publications held at the British Library as well as e-journals accessible through the world wide web. These sources, alongside the original materials gathered in the British Library collections themselves, provide invaluable resources for future historians of American periodical publication, as well as for critics of late twentieth- and early twenty-first-century literature.
Best Zine Ever!, Matthew Shaw
Matthew Shaw looks at the current zine scene as captured by Best Zine Ever! #3
The partial roof collapse during a concert by the sombre and delicate alternative rock band Yo La Tengo in Hoboken, NJ, constituted one of the least-reported disasters of recent years. Only The Onion, ‘America’s Finest News Source’, carried the tragic story of the loss of 37 record store clerks: “I haven’t seen this much senseless hipster carnage since the Great Sebadoh Fire Of ‘93,” reported one rescue worker, as he pulled out a gold and green Puma trainer. Amongst the missing, The Onion noted, were two zine publishers.1
Zines, the punk chapbooks of the literary world, were traditionally assembled by fans of 1960s pop groups (hence the name: an abbreviation of fanzine) and were distributed by mail or handed out at gigs. In time, zines became more political and personal, providing a literary outlet for adolescent angst, radical politics, oddball comedy and cartoons, and anything in between. The punk and underground music scene of the 80s, the culture wars, the Riot Grrrl movement, and queer politics all added energy and purpose to the zine movement.2 Phrases like the ‘independent magazine revolution’ began to be bandied about, and the mainstream media’s fascination with ‘Generation X’ in the early and mid-nineties brought the independent and alternative into the full glare of the capitalist mainstream. Zines, it seemed, were everywhere: critics and cultural theorists began to get a handle on the genre, and mainstream print media also got in on the act, publishing ‘best of’ collections; films such as Ghost World, which drew on the world of alternative comics, projected the ‘indie’ vibe into, if not the mainstream, then at least away from the eddies and grottoes of the past. As one ex-zinester writes, it was the time of the Great Zine Explosion and ‘seemingly every other obsessed nut in the country started putting out zines’.3 Even libraries began to collect them.4
But the brief flowering of the movement also induced criticism of a format that had become moribund, inward-looking and self-referential, more concerned with attitude and image than with the content or purpose of the product.5 By the beginning of Bill Clinton’s lame-duck period, the creative and relevant spirit of zines had been snuffed out. Zines were still being produced, but the energy, some argued, had gone. To many, zines’ defining individualism, quirkiness and indie spirit had been co-opted by commercial culture as a way to market new products. And, as any hipster knows, fashion moves on. The commercialization of zines and indie culture was seen by practitioners as a Bad Thing. Zines lost their underground cachet, the GenX media story grew old, circulation fell and numerous zines crumpled, never to return. Shrinking from the limelight, faithful zinesters returned to the darkness and authenticity of the underground, which many claimed to prefer (risking only The Onion’s satire).
http:zine?
By the late 90s, the new big story had become the internet: dotcom replaced cut-and-paste; bytes became cooler than Tipp-Ex. The greatest threat to zines, apart from collapsing concrete at Yo La Tengo concerts, could well have been the internet. No more inky fingers, staples, Xerox bills, or postage costs: the world wide web promised a digital future for twenty-first-century hipster thoughts. Sites such as http://www.blogger.com and tools such as Movable Type have made online publishing simple and often at zero cost (for those with internet access). The date-stamped, serial format of weblogging provides the perfect format for the autobiographical output typical of perzines (“personal zines”), and a quick search of Blogger, Technorati or Google unleashes an avalanche of digital zines.
But as with most predictions about the e-future, the web has simply expanded the non-virtual world, supporting and developing the printed zine scene rather than replacing it. A new wave of zines is being produced: after the Great Zine Crash of the late 90s, zinesters had simply incubated their zines or passed the torch to a new, younger generation, for whom Nirvana provided their nursery-school music. The political lurch to the right in American politics, the dominance of media conglomerates, the burgeoning NoLogo culture of anti-corporatism, and – perhaps most importantly of all – the attacks of 11 September 2001 and the subsequent political reaction galvanised the faltering zine production machine. Distribution and marketing (a word zinesters tend to shy away from, while often being adept at guerrilla marketing tactics) had posed the biggest problem for the historic zinesters of the 70s to the early 90s (although many were happy with a secretive readership). Zine production had always been driven by technology (think of the Xerox copy machine, the home computer, scanners); now distribution and marketing could be transformed. The internet could provide an online presence, pointing potential readers to the real-world zine and its spin-offs, such as clothing, music projects or books, providing a virtual magazine store, listings service and critical mass.
So, zines are still being stapled in Kinko’s across the fifty states as we sit here, but are they still relevant? And, more pertinently, should they form part of library collections?
Best Zine Ever
Edited by Greg Beans and assembled at the Independent Publishing Resource Center (http://www.iprc.org), Best Zine Ever! A review of our favorite zines of 2004 helps to answer these questions. Printed simply in black and white and running to 20 pp (including the cover), BZE sets out to list the ‘greatest zines on the planet’. It takes the view that the best zines let you dial ‘into someone else’s brain and as you read their stories, witness their art and experience their life’, and provides a selection which may well do exactly that. The review, which is stapled and photocopied in time-honoured zine format, provides the names and addresses of the eighty-or-so zines which BZE considered to be the best of the year. The zines are reviewed by a team of zinesters and zine librarians, including Brooke Young (Leeds supporter and founder of the zine collection at Salt Lake City Public Library) and Jenna Freedman (zine librarian at Barnard College, New York), each describing the merits of their pick in a humorous and sometimes idiosyncratic couple of sentences. Each page has a black and white illustration taken from one of the zines. What trends can be detected?
Unsurprisingly, given the predominance of the genre, the majority of the zines chosen are autobiographical perzines, such as America? #12 ($1 to Travis, PO Box 13077, Gainesville, FL 32604-1077):
Whether he’s hanging around his hometown of Gainesville, FL playing soccer, working at the library and going to house shows or travelling the world with his favourite punk band, Travis gets right down to it: the world is full of beautiful friends and blissful moments but just beyond our sight is unimaginable ugliness and ruin.
Zines such as Burn Collector #13 ($4, Stickfigure, PO Box 55462, Atlanta, GA 30308) can have a ‘much more literary feeling’, while still focusing on the ‘dark, jaded musings’ of the zinester, Al Burian, who deftly compares his life with that of Kilgore Trout, the main character from Kurt Vonnegut’s Breakfast of Champions: ‘if you aren’t already familiar with Burn Collector, close your curtains, unplug your phone, bake some cookies, and recline with the new issue’.
Other zines have a wider, less personal remit: in the case of Art Missive ($3-$10, Lauren Jade Martin, PO Box 150318, Brooklyn, NY 11215, laurenjaded@riseup.net), a cultural, Brooklyn-centred one: ‘insightful interviews with younger, emerging visual, book, and video artists and a photo essay on south Brooklyn’. Art Missive is also bound with a letterpress cover, adding ‘a touch of elegance.’ Stop Go Destroy #5 ($2 to Clint, 5245 College Ave, Box 411, Oakland, CA 94618) is a ‘gem’ made by ‘smart art school students’ with a black-on-black cover. Politics also features strongly, typically from a left-of-centre or anarchic perspective. East Village Inky #25 ($3 to Ayun Halliday, PO Box 22754, Brooklyn, NY 11202-2754) protests the Republican National Convention’s New York jaunt.
Some zines emerge out of distinct scenes, often bringing a political and witty take on lifestyle choices or leisure pursuits: Bearing Edge: a publication for and about drummers; Cash Flag (B-movies and horror films); Chainbreaker (political biking); Salt & Slush: Winter Recipes (vegetarian or vegan, with zinester-friendly instructions such as form dough balls ‘the size of skateboard wheels’).
Pop culture finds its zinester in Cartography for Beginners or Femme: the golden age of Wonder Woman (Mandy, femmezine@hotmail.com), ‘writings and comix about comix’, and several others listed in BZE.
Other zines reveal the format’s ability to rework and share reality, celebrating and philosophising the mundane (for example King-Cat #63 ($3 to John Porcellino, PO Box 170535, San Francisco, CA 94117, www.king-cat.net) offers ‘stories of barbers I have known, pigeons, palm tree and a California Road Trip’). More profound issues are tackled in Broken Hipster Zine ($1, Emikop, 2520 SE 43rd Ave #B, Portland, OR 97206), with drawings of Emiko’s ‘wrenching and intense story of… kidney failure and dialysis’. Body image and institutionalised medicine are explored without ‘poor-me-victim kind of narration’ in Fat Farm ($1, Max Airbour, 8728 Thermal St., Oakland CA 94605).
Many of the zines concentrate on employment, such as the remarkable stories of John Mejias in Paping #11 ($8 to John Mejias, Box 12845, E 7th, NYC 10003, www.paping.org), with ‘woodcut like comics about the immigrant kids and worn-out teachers that really win me over every time’; the (mental) underemployment of McJobs in Leeking Ink #28 & 29 ($2, Davida Briar, P.O. Box 963, Havre de Grace, MD 21078), ‘a tale of job hell’; Lululand #3 & 4 ($3 to Amy Adoyzie, c/o Lululand, PO Box 356, Van Nuys, CA 91408-0356), ‘This is it. The zine of the year!… You almost wish you were at that crappy job with Amy just so you can hear her riff on it’ (‘She was there for three and a half months. After the daily grind of clicking, dragging and typing up grant proposal after proposal, her hands became so incapacitated that she couldn’t even grip a pen. She quit. She said, “Salary and paid vacation don’t mean much if I can’t fucking turn the doorknob to leave”’); or Adventures of Corprit Boy #6 ($2 to Paul Nama, PO Box 82055, Portland, OR 97282), which capitalises on the fact that ‘all zinesters love getting mail’ by publishing forty postcards about the writer’s ‘surroundings and his thoughts on the state of the world’ while the author was sent to work in a dumptruck factory.
Transport provided the most striking theme this year, with many zinesters offering etiquette tips or lashing out at inconsiderate motorists: Constant Rider: Manners for Mover #6, ‘romance stories, hygiene hints & tips for successfully riding the bus’, Go By Bicycle #3 and Little, Official Primer of Bicycling Culture in Portland are just some of the zines dealing with the perils of getting out and about in the US. Or try Sara’s account of cycling from Bellingham, Washington, to Tucson, Arizona, in Glossalia #4 ($2 to Sara, 5711 NE 24th, Portland, OR 97211): ‘“She Rode HUNDREDS OF MILES! By HERSELF! On a BICYCLE!” (said by an old codger in Crescent City, CA)’. And there are guides to where to go, if you can get there, such as Miniature Pocket Guide to San Francisco, inspired by Portland Zinester’s Guide, night-time walks in Oakland by Clint (Stop Go Destroy #5) or a two-week educational art school tour of Cuba (Educational Tourist, $3, Microcosm Zine Distro):
This morning we travel to ISA, the only art university in Cuba… Legend has it that Che and Castro played golf here in 1961. They decided that this place could not be used for such unimportant things, ‘Let’s make a city of arts!’ they cried. I wonder if they were wearing military gear or golfing outfits?
The prize for conservation nightmare must go to Dream Whip #13 ($4, Bill Brown, P.O. Box 53832, Lubbock, TX 79453): ‘The only down side of Dream Whip is that it is bound with rubber bands, though the thickness of the zine would make binding a problem, and the rubber bands are only a problem for dorky librarians’, concludes Celia Perez, a librarian in Chicago.
What’s out? The environment seems to find fewer followers, although Frugal Environmentalist ($17.95 per annum, 4 issues, P.O. Box 45095, Seattle, WA 98145-0095) contains ‘not one bit of useless information.’ BZE is printed on recycled paper with vegetable-based inks. (RAS, I fear, is not.) Music zines are thin on the ground, and there are (I think) no zines about working at a dotcom. Both sexes and all genders are given fairly equal weight in the selection of zines, but Riot Grrrls are, it seems, scarce.
Finally, BZE, as is natural given its institutional berth at the IPRC in Portland, is North-West-coast-centric, although zines from New York and the heartland – and even Australia (All Slay: the sex issue – a Buffy the Vampire Slayer zine) – do make their appearance. Enough, one feels, for several future MA theses and the mulch for a PhD or two, at less than the cost of an academic monograph (although acquisition, processing and preservation costs are likely to be disproportionately high).
Best Zine Ever! #3 ([2004]) Tugboat Press, PO Box 12409, Pdx, OR 97212, USA.
Also available via Microcosm Zine Distro, http://www.microcosmpublishing.com.
Library Zines
The United Kingdom is presently not well served by collections of US zines, although some institutions, such as the Women’s Library, London, have notable collections of British zines (with some US copies). In part this is because zines are hostile to libraries, preferring less institutional and bureaucratic habitats as their home. They are serials (never the easiest format to collect); they are published erratically (with runs coming to a sudden stop and then revivifying, perhaps under a different title); they may be classed as ephemera (and so may be out of scope); they are usually printed on cheap, acidic paper (lying in the stacks as latent preservation time bombs); and, worst of all, they are not distributed by library suppliers (and refuse to issue invoices). These are all surmountable, with the final point perhaps the hardest to overcome. A stash of US postage stamps, dollar bills, a corporate credit card and, most importantly, access to a PayPal account may smooth the acquisition of US zines. There may also be restrictions on access: as some US librarians have discovered, zines are sometimes obscene, libellous or unsuitable for younger readers.
US zine collections can be found at Salt Lake City Public Library and San Diego State University, while the Bingham Center, Duke University, holds the Sarah Dyer zine collection.
Yahoo Groups hosts a zine librarians’ list at http://groups.yahoo.com/group/zinelibrarians/
Barnard College
http://www.barnard.edu/library/zines/index.htm
Salt Lake City Public Library (Zine Collection)
http://www.slcpl.lib.ut.us
West Coast Zine Archive in the Special Collections and University Archives at San Diego State University
http://infodome.sdsu.edu/about/depts/spcollections/rarebooks/zinesfindingaid.shtml#Intro
The Women’s Library, London
http://www.thewomenslibrary.ac.uk/cat_zine.html
Zillions of Zines: the Sarah Dyer Collection, Bingham Center, Duke University:
In July of last year, the Bingham Center received a gift that could prove to be one of our most important 20th century collections. The Sarah Dyer Zine Collection came to us in seven unassuming boxes bursting with thousands of self-published works by women and girls. The publications are opinionated and sometimes unapologetically personal. They range from photocopy-collage to slick-looking glossies, and they express the incredible breadth of women’s interests and talents, as well as the depth of their desire to communicate.
http://scriptorium.lib.duke.edu/women/newsletter/issue01/page4.html
Bibliography
Duncombe, Stephen, Notes from Underground: zines and the politics of alternative culture (London & New York: Verso, 1997).
Eggers, Dave, ed., The Best American Nonrequired Reading (2002), pp. 105-108.
Friedman, R. Seth, ed., The Factsheet Five Zine Reader: the best writing from the underground world of zines (New York: Three Rivers Press; London: Random House, c.1997).
Gunderloy, Mike & Cari Goldberg Janice, The World of Zines: a guide to the independent magazine revolution (New York: Penguin Books, 1992).
Perris, Kate, Unearthing the underground: a comparative study of zines in libraries, dissertation for the MA in Information Services Management, London Metropolitan University, August 2004.
Robbins, Trina, From Girls to Grrrlz: a history of ♀ comics from teens to zines (San Francisco: Chronicle Books, 1999).
http://grrrlzines.net/writingonzines.htm includes an extensive bibliography
http://www.zinebook.com
1. http://www.murmurs.com/talk/archive/index.php/t-5951.html; http://www.theonion.com/onion3813/record-store_clerks.html
2. Stephen Duncombe, Notes from Underground: zines and the politics of alternative culture (London & New York: Verso, 1997), pp. 141-173. For an example of the commercial interest in the post-baby boomer market labelled ‘Generation X’, see The Generation X Market (New York: [Packaged Facts], 1996). Douglas Coupland, Generation X: tales from an accelerated culture (New York: St Martin’s Press, 1991), may be to blame. A commentary on zines’ brief breaking of the surface of mainstream culture can be found at Chip Rowe, ‘What They’re Saying About Us’, www.zinebook.com, http://www.zinebook.com/resource/biblio.html (accessed 12 April 2005), which lists articles in The New York Times, Rolling Stone, Newsday and Penthouse. Since the early ‘90s, many critics have argued that the underground has been colonised by mainstream, commercial culture and that the distinction is only made when marketers appropriate the underground in search of the cool.
3. John Marr, ‘Zines are Dead’, Bad Subjects, issue 46 (December 1999), http://www.eserver.org/bs/46/marr.html?source=zinebook (accessed 12 April 2005).
4. Mike Gunderloy & Cari Goldberg Janice, The World of Zines: a guide to the independent magazine revolution (New York: Penguin Books, 1992); R. Seth Friedman, The Factsheet Five Zine Reader: the best writing from the underground world of zines (New York: Three Rivers Press, c.1997); Duncombe, Notes from Underground.
5. For a collection of valedictory essays and criticisms, see ‘Zine Controversies’, www.zinebook.com, http://www.zinebook.com/directory/zine-controversies.html (accessed 12 April 2005). Similar charges have been levelled at the hand press movement.
‘Newly Discovered Documents’ at The Gilder Lehrman Institute of American History, Jayne Hoare
Jayne Hoare, Cambridge University Library
The US-based Gilder Lehrman Institute of American History, founded in 1994 to promote the study and love of American history, maintains a substantial collection of more than 60,000 documents detailing the social and political history of the United States. The collection’s holdings include manuscript letters, diaries, maps, photographs, printed books and pamphlets ranging from 1493 through modern times, and it is particularly rich in materials from the Revolutionary, Antebellum, Civil War and Reconstruction periods. Highlights of the collection include signed copies of the Emancipation Proclamation and the Thirteenth Amendment, a rare printed copy of the first draft of the Constitution, and thousands of unpublished Civil War soldiers’ letters. Letters written by George Washington, Thomas Jefferson, Abraham Lincoln, Frederick Douglass and others vividly record the issues and events of their day. The writings of such notable women as Lucy Knox, Mercy Otis Warren and Catherine Macaulay discuss a variety of military, political and social issues.
The collection is on deposit at the New-York Historical Society, where researchers can visit by appointment. To make its resources more internationally available to researchers, librarians and teachers, the Institute has undertaken to publish some of the documents on its website. ‘Newly Discovered Documents’ are published every two weeks and are among the rarest and most valuable materials in the collection. Anyone who wants to know instantly when a new document is published can sign up to be notified via email.
Featured documents already published are listed below, with brief descriptions taken from the Institute’s web pages (http://www.gilderlehrman.org).
Martha Washington to Francis B. Washington – In one of Martha Washington’s earliest known letters, she shows conflicting feelings about balancing her family life with her role as a political wife.
Diary of a Black Soldier in the 8th U.S. Colored Troops, Company G, and Eyewitness – William P. Woodlin, a musician in Company G of the 8th Regiment of the United States Colored Troops (8th USCT), compiled a 123-page diary describing his military service from November 1863 to December 1864. His near daily reports, told in a stoic and detached voice, provide a window into the life of African American soldiers on the front line.
The “Three-Fifths Clause” – This broadside detailing data from the 1800 census in New York provides a sense of the pervasiveness of slavery, even in a northern state like New York.
“… we Cannot indulge in grief however mournful yet pleasing.” – In this beautifully written letter, Robert E. Lee attempts to console his son on the loss of his son’s wife. The letter demonstrates the tremendous emotion Lee felt for his family and offers a glimpse of the strength that carried Lee through the war.
Provisional Army Orders Detailing Ceremony in Honor of George Washington’s Death – Washington could never have held back the outpouring of national grief, despite his specific request to “be Interred in a private manner, without parade, or funeral Oration.”
George Washington to New Hampshire, December 29, 1777 – George Washington’s words in this letter represent a stirring plea for help at the darkest moment of the American Revolution. This famous letter illustrates Valley Forge as an icon of American perseverance and resolve in the face of cruel fortune and overwhelming odds.
Political Intrigue and the Electoral College – The Jeffersonians felt threatened by Burr’s ambition and took out an insurance policy with the passage of the Twelfth Amendment.
Mutiny – The plot to either kidnap or assassinate George Washington was never close to reaching its lofty goals, but it did point toward disaffection in the Continental Army.
Hamilton vs. the Partisan Press – Alexander Hamilton made verbal jousting in the press a venerated American tradition. He took full advantage of the freedom of the press outlined in the Bill of Rights, as did his innumerable enemies.
“That Monster, the constitution.” – This unique copy of the Constitution, printed in the early spring of 1788 by Claxton and Babcock in Albany, New York, can be seen as a last minute offensive by the Federalists to garner support for the proposed government.
“- 5 VIII (-),IV,X 5VII1 III IV IX made of more V 10 1 5 III IX II 5 IX 1 VII 8 5 10 VIII IX 7 5 III II ( ) VIII 1 10 0 The advice comes with double force ~” – Written to his son-in-law on Friday, July 20, 1804, nine days after his duel with Alexander Hamilton, this letter offers the possibility that hidden in the mystery of his cipher lie Burr’s genuine motives, plans, and feelings at this critical moment in American history.
“I doubt whether we should have had a real Union but for Hamilton; I think you must know that Jefferson would never have given us one.” Letter from Horace Greeley to Henry Stephens Randall – Written in 1861, this letter from Horace Greeley to Henry Stephens Randall emphasizes Alexander Hamilton’s role in building a strong federal government and stable economy.
The Declaration of Independence – The Declaration of Independence called for recognition of fundamental rights that demanded protection. The Revolution secured American Independence and the Constitution codified a means to maintain American liberty. Alexander Hamilton may not have signed the Declaration, but he certainly left his imprimatur on the new government it promised.
The Preamble to the Constitution of the United States – Where the draft’s opening reflects the sense of the thirteen states as separate entities, the final version’s “We the People of the United States” invokes the Hamiltonian vision of a united nation.
“General Hamilton was this morning woun[d]ed by that wretch Burr” – Mere hours after the duel, Angelica Church writes in haste to her brother Philip Schuyler to break the news to him, expressing her futile hope that Hamilton would recover. The hasty scrawl of her handwriting suggests the degree of her distress.
“The expediency of encouraging manufactures in the United States…appears at this time to be pretty generally admitted”
Alexander Hamilton’s Report on Manufactures – Hamilton offered a remarkably modern economic vision based on investment, industry, and expanded commerce. Most strikingly, it was an economic vision with no place for slavery.
“Jefferson is in my view less dangerous than Burr”
Alexander Hamilton on the Deadlocked Presidential Election – This letter is one of a stream that Hamilton sent fellow Federalists during the deadlock that followed the election of 1800. They were among the most consequential Hamilton ever wrote, for both Hamilton and the nation.
“To be thus monopolized, by a little nut-brown maid like you”. A Love Letter from Alexander Hamilton to Elizabeth Schuyler – In this intimate letter to Elizabeth Schuyler, Hamilton casts himself as both a lover and a statesman. His charm succeeded, and the two were married on December 14, 1780, at Betsey’s family home near Albany, New York.
“The want of money makes us want everything else” – In this letter to a French diplomat, Hamilton cannot refute his ally’s gloomy view of the war. By October 1780 Hamilton was discouraged by the apparent apathy of the American people and the ineffectuality of their elected representatives, as well as by the recent discovery of Benedict Arnold’s treachery.
“I never mean… to possess another slave by purchase”:
George Washington on the Abolition of Slavery – Among all the well-known founders who were major slaveholders at the time of the Revolution, George Washington was the only one who actually freed his slaves. Yet Washington never spoke out publicly against the institution of slavery. Instead, he arrived at his conclusion that slavery was immoral and inconsistent with the ideals of the American Revolution gradually, privately, and with difficulty.
“In the End You Are Sure to Succeed”: Lincoln on Perseverance – In one of Lincoln’s most accomplished personal letters, he writes to George Clayton Latham, a friend of his son Robert, on perseverance. This gem of optimistic correspondence testifies as eloquently to Lincoln’s own perseverance, discipline, and uncompromising work ethic as it does to his extraordinary ability to inspire others.
“Heaven Alone Can Foretell”: Washington’s Reluctance to Become President – In this April 1789 letter to General Henry Knox, George Washington’s friend from the Continental Army who now served as Secretary of War, Washington accepts the inevitability of his election to the presidency, but with remarkable reluctance.
“Your Late Lamented Husband” – Following the assassination of Abraham Lincoln, widow Mary Todd Lincoln sent Frederick Douglass the President’s “favorite walking staff.” In his remarkable letter of reply, Douglass assured the First Lady that he would forever possess the cane as an “object of sacred interest,” not only for himself, but because of Mr. Lincoln’s “humane interest in the welfare of my whole race.”
Thomas Jefferson on Politics & Principles – Apart from his authorship of the Declaration of Independence, Thomas Jefferson is best known to us through his letters. A conscientious correspondent as both public servant and private citizen, Jefferson wrote tens of thousands of letters over a period of some 65 years, many of which remain unpublished. But their numbers are less notable than their wide-ranging and highly quotable content, which is matched by the skill and precision with which he wrote.
John Quincy Adams and the Amistad Case – Abolitionists enlisted former U.S. President John Quincy Adams to represent the Amistad captives’ petition for freedom before the Supreme Court. Adams, then a 73-year-old U.S. Congressman from Massachusetts, had in recent years fought tirelessly against Congress’s “gag rule” banning anti-slavery petitions. Here, with characteristic humility, Adams accepts the job of representing the Amistad captives, hoping he will “do justice to their cause.”
“I love you, but hate slavery.” – On October 4, 1857, Frederick Douglass wrote an extraordinary letter to his former master, Hugh Auld. At the heart of this letter, written when Douglass was 39 and already famous as an abolitionist leader, is the great man’s effort to recover facts and dates from his childhood.
Account of Sinking of the Titanic – This letter, written on Carpathia stationery by a first class passenger on the Titanic, is one of the earliest, most immediate and compelling accounts of the disaster.
“Newly Discovered Documents” and an email notification service can be found on the Gilder Lehrman Institute’s web pages at:
http://www.gilderlehrman.org/collection/documents.html
American Studies Resources Centre, Liverpool John Moores University, Ian Ralston
Ian Ralston outlines the purpose and holdings of the Centre
Established in 1987 with the aid of a grant from the United States government and with the support of the British Association for American Studies, the American Studies Resources Centre (ASRC) began its work by supporting the study of the United States in secondary schools and community colleges across the United Kingdom, responding directly to research requests from both students and teachers. An audiovisual loan service, the publication of an annual magazine and a programme of student conferences complemented this work. Over several years the ASRC expanded its remit to include undergraduate students of American Studies, as well as private requests from the general public and the media, both in the UK and abroad. A highly successful website, ARNet (http://www.americansc.org.uk), was also created to support this work. The website now includes a full listing of every American Studies undergraduate degree and MA programme offered by UK universities, as well as an online magazine and a book reviews section.
At present the ASRC continues to support all of the areas noted above. It has also expanded its conference programme, hosted numerous visiting US academics, and arranged guest lectures. The ASRC maintains close links with a number of American universities, academics and writers, and has a US-based Advisory Panel.
Located in the Aldham Robarts Learning Resource Centre, the ASRC welcomes visits from school or college groups for study days, as well as from interested individuals. The audiovisual loan service remains open to teachers and lecturers, and visiting groups are welcome to make use of the ASRC’s collection of texts on all aspects of the study of the USA. A collection of materials on the McCarthy period, recently donated to John Moores University, will also become available to users in the near future.
For details of all ASRC services, visit the ASRC website, ARNet, at http://www.americansc.org.uk.
The opening hours of the ASRC are subject to semester variation; therefore, before any visit, users are strongly advised to email the Director (info@americansc.org.uk) or telephone 0151 231 3241.
Second Air Division USAAF Memorial Library, Alexis K. Ciurczak
Alexis K. Ciurczak, Fulbright Librarian, Second Air Division Memorial Library
During the Second World War over 6,700 young Americans, members of the 2nd Air Division of the 8th United States Army Air Force, based in Norfolk and Suffolk, England, lost their lives in the line of duty. The Second Air Division Memorial Library grew out of an idea of three senior officers of the 2nd Air Division: B/Gen Milton Arnold (2nd Combat Wing), Col. Frederick Bryan, Deputy Chief of Staff, Headquarters 2nd Air Division, and Lt. Col. Ion Walker, Ground Executive, 467th Bomb Group. A fundraising appeal, launched before hostilities ended in 1945, sought to create a permanent memorial to their fallen comrades. The result was the 2nd Air Division Memorial Library, which, through the Memorial Trust, has been funded largely by the 2nd Air Division Association, an organisation of former members of the 2nd Air Division USAAF.
The purpose of the Library is to house a collection of materials about American freedom, culture, and life, about the Second World War in the air, and about the special relationship between the people of the United Kingdom, specifically the people of East Anglia, and the people of the United States.
The Second Air Division USAAF Memorial Library is a unique war memorial as well as a fully functioning library, open to the public. It is spread over approximately 2,000 square feet (185 square metres) on the ground floor of the Norfolk and Norwich Millennium Library, in The Forum, located in the city centre. The library consists of a book collection of over 4,700 titles, numerous current American periodical subscriptions, archival material, and a collection of paintings and memorabilia. Its subject strengths are World War II, US Air Force and aviation history, and unit histories of the Bomb Groups of the 8th Air Force. The archive collection, currently located at the Norfolk Record Office, consists of records of the various Bomb Groups and their personnel. The library also has an extensive collection of videos and oral history material relating to veterans’ experiences in WW II, as well as a large collection of original photographs of the airfields in Norfolk and Suffolk of the “Mighty Eighth” (8th Air Force). In addition to the World War II collection, the library focuses on American Studies materials, notably American art, history, music (jazz and blues), Native American art and culture, biography, cookery, quilting and other American crafts, and current travel guides.
As the library is an official War Memorial, the Shrine Area is designed to be a place for peaceful reflection and meditation. The Roll of Honour, listing all those from the Second Air Division who lost their lives in the line of duty between 1942 and 1945, is on display here, as are the standards and banners of the various Bomb Groups.
Visitors to the Memorial Library include returning veterans from the 8th Air Force and their families, descendants of these veterans researching family and flight mission history, aviation and World War II enthusiasts and the general public. Students from the American Studies programs in the area use the collection as well.
Each year, the Second Air Division sponsors a Fulbright Fellow to serve the Trust as the American Librarian in the Memorial Library. Both the Trust Librarians and the Library Information Assistants can assist visitors with their information needs and research questions during regular opening hours, which are currently Monday to Saturday 9:00am – 5:00pm, with a later opening of 10:00am on Tuesdays. Phone and email enquiries are also accepted, and photocopying facilities and free Internet access are available.
The Forum, Millennium Plain
Norwich
Norfolk
NR2 1AW
Phone 01603 774747
Fax 01603 774749
Email: 2admemorial.lib@norfolk.gov.uk
Web site: http://www.2ndair.org.uk
John F. Kennedy-Institute Library Profile, Benjamin Blinten
Benjamin Blinten, Head Librarian, Library of the John F. Kennedy-Institute
The Library of the John F. Kennedy-Institute (JFKI) was founded in 1952 and today holds the largest collection of North American Studies materials in Germany. Besides serving teachers and students of the Kennedy-Institute, it is an important European source for research on North America. This role is acknowledged by the Deutsche Forschungsgemeinschaft, which has included the JFKI Library in its grant scheme, supporting the acquisition of microform collections on ethnic minorities and of North American newspapers on microfilm. The library is also supported by the Canadian and American embassies in Berlin. Every year, between forty and fifty scholars from all over Europe visit the library through a grant program funded jointly by the Canadian embassy and the Freie Universität Berlin, of which the JFKI is a part.
Today the library holds about 900,000 volumes, 70% of which are in microform, and subscribes to about 350 periodicals. The large microform collections include Early American Imprints, Pre-1900 Canadiana and Early American Newspapers. Subject areas covered are literary studies, cultural studies, history, sociology (including urban and women’s studies), political science, economics and – to a lesser degree – language (American English), art, religion, education and geography. The library also owns 16,000 slides, 2,000 videos and 1,500 records. It is connected to the electronic services of the University Library of the Freie Universität, including e-journals and bibliographic databases such as MLA, America: History and Life, Sociological Abstracts and Contemporary Authors.
The library is open free of charge to the general public, though the lending service is restricted to residents of Berlin and the surrounding area. Almost all books and a large part of the microforms are available in open stacks. Since 2000 the library has used the Dewey Decimal Classification for shelving new acquisitions, although the larger part of the book collection is organised by the old classification system created especially for the JFKI library in the 1960s and 1970s. The reading room offers twenty workstations with PCs and 66 without, several copying machines and microform readers, two reader-printers and two scanners – one for microforms and one for books. There is also a screening room equipped with VHS and DVD players and a stereo set.
Details of the collections can be found at
http://www.fu-berlin.de/jfki/index_e.html
Freie Universität Berlin
John F. Kennedy-Institute for North American Studies
Lansstrasse 7-9
14195 Berlin
phone: +49 (0)30 838 52703
fax: +49 (0)30 838 52882
email: jfki@zedat.fu-berlin.de
Roosevelt Study Center, Middelburg, Netherlands
The Roosevelt Study Center is a research institute, conference centre, and library on twentieth-century American history located in a twelfth-century abbey in Middelburg, the Netherlands. It is named after three famous Americans, whose ancestors emigrated from Zeeland, the Netherlands, to the New World in the seventeenth century: President Theodore Roosevelt (1858-1919), President Franklin D. Roosevelt (1882-1945), and Eleanor Roosevelt (1884-1962).
The Roosevelt Study Center (RSC) is subsidized by the Royal Netherlands Academy of Arts and Sciences, the Province of Zeeland, and private corporations and institutions. The RSC cooperates in various ways with Dutch universities on research projects, as well as with the Theodore Roosevelt Association and the Franklin and Eleanor Roosevelt Institute. It is a founding member of the American Studies Network, an association of twenty of the foremost American Studies centers in Europe. The RSC publishes an annual newsletter, The Roosevelt Review, and an in-house publications series.
Research in American history
The Roosevelt Study Center offers the following resources for research in American history:
Primary sources
The RSC has 93 collections of primary documents, most of them on microfilm or microfiche; some have also been published in hardbound volumes. The documents can be consulted on microform readers, and a reader-printer can photocopy documents from both microfiche and microfilm (€0.10 a copy). The collection list indicates whether each is a microfilm (FILM) or microfiche (FICHE) collection, and almost all collections have a guide or index. It is also possible to make prints from digital collections (€0.10 a copy).
Books
The RSC maintains a reference library with 7,000 volumes. Most books can be borrowed free of charge, with a maximum of five volumes per visitor. An online catalogue allows searches by title, author, and subject. A detailed systematic guide is available. A photocopier is available at the secretary’s office.
Periodicals and Newspapers
The RSC subscribes to 27 scholarly journals and has a collection of 15 historical magazines, partly on microform. Periodicals cannot be checked out. A digital copy of The New York Times can be accessed.
Videos and DVDs
The RSC owns a collection of 400 videos on aspects of American history and DVDs with newsreels on the Roosevelts. A detailed subject guide with summaries of the videos is available. Viewers should make an appointment with the secretary.
Films
The RSC has 75 films on 35 mm, mostly historical documentaries of the Theodore Roosevelt period and the immediate post-World War II period.
Conferences
The Roosevelt Study Center is the venue for a number of conferences. Each June the RSC hosts the annual meeting of the Netherlands American Studies Association, and in April of odd-numbered years European historians of the United States meet at the RSC for a conference. Apart from these events, the RSC stages international seminars and conferences on specialized subjects and participates in conferences organized by related institutes.
Recent conferences include a Netherlands American Studies Association (NASA) conference on “The Stories of World War II” at the Vrije Universiteit Amsterdam; a bilingual conference, also at the Vrije Universiteit, on “Morsels in the Melting Pot: The Persistence of Isolated Dutch Communities in North America, 1800-2000”; and a symposium in Utrecht on “Nederlandse media en de Amerikaanse presidentsverkiezingen” (Dutch media and the American presidential elections).
RSC Research Grant
European scholars at all stages of their careers (advanced students preparing for a master’s or doctoral degree, and scholars preparing a publication) are invited to apply for an RSC Research Grant. The grant consists of a per diem of €30 (covering bed and breakfast in a low-budget hotel), payment for a rail or ferry ticket, and a lump sum of €45 for photocopies. The research period at the Roosevelt Study Center must last between one and four weeks, and the maximum grant is €950.
All applications for an RSC research grant involving work leading to a master’s or doctoral degree must be endorsed by the professor supervising the work. The Roosevelt Study Center can only offer a limited number of grants and will divide them among applicants from different European countries.
Applications for a Roosevelt Study Center research grant should be submitted at least two months before the desired period of research. Applications, including the completed forms, a description of the research project (1-3 pages), and a curriculum vitae, should be submitted to:
Prof. Dr. Cornelis A. van Minnen
Executive Director
Roosevelt Study Center
P.O. Box 6001
4330 LA Middelburg
THE NETHERLANDS
Lincolniana at the John Hay Library, Brown University
Brown University Library Special Collections
The Special Collections of the Brown University Library contain more than 2,500,000 items, well over half the library’s total resources. Holdings range from Babylonian clay tablets and Egyptian papyri to books, manuscripts and ephemera. Among the more unexpected items are portraits and paintings by old masters, Elizabeth Barrett Browning’s tea set, Amy Lowell’s cigars, 5,000 toy soldiers, and locks of hair from several famous heads – among them, Lincoln, Napoleon, and Sir Walter Scott.
The collections, housed in the John Hay Library, include some 300,000 monographs, 725,000 manuscripts, 500,000 pieces of sheet music, and 50,000 each of broadsides, photographs, prints and postage stamps, plus over one million archival files and records. Among the most notable holdings are the world’s largest collection of American poetry and plays, one of the nation’s finest history of science collections, an exceptional collection of Lincolniana, and an internationally known collection on military history. There are also important collections of incunabula, collections devoted to the writings of major individual authors, such as Poe, Thoreau, Zola, and William Blake, and manuscript and archival collections that offer research opportunities in a wide variety of historical and literary subjects.
The approximately 250 separate collections have been accumulated over more than two centuries. They fall into two general categories: books or other research materials originally included in the library’s general collections and later isolated because of value, age, rarity, fragility, historical interest, or potential for enhancement of a specific special collection; and collections or individual items acquired with the intention of retaining their separate identity.
Charles Woodberry McLellan Collection of Lincolniana
A collection of more than 30,000 items in various media by and about Abraham Lincoln, 16th president of the United States, and about the historical and political context of his life and career, chiefly the U.S. Civil War and its causes and aftermath. The collection of Charles Woodberry McLellan, one of five great Lincoln collectors at the turn of the 20th century, was acquired for Brown University in 1923 by John D. Rockefeller, Jr., Class of 1897, and others, in memory of John Hay, Class of 1858, one of Lincoln’s White House secretaries; in the ensuing 75 years it has grown to more than five times its original size.
The books and pamphlets include 85-90 percent of the titles in Jay Monaghan’s Lincoln Bibliography, 1839-1939 (many in multiple editions and variant copies), as well as many thousands of volumes of contemporary and later publications relating to the Civil War and the slavery controversy. In conjunction with the Harris Collection, the John Hay Library holds probably the largest collection anywhere of poems about Lincoln. There is also a good selection of representative titles of books that Lincoln read.
The manuscript collection includes original letters, notes, and documents, over 950 written or signed by Lincoln; material relating to Lincoln’s family and associates; and facsimiles of manuscripts held by other institutions. The broadsides include song sheets, political sheets, ballots, and posters; also 27 of the 52 printed editions listed in Charles Eberstadt, Lincoln’s Emancipation Proclamation. There is a selection of newspapers for 1860-1865; an index to the 11,300+ entries for Lincoln items in all existing files of Illinois newspapers to the end of the Civil War; and photocopies of the clipping files of the Louis A. Warren Lincoln Library and Museum, Fort Wayne, Ind.
The prints, arranged according to Meserve numbers, include most of the known photographs of Lincoln, engravings, and Currier & Ives prints. There are also original oil portraits by artists of Lincoln’s day, most notably the portrait by Peter Baumgras (1827-1903), some original drawings, and a scrapbook of Thomas Nast’s Civil War sketches. The statuary includes two Rogers groups, an original Truman Bartlett plaster statuette, and replicas of Leonard Volk’s work. The sheet music comprises every known piece relating to Lincoln, including funeral marches, memorial songs, and campaign songs. The museum objects include over 550 medals, mourning and campaign badges, coins, postage stamps, etc.
Holly_Snyder@brown.edu
http://dl.lib.brown.edu/collatoz/
For further information on the Civil War history collections at Brown, see also the following Bibliofile article:
http://www.brown.edu/Facilities/University_Library/publications/Bibliofile/Biblio32/civilwar.html
N.B. The information and text on the John F. Kennedy Institute, the Roosevelt Study Center and the John Hay Library have been sourced from the relevant web pages, with the kind permission and assistance of the librarians and curators.
Forthcoming Conferences
Points of Contact: The Heritages of William Carlos Williams
July 27-29, 2005
Johann Wolfgang Goethe University, Frankfurt am Main, Germany.
Lesbian, Gay, Bisexual, and Transgender Theatre
July 28-31, 2005
Westin St. Francis Hotel, San Francisco, California.
(Trans)Forming Bodies
5 August 2005
School of American and Canadian Studies, University of Nottingham.
12th Olomouc Colloquium of American Studies
4-9 September 2005
Palacky University, Olomouc, Czech Republic.
Paddy’s Lamentation: Ireland and the American Civil War
13 October 2005
Poetry Centre, South Wing (Room S.1.5), Faculty of Arts, University of Manchester.
US Political Cartoons
18 October 2005
Day Conference, British Library.
Organised by the Eccles Centre for American Studies
http://www.bl.uk/collections/americas/america.html
Transatlantic James
10-13 November 2005
MMLA: Milwaukee, Wisconsin.
The United States in the 1980s: the Reagan Years
10-12 November 2005
A three-day interdisciplinary conference examining the subject of the United States in the 1980s.
Rothermere American Institute, Oxford University
http://www.rai.ox.ac.uk/seminars/index.html
5th MESEA Conference – The Society for Multi-Ethnic Studies: Europe and the Americas
18-20 May 2006
Pamplona, Spain.
The Atlantic World of Print in the Age of Franklin of Philadelphia
29-30 September 2006
Sponsored by the Library Company of Philadelphia, McNeil Center for Early American Studies, and the School of Arts and Sciences, University of Pennsylvania. In conjunction with the American Printing History Association annual meeting.
Current listings are available on the BAAS website.
Web Sites of Interest
France in America
The Library of Congress and the National Library of France (Bibliothèque nationale de France) have launched a bilingual (English-French) online presentation that explores the history of the French presence in North America and the interactions between the French and American peoples from the early 16th to the early 19th centuries.
http://gallica.bnf.fr/FranceAmerique/fr/default.htm
http://international.loc.gov/intldl
FirstGov US Government Graphics and Photographs
http://www.firstgov.gov/Topics/Graphics.shtml