American cities in the late nineteenth century

By Brinkley, A.

Edited by Paul Ducham


“We cannot all live in cities,” the journalist Horace Greeley wrote shortly after the Civil War, “yet nearly all seem determined to do so.” The urban population of America increased sevenfold in the half century after the Civil War. And in 1920, the census revealed that for the first time, a majority of the American people lived in “urban” areas—defined as communities of 2,500 people or more. New York City and its environs grew from 1 million in 1860 to over 3 million in 1900. Chicago had 100,000 residents in 1860 and more than a million in 1900. Cities were experiencing similar growth in all areas of the country.
Natural increase accounted for only a small part of the urban growth. In fact, urban families experienced a high rate of infant mortality, a declining fertility rate, and a high death rate from disease. Without immigration, cities would have grown slowly, if at all. The city attracted people from the countryside because it offered conveniences, entertainments, and cultural experiences unavailable in rural communities. Cities gave women the opportunity to act in ways that in smaller communities would have been seen to violate “propriety.” They gave gay men and lesbian women space in which to build a culture (even if still a mostly hidden one) and experiment sexually at least partly insulated from the hostile gaze of others. But most of all, cities attracted people because they offered more and better-paying jobs than were available in rural America or in the foreign economies many immigrants were fleeing.
People moved to cities, too, because new forms of transportation made it easier for them to get there. Railroads made simple, quick, and inexpensive what once was a daunting journey from parts of the American countryside to nearby cities. The development of large, steam-powered ocean liners created a highly competitive shipping industry, allowing Europeans and Asians to cross the oceans to America much more cheaply and quickly than they had in the past.


As a result of urbanization, the late nineteenth century became an age of unprecedented geographic mobility, as Americans left the declining agricultural regions of the East at a dramatic rate. Some who left were moving to the newly developing farmlands of the West. But many were moving to the cities of the East and the Midwest.
Among those leaving rural America for industrial cities in the late nineteenth century were young rural women, for whom opportunities in the farm economy were limited. As farms grew larger, more commercial, and more mechanized, they became increasingly male preserves; and since much of the workforce on many farms consisted of unskilled and often transient workers, there were fewer family units than before. Farm women had once been essential for making clothes and other household goods, but those goods were now available in stores or through catalogs. Hundreds of thousands of women moved to the cities, therefore, in search of work and community.
Southern blacks were also beginning what would be a nearly century-long exodus from the countryside into the cities. Their withdrawal was a testament to the poverty, debt, violence, and oppression African Americans encountered in the late-nineteenth-century rural South. The opportunities they found in cities were limited. Factory jobs for blacks were rare, and professional opportunities almost nonexistent. Urban blacks tended to work as cooks, janitors, domestic servants, and in other low-paying service occupations. Because many such jobs were considered women’s work, black women often outnumbered black men in the cities.
By the end of the nineteenth century, there were substantial African American communities (10,000 people or more) in over thirty cities—many of them in the South, but some (New York City, Chicago, Washington, D.C., Baltimore) in the North or in border states. Much more substantial African American migration was to come during World War I and after; but the black communities established in the late nineteenth century paved the way for the great population movements of the future.
The most important source of urban population growth in the late nineteenth century, however, was the arrival of great numbers of new immigrants from abroad: 10 million between 1860 and 1890, 18 million more in the three decades after that. Some came from Canada, Mexico, Latin America, and—particularly on the West Coast—China and Japan. But by far the greatest number came from Europe. After 1880, the flow of new arrivals began for the first time to include large numbers of people from southern and eastern Europe: Italians, Greeks, Slavs, Slovaks, Russian Jews, Armenians, and others. By the 1890s, more than half of all immigrants came from these new regions, as opposed to less than 2 percent in the 1860s.
In earlier stages of immigration, most new immigrants from Europe (with the exception of the Irish) were at least modestly prosperous and educated. Germans and Scandinavians in particular had headed west on their arrival, either to farm or to work as businessmen, merchants, professionals, or skilled laborers in midwestern cities such as St. Louis, Cincinnati, and Milwaukee. Most of the new immigrants of the late nineteenth century, however, lacked the capital to buy farmland and lacked the education to establish themselves in professions. So, like the poor Irish immigrants before the Civil War, they settled overwhelmingly in industrial cities, where most of them took unskilled jobs.


By 1890, the population of some major urban areas consisted of a majority of foreign-born immigrants and their children: 87 percent of the population of Chicago, 80 percent in New York City, 84 percent in Milwaukee and Detroit. (London, the largest industrial city in Europe, by contrast, had a population that was 94 percent native.) New York had more Irish than Dublin and more Germans than Hamburg. Chicago eventually had more Poles than Warsaw. Equally striking was the diversity of the new immigrant populations. In other countries experiencing heavy immigration in this period, most of the new arrivals were coming from one or two sources: Argentina, for example, was experiencing great migrations too, but almost everyone was coming from Italy and Spain. In the United States, however, no single national group dominated. In the last four decades of the nineteenth century, substantial groups arrived from Italy, Germany, Scandinavia, Austria, Hungary, Russia, Great Britain, Ireland, Poland, Greece, Canada, Japan, China, Holland, Mexico, and many other nations. In some towns, a dozen different ethnic groups found themselves living in close proximity.
Most of the new immigrants were rural people, and their adjustment to city life was often painful. To help ease the transition, many national groups formed close-knit ethnic communities within the cities: Italian, Polish, Jewish, Slavic, Chinese, French-Canadian, Mexican, and other neighborhoods (often called “immigrant ghettoes”) that attempted to recreate in the New World many of the features of the Old.
Some ethnic neighborhoods consisted of people who had migrated to America from the same province, town, or village. Even when the population was more diverse, however, the immigrant neighborhoods offered newcomers much that was familiar. They could find newspapers and theaters in their native languages, stores selling their native foods, churches or synagogues, and fraternal organizations that provided links with their national pasts. Many immigrants also maintained close ties with their native countries. They stayed in touch with relatives who had remained behind. Some (perhaps as many as a third in the early years) returned to Europe or Asia or Mexico after a short time; others helped bring the rest of their families to America.
The cultural cohesiveness of the ethnic communities clearly eased the pain of separation from the immigrants’ native lands. What role it played in helping immigrants become absorbed into the economic life of America is a more difficult question to answer. It is clear that some ethnic groups (Jews and Germans in particular) advanced economically more rapidly than others (for example, the Irish). One explanation is that, by huddling together in ethnic neighborhoods, immigrant groups tended to reinforce the cultural values of their previous societies. When those values were particularly well suited to economic advancement in an industrial society—as was, for example, the high value Jews placed on education—ethnic identification may have helped members of a group to improve their lots. When other values predominated—maintaining community solidarity, sustaining family ties, preserving order—progress could be less rapid.


Despite the substantial differences among the various immigrant communities, virtually all groups of the foreign-born had certain things in common. Most immigrants, of course, shared the experience of living in cities (and of adapting from a rural past to an urban present). Most were young; the majority of newcomers were between fifteen and forty-five years old. And in virtually all communities of foreign-born immigrants, the strength of ethnic ties had to compete against another powerful force: the desire for assimilation.
Many of the new arrivals from abroad had come to America with romantic visions of the New World. And however disillusioning they might find their first contact with the United States, they usually retained the dream of becoming true “Americans.” Even some first-generation immigrants worked hard to rid themselves of all vestiges of their old cultures, to become thoroughly Americanized. Second-generation immigrants were even more likely to attempt to break with the old ways, to try to assimilate completely into what they considered the real American culture. Some even looked with contempt on parents and grandparents who continued to preserve traditional ethnic habits and values.
The urge to assimilate put a particular strain on relations between men and women in immigrant communities. Many of the foreign-born came from cultures in which women were more subordinate to men, and more fully lodged within the family, than most women in the United States. In some immigrant cultures, parents expected to arrange their children’s marriages and to control almost every moment of their daughters’ lives until marriage. But out of either choice or economic necessity, many immigrant women (and even more of the American-born daughters of immigrants) began working outside the home and developing friendships, interests, and attachments outside the family. The result was not the collapse of the family-centered cultures of immigrant communities; those cultures proved remarkably durable. But there were adjustments to the new and more fluid life of the American city, and often considerable tension in the process.
Assimilation was not entirely a matter of choice. Native-born Americans encouraged it, both deliberately and inadvertently, in countless ways. Public schools taught children in English, and employers often insisted that workers speak English on the job. Although there were merchants in immigrant communities who sold ethnically distinctive foods and clothing, most stores by necessity sold mainly American products, forcing immigrants to adapt their diets, wardrobes, and lifestyles to American norms. Church leaders were often native-born Americans or assimilated immigrants who encouraged their parishioners to adopt American ways. Some even reformed their theology and liturgy to make it more compatible with the norms of the new country. Reform Judaism, imported from Germany to the United States in the mid-nineteenth century, was an effort by American Jewish leaders (as it had been among German leaders) to make their faith less “foreign” to the dominant culture of a largely Christian nation.


The arrival of so many new immigrants, and the way many of them clung to old ways and created culturally distinctive communities, provoked fear and resentment among some native-born Americans, just as earlier arrivals had done. Some people reacted against the immigrants out of generalized fears and prejudices, seeing in their “foreignness” the source of all the disorder and corruption of the urban world. “These people,” a Chicago newspaper wrote shortly after the Haymarket bombing, referring to striking immigrant workers, “are not American, but the very scum and offal of Europe . . . Europe’s human and inhuman rubbish.” Native-born Americans on the West Coast had a similar cultural aversion to Mexican, Chinese, and Japanese immigrants. Other native laborers were often incensed by the willingness of the immigrants to accept lower wages and to take over the jobs of strikers.
The rising nativism provoked political responses. In 1887, Henry Bowers, a self-educated lawyer obsessed with a hatred of Catholics and foreigners, founded the American Protective Association, a group committed to stopping the immigrant tide. By 1894, membership in the organization had reportedly reached 500,000, with chapters throughout the Northeast and Midwest. That same year a more genteel organization, the Immigration Restriction League, was founded in Boston by five Harvard alumni. It was dedicated to the belief that immigrants should be screened, through literacy tests and other standards designed to separate the desirable from the undesirable. The league avoided the crude conspiracy theories and the rabid xenophobia of the American Protective Association. Its sophisticated nativism made it possible for many educated, middle-class people to support the restrictionist cause. Even before the rise of these new organizations, politicians were struggling to find answers to the “immigration question.” In 1882 Congress had responded to strong anti-Asian sentiment in California and elsewhere and restricted Chinese immigration, even though the Chinese made up only 1.2 percent of the population of the West Coast (see pp. 448–449). In the same year, Congress denied entry to “undesirables”—convicts, paupers, the mentally incompetent—and placed a tax of 50 cents on each person admitted. Later legislation of the 1890s enlarged the list of those barred from immigrating and increased the tax.
These laws kept out only a small number of aliens, however, and more ambitious restriction proposals made little progress. Congress passed a literacy requirement for immigrants in 1897, but President Grover Cleveland vetoed it. The restrictions had limited success because many native-born Americans, far from fearing immigration, welcomed it and exerted strong political pressure against the restrictionists. Immigration was providing a rapidly growing economy with a cheap and plentiful labor supply; many employers argued that America’s industrial (and indeed agricultural) development would be impossible without it.


In the eighteenth and early nineteenth centuries, cities had generally grown up haphazardly, with little central planning. By the mid-nineteenth century, however, reformers, planners, architects, and others began to call for a more ordered vision of the city. The result was the self-conscious creation of public spaces and public services.
Among the most important innovations of the mid-nineteenth century were great urban parks, which reflected the desire of a growing number of urban leaders to provide an antidote to the congestion of the city landscape. The most successful American promoters of this notion of the park as refuge were the landscape designers Frederick Law Olmsted and Calvert Vaux, who teamed up in the late 1850s to design New York’s Central Park. They deliberately created a public space that would look as little like the city as possible. Instead of the ordered, formal spaces common in some European cities, they created a space that seemed to be entirely natural—even though almost all of Central Park was carefully designed and constructed. Central Park was from the start one of the most popular and admired public spaces in the world, and as a result Olmsted and Vaux were recruited to design other great parks and public spaces in other cities: Brooklyn, Boston, Philadelphia, Chicago, and Washington, D.C.
At the same time that cities were creating great parks, they were also creating great public buildings: libraries, art galleries, natural history museums, theaters, concert halls, and opera houses. New York’s Metropolitan Museum of Art was only the largest and best known of many great museums taking shape in the late nineteenth century; others were created in such cities as Boston, Chicago, Philadelphia, and Washington, D.C. In one city after another, new and lavish public libraries appeared as if to confirm the city’s role as a center of learning and knowledge.
Wealthy residents of cities were the principal force behind the creation of the great public buildings and at times even parks. As their own material and social aspirations grew, they wanted the public life of the city to provide them with amenities to match their expectations. Becoming an important patron of a major cultural institution was an especially effective route to social distinction. But this philanthropy, whatever the motives behind it, also produced valuable assets for the city as a whole.
As both the size and the aspirations of the great cities increased, urban leaders launched monumental projects to remake the way their cities looked. Inspired by massive city rebuilding projects in Paris, London, Berlin, and other European cities, some American cities began to clear away older neighborhoods and streets and create grand, monumental avenues lined with new and more impressive buildings. A particularly important event in inspiring this effort to remake the city was the 1893 Columbian Exposition in Chicago, a world’s fair constructed to honor the 400th anniversary of Columbus’s first voyage to America. At the center of the wildly popular exposition was a cluster of neoclassical buildings—the “Great White City”—constructed in the fashionable “beaux-arts” style of the time, arranged symmetrically around a formal lagoon. It inspired what became known as the “city beautiful” movement, led by the architect of the Great White City, Daniel Burnham. The movement aimed to impose a similar order and symmetry on the disordered life of cities around the country. “Make no little plans,” Burnham liked to tell city planners. His influence led to the remaking of cities all across the country—from Washington, D.C., to Chicago and San Francisco. Only rarely, however, were planners able to overcome the obstacles of private landowners and complicated urban politics. They rarely achieved more than a small portion of their dreams. There were no reconstructions of American cities to match the elaborate nineteenth-century reshaping of Paris and London.
The effort to remake the city did not just focus on redesigning the existing landscape. It occasionally led to the creation of entirely new ones. In Boston in the late 1850s, a large area of marshy tidal land was gradually filled in to create the neighborhood known as “Back Bay.” The landfill project took more than forty years to complete and was one of the largest public works projects ever undertaken in America to that point. But Boston was not alone. Chicago reclaimed large areas from Lake Michigan as it expanded and at one point raised the street level for the entire city to help avoid the problems the marshy land created. In Washington, D.C., another marshy site, large areas were filled in and slated for development. In New York and other cities, the response to limited space was not so much creating new land as annexing adjacent territory. A great wave of annexations expanded the boundaries of many American cities in the 1890s and beyond.


One of the greatest problems of this precipitous growth was finding housing for the thousands of new residents who were pouring into the cities every day. For the prosperous, however, housing was seldom a worry. The availability of cheap labor and the reduced cost of building let anyone with even a moderate income afford a house.
Many of the richest urban residents lived in palatial mansions in the heart of the city and created lavish “fashionable districts”—Fifth Avenue in New York City, Back Bay and Beacon Hill in Boston, Society Hill in Philadelphia, Lake Shore Drive in Chicago, Nob Hill in San Francisco, and many others.
The moderately well-to-do (and as time went on, increasing numbers of wealthy people as well) took advantage of the less expensive land on the edges of the city and settled in new suburbs, linked to the downtowns by trains or streetcars or improved roads. Chicago in the 1870s, for example, boasted nearly 100 residential suburbs connected with the city by railroad and offering the joys of “pure air, peacefulness, quietude, and natural scenery.” Boston, too, saw the development of some of the earliest “streetcar suburbs”—Dorchester, Brookline, and others—which catered to both the wealthy and the middle class. New Yorkers of moderate means settled in new suburbs on the northern fringes of Manhattan and commuted downtown by trolley or riverboat. Real estate developers worked to create and promote suburban communities that would appeal to the nostalgia for the countryside that many city dwellers felt. Affluent suburbs, in particular, were notable for lawns, trees, and houses designed to look manorial. Even modest communities strove to emphasize the opportunities suburbs provided for owning land.


Most urban residents, however, could not afford either to own a house in the city or to move to the suburbs. Instead, they stayed in the city centers and rented. Because demand was so high and space so scarce, they had little bargaining power in the process. Landlords tried to squeeze as many rent-paying residents as possible into the smallest available space. In Manhattan, for example, the average population density in 1894 was 143 people per acre—a higher rate than that of the most crowded cities of Europe (Paris had 127 per acre, Berlin 101) and far higher than in any other American city then or since. In some neighborhoods—the Lower East Side of New York City, for example—density was more than 700 people per acre, among the highest levels in the world.
Landlords were reluctant to invest much in immigrant housing, confident they could rent dwellings for a profit regardless of their conditions. In the cities of the South—Charleston, New Orleans, Richmond—poor African Americans lived in crumbling former slave quarters. In Boston, they moved into cheap three-story wooden houses (“triple deckers”), many of them decaying fire hazards. In Baltimore and Philadelphia, they crowded into narrow brick row houses. And in New York, as in many other cities, more than a million people lived in tenements.
The word “tenement” had originally referred simply to a multiple-family rental building, but by the late nineteenth century it was being used to describe slum dwellings only. The first tenements, built in New York City in 1850, had been hailed as a great improvement in housing for the poor. “It is built with the design of supplying the laboring people with cheap lodgings,” a local newspaper commented, “and will have many advantages over the cellars and other miserable abodes which too many are forced to inhabit.” But tenements themselves soon became “miserable abodes,” with many windowless rooms, little or no plumbing or central heating, and often a row of privies in the basement. A New York state law of 1870 required a window in every bedroom of tenements built after that date; developers complied by adding small, sunless air shafts to their buildings. Most of all, tenements were incredibly crowded, with three, four, and sometimes many more people crammed into each small room.
Jacob Riis, a Danish immigrant and New York newspaper reporter and photographer, shocked many middle-class Americans with his sensational (and some claimed sensationalized) descriptions and pictures of tenement life in his 1890 book, How the Other Half Lives. Slum dwellings, he said, were almost universally sunless, practically airless, and “poisoned” by “summer stenches.” “The hall is dark and you might stumble over the children pitching pennies back there.” But the solution many reformers (including Riis) favored, and that governments sometimes adopted, was to raze slum dwellings without building any new or better housing to replace them.


Urban growth posed monumental transportation challenges. Old downtown streets were often too narrow for the heavy traffic that was beginning to move over them. Most lacked a hard, paved surface, producing either a sea of mud or a cloud of dust. In the last decades of the century, more and more streets were paved, usually with wooden blocks, bricks, or asphalt; but paving could not keep up with the number of new thoroughfares the expanding cities were creating. By 1890, Chicago had paved only about 600 of its more than 2,000 miles of streets.
But it was not simply the conditions of the streets that impeded urban transportation. It was the numbers of people who needed to move every day from one part of the city to another, numbers that mandated the development of mass transportation. Streetcars drawn on tracks by horses had been introduced into some cities even before the Civil War. But the horsecars were not fast enough, so many communities developed new forms of mass transit.
In 1870, New York opened its first elevated railway, whose noisy, filthy steam-powered trains moved rapidly above the city streets on massive iron structures. New York, Chicago, San Francisco, and other cities also experimented with cable cars, towed by continuously moving underground cables. Richmond, Virginia, introduced the first electric trolley line in 1888, and by 1895 such systems were operating in 850 towns and cities. In 1897, Boston opened the first American subway when it put some of its trolley lines underground. At the same time, cities were developing new techniques of road and bridge building. One of the great technological marvels of the 1880s was the completion of the Brooklyn Bridge in New York, a dramatic steel-cable suspension span designed by John A. Roebling.


Cities were growing upward as well as outward. Until the mid-nineteenth century, almost no buildings more than four or five stories high could be constructed. Construction techniques were such that it was difficult and expensive to build adequate structural supports for tall buildings. There was also a limit to the number of flights of stairs the users of buildings could be expected to climb. But by the 1850s, there had been successful experiments with machine-powered passenger elevators; and by the 1870s, new methods of construction using cast iron and steel beams made it easier to build tall buildings.
Not long after the Civil War, therefore, tall buildings began to appear in the major cities. The Equitable Building in New York, completed in 1870 and rising seven and a half floors above the street, was one of the first in the nation to be built with an elevator. A few years later, even taller buildings of ten and twelve stories were appearing elsewhere in New York, in Chicago, and in other growing cities around the country. With each passing decade, both the size and the number of tall buildings increased until, by the 1890s, the term “skyscraper” became a popular description of them.


One serious problem was fires. In one major city after another, fires destroyed large downtown areas, where many buildings were still constructed of wood. Chicago and Boston suffered “great fires” in 1871. Other cities—among them Baltimore and San Francisco, where a tremendous earthquake produced a catastrophic fire in 1906—experienced similar disasters. The great fires were terrible and deadly experiences, but they also encouraged the construction of fireproof buildings and the development of professional fire departments. They also forced cities to rebuild at a time when new technological and architectural innovations were available. Some of the modern, high-rise downtowns of American cities arose out of the rubble of great fires.


Modern notions of environmentalism were unknown to most Americans in the late nineteenth and early twentieth centuries. But the environmental degradation of many American cities was a visible and disturbing fact of life in those years. The frequency of great fires, the dangers of disease and plague, the extraordinary crowding of working-class neighborhoods were all examples of the environmental costs of industrialization and rapid urbanization.
Improper disposal of human and industrial waste was a common feature of almost all large cities in these years. Such practices contributed to the pollution of rivers and lakes and also, in many cases, to the compromising of the city’s drinking water. This was particularly true in poor neighborhoods with primitive plumbing (and sometimes no indoor plumbing at all), outdoor privies that leaked into the groundwater, and overcrowded tenements. The presence of domestic animals—horses, which were the principal means of transportation until the late nineteenth century, but in poor neighborhoods also cows, pigs, and other animals—contributed as well to the environmental problems.
Air quality in many cities was poor as well. Few American cities had problems as severe as those London experienced in these years, with its perpetual “fogs” created by the debris from the burning of soft coal. But air pollution from factories and from stoves and furnaces in offices, homes, and other buildings was constant and at times severe. The incidence of respiratory infection and related diseases was much higher in cities than it was in non-urban areas, and it accelerated rapidly in the late nineteenth century.
By the early twentieth century, reformers were actively crusading to improve the environmental conditions of cities and were beginning to achieve some notable successes. By 1910, most large American cities had constructed sewage disposal systems, often at great cost, to protect the drinking water of their inhabitants and to prevent the great bacterial plagues that impure water had helped create in the past—such as the 1873 yellow fever epidemic in Memphis that killed more than 5,000 people.
Alice Hamilton, a physician who became an investigator for the United States Bureau of Labor, was a pioneer in the identification of pollution in the workplace. She documented ways in which improper disposal of such potentially dangerous substances as lead (she was one of the first physicians to identify lead poisoning), chemical waste, and ceramic dust was creating widespread sickness. And despite considerable resistance from many factory owners, she did bring such problems to public attention and, in some states at least, inspired legislation to require manufacturers to solve them. In 1912, the federal government created the Public Health Service, which was charged with preventing such occupational diseases as tuberculosis, anemia, and carbon dioxide poisoning, which were common in the garment industry and other trades. It attempted to create common health standards for all factories; but since the agency had few powers of enforcement, it had limited impact. It did, however, establish the protection of public health as a responsibility of the federal government and also helped bring to public attention the environmental forces that endangered health. The creation of the Occupational Safety and Health Administration in 1970, which gave government the authority to require employers to create safe and healthy workplaces, was a legacy of the Public Health Service’s early work.


Above all, perhaps, the expansion of the cities created widespread and often desperate poverty. Despite the rapid growth of urban economies, the sheer number of new residents ensured that many people would be unable to earn enough for a decent subsistence.
Public agencies and private philanthropic organizations offered very limited relief. They were generally dominated by middle-class people, who tended to believe that too much assistance would breed dependency and that poverty was the fault of the poor themselves—a result of laziness or alcoholism or other kinds of irresponsibility. Most tried to restrict aid to the “deserving poor”—those who truly could not help themselves (at least according to the standards of the organizations themselves, which conducted elaborate “investigations” to separate the “deserving” from the “undeserving”).
Other charitable societies—for example, the Salvation Army, which began operating in America in 1879, one year after it was founded in London—concentrated more on religious revivalism than on the relief of the homeless and hungry. Tensions often arose between native Protestant philanthropists and Catholic immigrants over religious doctrine and standards of morality.
Middle-class people grew particularly alarmed over the rising number of poor children in the cities, some of them orphans or runaways, living alone or in small groups scrounging for food. These “street arabs,” as they were often called, attracted more attention from reformers than any other group—although that attention produced no serious solutions to their problems.


Poverty and crowding naturally bred crime and violence. Much of it was relatively minor, the work of pickpockets, con artists, swindlers, and petty thieves. But some was more dangerous. The American murder rate rose rapidly in the late nineteenth century (even as such rates were declining in Europe), from 25 murders for every million people in 1880 to over 100 by the end of the century—a rate slightly higher than the relatively high rates of the 1980s and 1990s. That reflected in part a very high level of violence in some nonurban areas: the American South, where rates of lynching and homicide were particularly high; and the West, where the rootlessness and instability of new communities (cow towns, mining camps, and the like) created much violence. But the cities contributed their share to the increase in crime as well. Native-born Americans liked to believe that crime was a result of the violent proclivities of immigrant groups, and they cited the rise of gangs and criminal organizations in various ethnic communities. But native-born Americans were as likely to commit crimes as immigrants.
The rising crime rates encouraged many cities to develop larger and more professional police forces. In the early nineteenth century, police forces had often been private and informal organizations; urban governments had resisted professionalized law enforcement. By the end of the century, however, professionalized public police departments were a part of the life of virtually every city and town. They worked closely with district attorneys and other public prosecutors, who were also becoming more numerous and more important in city life. Police forces themselves could also spawn corruption and brutality, particularly since jobs on them were often filled through political patronage. And complaints about police dealing differently with white and black suspects, or with rich and poor communities, were common in the late nineteenth century.
Some members of the middle class, fearful of urban insurrections, felt the need for even more substantial forms of protection. Urban national guard groups (many of them created and manned by middle-class elites) built imposing armories on the outskirts of affluent neighborhoods and stored large supplies of weapons and ammunition in preparation for uprisings that, in fact, never occurred.


Newly arrived immigrants, many of whom could not speak English, needed help in adjusting to American urban life: its laws, its customs, usually its language. Some ethnic communities created their own self-help organizations. But for many residents of the inner cities, the principal source of assistance was the political machine.
The urban machine was one of America’s most distinctive political institutions. It owed its existence to the power vacuum that the chaotic growth of cities (and the very limited growth of city governments) had created. It was also a product of the potential voting power of large immigrant communities. Any politician who could mobilize that power stood to gain enormous influence, if not public office. And so there emerged a group of urban “bosses,” themselves often of foreign birth or parentage. Many were Irish, because they spoke English and because some had acquired previous political experience from the long Irish struggle against the English at home. Almost all were men (in most states women could not yet vote). The principal function of the political boss was simple: to win votes for his organization. That meant winning the loyalty of his constituents. To do so, a boss might provide potential voters with occasional relief—baskets of groceries, bags of coal. He might step in to save those arrested for petty crimes from jail. He rewarded many of his followers with patronage: with jobs in city government or in such city agencies as the police (which the machine’s elected officials often controlled); with jobs building or operating the new transit systems; and with opportunities to rise in the political organization itself.
Machines were also vehicles for making money. Politicians enriched themselves and their allies through various forms of graft and corruption. Some of it might be fairly open—what George Washington Plunkitt of New York’s Tammany Hall called “honest graft.” For example, a politician might discover in advance where a new road or streetcar line was to be built, buy an interest in the land near it, and profit when the city had to buy the land from him or when property values rose as a result of the construction. But there was also covert graft: kickbacks from contractors in exchange for contracts to build streets, sewers, public buildings, and other projects; the sale of franchises for the operation of such public utilities as street railways, waterworks, and electric light and power systems. The most famously corrupt city boss was William M. Tweed, boss of New York City’s Tammany Hall in the 1860s and 1870s, whose excesses finally landed him in jail in 1872.
Middle-class critics saw the corrupt machines as blights on the cities and obstacles to progress. In fact, political organizations were often responsible not just for corruption, but also for modernizing city infrastructures, for expanding the role of government, and for creating stability in a political and social climate that otherwise would have lacked a center. The motives of the bosses may have been largely venal, but their achievements were sometimes greater than those of the more scrupulous reformers who challenged them.
Several factors made boss rule possible. One was the power of immigrant voters, who were less concerned with middle-class ideas of political morality than with obtaining the services that machines provided and reformers did not. Another was the link between the political organizations and wealthy, prominent citizens who profited from their dealings with bosses. Still another was the structural weakness of city governments. The boss, by virtue of his control over his machine, formed an “invisible government” that provided an alternative to what was often the inadequacy of the regular government. The urban machine was not without competition. Reform groups frequently mobilized public outrage at the corruption of the bosses and often succeeded in driving machine politicians from office. Tammany, for example, saw its candidates for mayor and other high city offices lose almost as often as they won in the last decades of the nineteenth century. But the reform organizations typically lacked the permanence of the machine. Thus, many critics of machines began to argue for more basic reforms: for structural changes in the nature of city government.


American industry could not have grown as it did without the expansion of markets. The growth of demand occurred at almost all levels of society, a result not just of the new techniques of production and mass distribution that were making consumer goods less expensive, but also of rising incomes. Incomes in the industrial era were rising for almost everyone, although at highly uneven rates. The most conspicuous result of the new economy was the creation of vast fortunes. But more important for society as a whole were the growth and increasing prosperity of the middle class. The salaries of clerks, accountants, middle managers, and other “white-collar” workers rose on average by a third between 1890 and 1910—and in some parts of the middle class, salaries rose by much more. Doctors, lawyers, and other professionals, for example, experienced a particularly dramatic increase in both the prestige and the profitability of their professions.
Working-class incomes rose too in those years, although from a much lower base and considerably more slowly. Iron- and steelworkers, despite the setbacks their unions suffered, saw their hourly wages increase by a third between 1890 and 1910; but industries with large female, African American, or Mexican workforces—shoes, textiles, paper, laundries, many areas of commercial agriculture—saw very small increases, as did almost all industries in the South.
Also important to the new mass market were the development of affordable products and the creation of new merchandising techniques, which made many consumer goods available to a broad market for the first time. A good example of such changes was the emergence of ready-made clothing. In the early nineteenth century, most Americans had made their own clothing—usually from cloth they bought from merchants, at times from fabrics they spun and wove themselves. Affluent people contracted with private tailors to make their clothes. But the invention of the sewing machine and the spur that the Civil War (and its demand for uniforms) gave to the manufacture of clothing created an enormous industry devoted to producing ready-made garments. By the end of the century, almost all Americans bought their clothing from stores.
Partly as a result, much larger numbers of people became concerned with personal style. Interest in women’s fashion, for example, had once been a luxury reserved for the affluent. Now middle-class and even working-class women could strive to develop a distinctive style of dress. New homes, even modest ones, now included clothes closets. Even people in remote rural areas could develop more stylish wardrobes by ordering from the new mail-order houses.
Another example of the rise of the mass market was the way Americans bought and prepared food. The development and mass production of tin cans in the 1880s created a large new industry devoted to packaging and selling canned food and (as a result of the techniques Gail Borden, an inventor and politician, developed in the 1850s) condensed milk. Refrigerated railroad cars made it possible for perishables—meats, vegetables, dairy products, and other foodstuffs—to travel long distances without spoiling. The development of artificially frozen ice made it possible for many more households to afford iceboxes. Among other things, the changes meant improved diets and better health; life expectancy rose six years in the first two decades of the twentieth century.


Changes in marketing also altered the way Americans bought goods. Small local stores faced competition from new “chain stores.” The Great Atlantic & Pacific Tea Company (A & P) began creating a national network of grocery stores as early as the 1850s and expanded it rapidly after the Civil War.
F. W. Woolworth opened his first “Five and Ten Cent Store” in Utica, New York, in 1879 and went on to build a national chain of dry goods stores. Chain stores were able to sell manufactured goods at lower prices than the local, independent stores because the chains had so much more volume. From the beginning, the chains faced opposition from the established merchants they threatened to displace, and from others who feared that they would jeopardize the character of their communities. (Similar controversies have continued into the twenty-first century over the spread of large chains such as Wal-Mart and Barnes & Noble.) But most customers, however loyal they might feel to a local merchant, found it difficult to resist the greater variety and lower prices the chains provided them.
Chain stores were slow to reach remote, rural areas, which remained dependent on poorly stocked and often very expensive country stores. But rural people gradually gained access to the new consumer world through the great mail-order houses. In 1872, Montgomery Ward—a Chicago-based traveling salesman—distributed a catalog of consumer goods in association with the farmers’ organization, the Grange (see p. 535). By the 1880s, he was offering thousands of items at low prices to farmers throughout the Midwest and beyond. He soon faced stiff competition from Sears Roebuck, first established by Richard Sears in Chicago in 1887. Together, the bulky catalogs from Ward and Sears changed the lives of many isolated people—introducing them to (and explaining for them) new trends of fashion and home decor as well as making available new tools, machinery, and technologies for the home.


In larger cities, the emergence of great department stores (which had appeared earlier in Europe) helped transform buying habits and turn shopping into an alluring and glamorous activity. Marshall Field in Chicago created one of the first American department stores, and others soon followed: Macy’s in New York, Abraham and Straus in Brooklyn, Jordan Marsh and Filene’s in Boston, Wanamaker’s in Philadelphia.
The department stores transformed the concept of shopping in several ways. First, they brought together under one roof an enormous array of products that had previously been sold in separate shops. Second, they sought to create an atmosphere of wonder and excitement, to make shopping a glamorous activity. Department stores were elaborately decorated to suggest great luxury and elegance. They included restaurants and tea rooms and comfortable lounges, to suggest that shopping could be a social event as well as a practical necessity. They hired well-dressed salesclerks, mostly women, to provide attentive service to their mostly female customers. Third, department stores—like mail-order houses—took advantage of economies of scale to sell merchandise at lower prices than many of the individual shops with which they competed.


The rise of mass consumption had particularly dramatic effects on American women. Women’s clothing styles changed much more rapidly and dramatically than men’s, which encouraged more frequent purchases. Women generally bought and prepared food for their families, so the availability of new food products changed not only the way everyone ate, but also the way women shopped and cooked.
The consumer economy produced new employment opportunities for women as salesclerks in department stores and as waitresses in the rapidly proliferating restaurants. And it spawned the creation of a new movement in which women were to play a vital role: the consumer protection movement. The National Consumers League, formed in the 1890s under the leadership of Florence Kelley, a prominent social reformer, attempted to mobilize the power of women as consumers to force retailers and manufacturers to improve wages and working conditions for women workers. By defining themselves as consumers, many middle-class women were able to find a stance from which they could become active participants in public life. Indeed, the mobilization of women behind consumer causes—and eventually many other causes—was one of the most important political developments of the late nineteenth century.


The growth of free time produced a redefinition of the idea of “leisure.” In earlier eras, relatively few Americans had considered leisure a valuable thing. On the contrary, many equated it with laziness or sloth. “Rest,” as in the relative inactivity many Americans considered appropriate for the Sabbath, was valued because it offered time for spiritual reflection and prepared people for work. But leisure—time spent amusing oneself in nonproductive pursuits—was not only unavailable to most Americans, but faintly scorned as well.
But with the rapid expansion of the economy and the increasing number of hours workers had away from work, it became possible to imagine leisure time as a normal part of the lives of many people. Industrial workers, in pursuit of shorter hours, adopted the slogan “Eight hours for work, eight hours for rest, and eight hours for what we will.” Others were equally adamant in claiming that leisure time was both a right and an important contribution to an individual’s emotional and even spiritual health.
The economist Simon Patten was one of the first intellectuals to articulate this new view of leisure, which he tied closely to the rising interest in consumption. Patten, in The Theory of Prosperity (1902), The New Basis of Civilization (1910), and other works, challenged the centuries-old assumption that the normal condition of civilization was a scarcity of goods. “We are now in the transition stage,” he wrote, “from this pain economy [the economy of scarcity] to a pleasure economy.” The principal goal of such an economy, he claimed, “should be an abundance of goods and the pursuit of pleasure.”
As Americans became more accustomed to leisure as a normal part of their lives, they began to look for new experiences with which to entertain themselves. Entertainment usually meant “going out,” spending leisure time in public places where there would be not only entertainment, but also other people. Thousands of working-class New Yorkers flocked to the amusement park at Coney Island, for example, not just for the rides and shows, but for the excitement of the crowds as well. So did the thousands who spent evenings in dance halls, vaudeville houses, and concert halls. Affluent New Yorkers enjoyed afternoons in Central Park, where a principal attraction was seeing other people (and being seen by them). Moviegoers were attracted not just by the movies themselves, but also by the energy of the audiences at the lavish “movie palaces” that began to appear in cities in the early twentieth century, just as sports fans were drawn by the crowds as well as by the games.
Mass entertainment did not always bridge differences of class, race, or gender. Saloons and most sporting events tended to be male preserves. Shopping (itself becoming a valued leisure-time activity) and going to tea rooms and luncheonettes were more characteristic of female leisure. Theaters, pubs, and clubs were often specific to particular ethnic communities or particular work groups. There were, in fact, relatively few places where people of widely diverse backgrounds gathered together.
When the classes did meet in public spaces—as they did, for example, in city parks—there was often conflict over what constituted appropriate public behavior. Elites in New York City, for example, tried to prohibit anything but quiet, “genteel” activities in Central Park, while working-class people wanted to use the public spaces for sports and entertainments. But even divided by class, ethnicity, and gender, leisure and popular entertainment did help sustain a vigorous public culture.


The search for forms of public leisure hastened the rise of organized spectator sports, especially baseball, which by the end of the century was well on its way to becoming the national pastime (see “Patterns of Popular Culture,” p. 392). A game much like baseball, known as “rounders” and derived from cricket, had enjoyed limited popularity in Great Britain in the early nineteenth century. Versions of the game began to appear in America in the early 1830s, well before Abner Doubleday supposedly “invented” baseball. (Doubleday, in fact, had little to do with the creation of baseball and actually cared little for sports. Alexander Cartwright, a member of a New York City baseball club in the 1840s, defined many of the rules and features of the game as we know it today.)
By the end of the Civil War, interest in baseball had grown rapidly. More than 200 amateur or semiprofessional teams or clubs existed, many of which joined a national association and agreed on standard rules. The first salaried team, the Cincinnati Red Stockings, was formed in 1869. Other cities soon fielded professional teams, and in 1876, at the urging of Albert Spalding, they banded together in the National League. A rival league, the American Association, soon appeared. It eventually collapsed, but in 1901 the American League emerged to replace it. In 1903, the first modern World Series was played, in which the American League Boston Red Sox beat the National League Pittsburgh Pirates. By then, baseball had become an important business and a great national preoccupation (at least among men), attracting paying crowds in the thousands.
The second most popular game, football, appealed at first to an elite segment of the male population, in part because it originated in colleges and universities. The first intercollegiate football game in America occurred between Princeton and Rutgers in 1869, and soon the game became entrenched as part of collegiate life. Early intercollegiate football bore only an indirect relation to the modern game; it was more similar to what is now known as rugby. By the late 1870s, however, the game was becoming standardized and was taking on the outlines of its modern form.
As college football grew in popularity, it spread to other sections of the country, notably to the midwestern state universities, which were destined soon to replace the eastern schools as the great powers of the game. It also began to exhibit the taints of professionalism that have marked it ever since. Some schools used “ringers,” tramp athletes who were not even registered as students. In an effort to eliminate such abuses, Amos Alonzo Stagg, athletic director and coach at the University of Chicago, led in forming the Western Conference, or Big Ten, in 1896, which established rules governing eligibility. Football also became known for a high level of violence on the field; eighteen college students died of football-related injuries and over a hundred were seriously hurt in 1905. The carnage prompted a White House conference on organized sports convened by President Theodore Roosevelt. As a result of its deliberations, a new intercollegiate association (which in 1910 became known as the National Collegiate Athletic Association, the NCAA) revised the rules and the required equipment of the game in an effort to make it more honest and safer.
Other popular spectator sports were emerging at about the same time. Basketball was invented in 1891 at Springfield, Massachusetts, by Dr. James A. Naismith, a Canadian working as athletic director for a local college. Boxing, which had long been a disreputable activity concentrated primarily among the urban working classes, had become by the 1880s a more popular and in some places more reputable sport, particularly after the adoption of the Marquis of Queensberry rules (by which fighters wore padded gloves and fought in three-minute rounds). The first modern boxing hero, John L. Sullivan, became heavyweight champion of the world in 1882. Even so, boxing remained illegal in some states until after World War I. Horse racing, popular since colonial times, became increasingly commercialized with the construction of large tracks and the establishment of large-purse races such as the Kentucky Derby.
Even in their infancy, spectator sports were closely associated with gambling. There was elaborate betting—some of it organized by underground gambling syndicates—on baseball and football almost from the start. One of the most famous incidents in the history of baseball was the alleged “throwing” of the 1919 World Series by the Chicago White Sox because of gambling (an incident that became known as the “Black Sox Scandal”). That event resulted in the banning of some of the game’s most notable figures from the sport for life and the establishment of the office of commissioner of baseball to “clean up” the game. Boxing was troubled throughout its history by the influence of gambling and the frequent efforts of managers to “fix” fights in the interests of bettors. Horse racing as it became commercialized was openly organized around betting, with the race tracks themselves establishing odds and taking bets.
The rise of spectator sports and gambling was largely a response to the desire of men to create a distinctively male culture in cities. But not all sports were the province of men. A number of sports were emerging in which women became important participants. Golf and tennis seldom attracted crowds in the late nineteenth century, but both experienced a rapid increase in participation among relatively wealthy men and women. Bicycling and croquet also enjoyed widespread popularity in the 1890s among women as well as men. Women’s colleges were beginning to introduce their students to more strenuous sports as well—track, crew, swimming, and (beginning in the late 1890s) basketball—challenging the once prevalent notion that vigorous exercise was dangerous to women.


Many ethnic communities maintained their own theaters, in which immigrants listened to the music of their homelands and heard comedians making light of their experiences in the New World. Italian theaters often drew on the traditions of Italian opera to create sentimental musical events. The Yiddish theater built on the experiences of American Jews—and was the training ground for a remarkable group of musicians and playwrights who later went on to play a major role in mainstream, English-speaking theater.
Urban theaters also introduced one of the most distinctively American entertainment forms: the musical comedy, which evolved gradually from the comic operettas of European theater. George M. Cohan, an Irish vaudeville entertainer, became the first great creator of musical comedies in the early twentieth century; in the process of creating his many shows, he wrote a series of patriotic songs—“Yankee Doodle Dandy,” “Over There,” and “You’re a Grand Old Flag”—that remained popular many decades later. Irving Berlin, a veteran of the Yiddish theater, wrote more than 1,000 songs for the musical theater during his long career, including such popular favorites as “Alexander’s Ragtime Band” and “God Bless America.”
Vaudeville, a form of theater adapted from French models, was the most popular urban entertainment in the first decades of the twentieth century. Even saloons and small community theaters could afford to offer their customers vaudeville, which consisted of a variety of acts (musicians, comedians, magicians, jugglers, and others) and was, at least in the beginning, inexpensive to produce. As the economic potential of vaudeville grew, some promoters—most prominently Florenz Ziegfeld of New York—staged much more elaborate spectacles. Vaudeville was also one of the few entertainment media open to black performers. They brought to it elements of the minstrel shows they had earlier developed for black audiences in the late nineteenth century. (See “Patterns of Popular Culture,” pp. 432–433.)


The most important form of mass entertainment (until the invention of radio and television) was the movies. Thomas Edison and others had created the technology of the motion picture in the 1880s. Not long after, short films became available to individual viewers through “peep shows” in pool halls, penny arcades, and amusement parks. Soon larger projectors made it possible to project the images onto big screens, which permitted substantial audiences to see films in theaters.
By 1900, Americans were becoming attracted in large numbers to these early movies—usually plotless films of trains or waterfalls or other spectacles designed mainly to show off the technology. D. W. Griffith carried the motion picture into a new era with his silent epics—The Birth of a Nation (1915), Intolerance (1916), and others—which introduced serious plots and elaborate productions to filmmaking. Some of these films—most notably The Birth of a Nation, with its celebration of the Ku Klux Klan and its demeaning portraits of African Americans—also contained notoriously racist messages, an indication, among other things, that the audiences for these early films were overwhelmingly white. Nevertheless, motion pictures were the first truly mass entertainment medium, reaching all areas of the country and almost all groups in the population.


Leisure had a particular importance to working-class men and women—in part because it was a relatively new part of their lives and in part because it stood in such sharp contrast to the grueling environments in which many industrial workers labored. More than most other groups in society, workers spent their leisure time on the streets—walking alone or in groups, watching street entertainers, meeting friends, talking and joking. For people with time but little money, the life of the street was an appealing source of camaraderie and energy.
Another important setting for the leisure time of working-class men was the neighborhood saloon, which became a place where a worker could be sure of encountering a regular circle of friends. Saloons were often ethnically specific, in part because they served particular neighborhoods dominated by particular national groups. They also became political centers. Saloonkeepers were especially important figures in urban political machines, largely because they had regular contact with so many men in a neighborhood. When the Anti-Saloon League and other temperance organizations attacked the saloon, one of the reasons they cited was that eliminating saloons would weaken political machines. Opponents also noted correctly that saloons were sometimes places of crime, violence, and prostitution—an entryway into the dark underworld of urban life.
Boxing was a particularly popular sport among working-class men. Many workers could not afford to attend the great public boxing matches pairing such popular heroes as John L. Sullivan and “Gentleman Jim” Corbett. But there were less glittering boxing matches in small rings and even in saloons—bare-knuckled fights organized by ethnic clubs and other groups that gave men an opportunity to demonstrate their strength and courage, something that the working world did not always provide them.


The Fourth of July played a large role in the lives of many working-class Americans. That was in part because in an age of six-day (and sometimes seven-day) workweeks and before regular vacations, it was for many decades one of the few full days of leisure—other than the Sabbath, during which activities were often restricted by law—that many workers had. Fourth of July celebrations were one of the highlights of the year in many ethnic, working-class communities. In Worcester, Massachusetts, for example, the Ancient Order of Hibernians (an Irish organization) sponsored boisterous picnics for the Irish working class of the city. Competing with them were Irish temperance organizations, which offered more sober and “respectable” entertainments to those relatively few workers who wished to avoid the heavy drinking at the Hibernian affairs. Other ethnic groups organized their own Fourth of July events—picnics, games, parades—making the day a celebration not just of the nation’s independence, but of the cultures of immigrant communities. The city’s affluent middle class, in the meantime, tended to stay away, remaining indoors or organizing family picnics at resort areas outside the city.


Urban industrial society created a vast market for new methods of transmitting news and information. Between 1870 and 1910, the circulation of daily newspapers increased nearly ninefold (from under 3 million to more than 24 million), a rate three times as great as the rate of population increase. And while standards varied widely from one paper to another, American journalism began to develop the beginnings of a professional identity. Salaries of reporters increased; many newspapers began separating the reporting of news from the expression of opinion; and newspapers themselves became important businesses.
One striking change was the emergence of national press services, which made use of the telegraph to supply news and features to papers throughout the country and which contributed as a result to the standardization of the product. By the turn of the century, important newspaper chains had emerged as well. The most powerful was William Randolph Hearst’s, which by 1914 controlled nine newspapers and two magazines. Hearst and rival publisher Joseph Pulitzer helped popularize what became known as “yellow journalism”—a deliberately sensational, often lurid style of reporting presented in bold graphics, designed to reach a mass audience. (See “Patterns of Popular Culture,” pp. 552–553.) Another major change occurred in the nature of American magazines. Beginning in the 1880s, new kinds of magazines appeared that were designed for a mass audience. One of the pioneers was Edward W. Bok, who took over the Ladies’ Home Journal in 1889 and, by targeting a mass female audience, built its circulation to over 700,000.


Some writers and artists—the local-color writers of the South, for example, and Mark Twain, in such novels as Huckleberry Finn and Tom Sawyer—responded to the new industrial civilization by evoking an older, more natural world. But others grappled directly with the modern order.
One of the strongest impulses in late-nineteenth- and early-twentieth-century American literature was the effort to recreate urban social reality. This trend toward realism found an early voice in Stephen Crane, who—although best known for his novel of the Civil War, The Red Badge of Courage (1895)—was the author of an earlier, powerful indictment of the plight of the working class. Crane created a sensation in 1893 when he published Maggie: A Girl of the Streets, a grim picture of urban poverty and slum life. Theodore Dreiser was even more influential in encouraging writers to abandon the genteel traditions of earlier times and turn to the social dislocations and injustices of the present. He did so both in Sister Carrie and in other, later novels (including An American Tragedy, published in 1925).
Many of Dreiser’s contemporaries followed him in chronicling the oppression of America’s poor. In 1901 Frank Norris published The Octopus, an account of a struggle between oppressed wheat farmers and powerful railroad interests in California. The socialist writer Upton Sinclair published The Jungle in 1906, a novel designed to reveal the depravity of capitalism. It exposed abuses in the American meatpacking industry; and while it did not inspire the kind of socialist response for which Sinclair had hoped, it did help produce legislative action to deal with the problem. Kate Chopin, a southern writer who explored the oppressive features of traditional marriage, encountered widespread public abuse after publication of her shocking novel The Awakening in 1899. It described a young wife and mother who abandons her family in search of personal fulfillment. It was formally banned in some communities. William Dean Howells, in The Rise of Silas Lapham (1885) and other works, described what he considered the shallowness and corruption in the search for wealth.
Other critics of American society responded to the new civilization not by attacking it but by withdrawing from it. The historian Henry Adams published a classic autobiography in 1906, The Education of Henry Adams, in which he portrayed himself as a man disillusioned with and unable to relate to his society, even though he continued to live in it. The novelist Henry James lived the major part of his adult life in England and Europe and produced a series of coldly realistic novels—The American (1877), The Portrait of a Lady (1881), The Ambassadors (1903), and others—that showed his ambivalence about the character of modern, industrial civilization.
The growing popularity of literature helped spawn a remarkable network of clubs, mostly formed and populated by women, to bring readers together to talk about books. Reading clubs proliferated rapidly in cities and even small towns, among African American as well as white women. They made literature a social experience for hundreds of thousands of women and created a tradition that has continued into the twenty-first century.


American art through most of the nineteenth century had been overshadowed by the art of Europe. Many American artists studied and even lived in Europe. But others broke from the Old World traditions and experimented with new styles. Winslow Homer was vigorously American in his paintings of New England maritime life and other native subjects. James McNeill Whistler was one of the first Western artists to appreciate the beauty of Japanese color prints and to introduce Oriental concepts into American and European art.
By the first years of the new century, some American artists were turning decisively away from the traditional academic style, a style perhaps best exemplified in America by the brilliant portraitist John Singer Sargent. Instead, many younger painters were exploring the same grim aspects of modern life that were becoming the subject of American literature. Members of the so-called Ashcan school produced work startling in its naturalism and stark in its portrayal of the social realities of the era. John Sloan portrayed the dreariness of American urban slums; George Bellows caught the vigor and violence of his time in paintings and drawings of prize fights; Edward Hopper explored the starkness and loneliness of the modern city. The Ashcan artists were also among the first Americans to appreciate expressionism and abstraction; and they showed their interest in new forms in 1913 when they helped stage the famous and controversial Armory Show in New York City, which displayed works of the French Postimpressionists and of some American moderns.
The work of these and other artists marked the beginning in America of an artistic movement known as modernism, a movement that had counterparts in many other areas of cultural and intellectual life as well. Rejecting the heavy reliance on established forms that characterized the “genteel tradition” of the nineteenth-century art world, modernists embraced new subjects and new forms. Where the genteel tradition emphasized the “dignified” and “elevated” aspects of civilization (and glorified the achievements of gifted elites), modernism gloried in the ordinary, even the coarse. Where the genteel tradition placed great importance on respect for the past and the maintenance of “standards,” modernism looked to the future and gloried in the new. Eventually, modernism developed strict orthodoxies of its own. But in its early stages, it seemed to promise an escape from rigid, formal traditions and an unleashing of individual creativity.


The single most profound intellectual development in the late nineteenth century was the widespread acceptance of the theory of evolution, associated most prominently with the English naturalist Charles Darwin. Darwinism argued that the human species had evolved from earlier forms of life (and most recently from simian creatures similar to apes) through a process of “natural selection.” It challenged the biblical story of the Creation and almost every other tenet of traditional American religious faith. History, Darwinism suggested, was not the working out of a divine plan, as most Americans had always believed. It was a random process dominated by the fiercest or luckiest competitors.
The theory of evolution met widespread resistance at first from educators, theologians, and even many scientists. By the end of the century, however, the evolutionists had converted most members of the urban professional and educated classes. Even many middle-class Protestant religious leaders had accepted the doctrine, making significant alterations in theology to accommodate it. Evolution had become enshrined in schools and universities; few serious scientists any longer questioned its basic validity.
Unseen by most urban Americans at the time, however, the rise of Darwinism was contributing to a deep schism between the new, cosmopolitan culture of the city—which was receptive to new ideas such as evolution—and a more traditional, provincial culture located mainly (although not wholly) in rural areas—which remained wedded to more fundamentalist religious beliefs and older values. Thus the late nineteenth century saw not only the rise of a liberal Protestantism in tune with new scientific discoveries but also the beginning of an organized Protestant fundamentalism, rejecting evolution, which would make its presence felt politically in the 1920s and again in the late twentieth century and beyond.
Darwinism helped spawn other new intellectual currents. There was the Social Darwinism of William Graham Sumner and others, which industrialists used so enthusiastically to justify their favored position in American life. But there were also more sophisticated philosophies, among them a doctrine that became known as “pragmatism,” which seemed peculiarly a product of America’s changing material civilization. William James, a Harvard psychologist (and brother of the novelist Henry James), was the most prominent publicist of the new theory, although earlier intellectuals such as Charles S. Peirce and later ones such as John Dewey were also important to its development and dissemination. According to the pragmatists, modern society should rely for guidance not on inherited ideals and moral principles but on the test of scientific inquiry. No idea or institution (not even religious faith) was valid, they claimed, unless it worked and unless it stood the test of experience. “The ultimate test for us of what a truth means,” James wrote, “is the conduct it dictates or inspires.”
A similar concern for scientific inquiry was intruding into the social sciences and challenging traditional orthodoxies. Economists such as Richard T. Ely and Simon Patten argued for a more active and pragmatic use of scientific discipline. Sociologists such as Edward A. Ross and Lester Frank Ward urged applying the scientific method to the solution of social and political problems. Historians such as Frederick Jackson Turner and Charles Beard argued that economic factors more than spiritual ideals had been the governing force in historical development. John Dewey proposed a new approach to education that placed less emphasis on the rote learning of traditional knowledge and more on a flexible, democratic approach to schooling, one that enabled students to acquire knowledge that would help them deal with the realities of their society.
The relativistic implications of Darwinism also promoted the growth of anthropology and encouraged some scholars to begin examining other cultures—most significantly, perhaps, the culture of American Indians—in new ways. A few white Americans began to look at Indian society as a coherent culture with its own norms and values that were worthy of respect and preservation, even though different from those of white society. But such ideas about Native Americans found very little support outside a few corners of the intellectual world until much later in the twentieth century.


A society that was coming to depend increasingly on specialized skills and scientific knowledge was, of course, a society with a high demand for education. The late nineteenth century, therefore, was a time of rapid expansion and reform of American schools and universities.
One example was the spread of free public primary and secondary education. In 1860, there were only 100 public high schools in the entire United States. By 1900, the number had reached 6,000, and by 1914 over 12,000. By 1900, compulsory school attendance laws were in effect in thirty-one states and territories. But education was still far from universal. Rural areas lagged far behind urban-industrial ones in funding public education. And in the South, many blacks had no access to schools.
Educational reformers, few of whom shared the more relativistic views of anthropologists, sought to provide educational opportunities for the Indian tribes as well, in an effort to “civilize” them and help them adapt to white society. In the 1870s, reformers recruited small groups of Indians to attend Hampton Institute, a primarily black college. In 1879, Richard Henry Pratt, a former army officer, organized the Carlisle Indian Industrial School in Pennsylvania. Carlisle emphasized the kind of practical “industrial” education that Booker T. Washington had urged at his school at Tuskegee. Equally important, it isolated Indians from their tribes and tried to force them to assimilate to white norms. The purpose, Pratt said, was to “kill the Indian and save the man.” Carlisle inspired other, similar schools in the West. Ultimately, the reform efforts failed, both because of Indian resistance and because of inadequate funding, incompetent administration, and poor teaching.
Colleges and universities were also proliferating rapidly in the late nineteenth century. They benefited particularly from the Morrill Land Grant Act of the Civil War era, by which the federal government had donated land to states for the establishment of colleges. After 1865, states in the South and West took particular advantage of the law. In all, sixty-nine “land-grant” institutions were established in the last decades of the century—among them the state university systems of California, Illinois, Minnesota, and Wisconsin.
Other universities benefited from millions of dollars contributed by business and financial tycoons. Rockefeller, Carnegie, and others gave generously to such schools as the University of Chicago, Columbia, Harvard, Northwestern, Princeton, Syracuse, and Yale. Other philanthropists founded new universities or reorganized and renamed older ones to perpetuate their family names—Vanderbilt, Johns Hopkins, Cornell, Duke, Tulane, and Stanford.


The post–Civil War era saw, too, an important expansion of educational opportunities for women, although such opportunities continued to lag far behind those available to men and were almost always denied to black women.
Most public high schools accepted women readily, but opportunities for higher education were few. At the end of the Civil War, only three American colleges were coeducational. In the years after the war, some of the land-grant colleges and universities in the Midwest and such private universities as Cornell and Wesleyan began to admit women along with men. But coeducation provided fewer opportunities than the creation of a network of women’s colleges. Mount Holyoke, which had begun its life in 1836 as a “seminary” for women, became a full-fledged college in the 1880s. At about the same time, new female institutions were emerging: Vassar, Wellesley, Smith, Bryn Mawr, Wells, and Goucher. A few of the larger private universities created separate colleges for women on their campuses (Barnard at Columbia and Radcliffe at Harvard, for example). Proponents of women’s colleges saw the institutions as places where female students would not be treated as “second-class citizens” by predominantly male student bodies and faculties.
The female college was part of an important phenomenon in the history of modern American women: the emergence of a distinctive women’s community. Most faculty members and many administrators were women (often unmarried). And the life of the college produced a spirit of sorority and commitment among educated women that had important effects in later years, as women became the leaders of many reform activities. Most female college graduates eventually married, but many married at a later age than their noncollege-educated counterparts and in some cases continued to pursue careers after marriage and motherhood. A significant minority, perhaps over 25 percent, did not marry, but devoted themselves exclusively to careers. A leader at Bryn Mawr remarked, “Our failures marry.” That was surely rhetorical excess. The growth of female higher education clearly became for some women a liberating experience, persuading them that they had roles to perform in society in addition to those of wives and mothers.