Racial and Ethnic Inequality

By Schaefer, R.T.

Edited by Paul Ducham


A numerical minority is any group that makes up less than half of some larger population. The population of the United States includes thousands of numerical minorities, including television actors, green-eyed people, tax lawyers, and descendants of the Pilgrims who arrived on the Mayflower. However, these numerical minorities are not considered to be minorities in the sociological sense; in fact, the number of people in a group does not necessarily determine its status as a social minority (or a dominant group). When sociologists define a minority group, they are concerned primarily with the economic and political power, or powerlessness, of that group. A minority group is a subordinate group whose members have significantly less control or power over their own lives than the members of a dominant or majority group have over theirs.
Sociologists have identified five basic properties of a minority group: unequal treatment, physical or cultural traits, ascribed status, solidarity, and in-group marriage (Wagley and Harris 1958):

1. Members of a minority group experience unequal treatment compared to members of a dominant group. For example, the management of an apartment complex may refuse to rent to African Americans, Hispanics, or Jews. Social inequality may be created or maintained by prejudice, discrimination, segregation, or even extermination.
2. Members of a minority group share physical or cultural characteristics that distinguish them from the dominant group. Each society arbitrarily decides which characteristics are most important in defining groups.
3. Membership in a minority (or dominant) group is not voluntary; people are born into the group. Thus, race and ethnicity are considered ascribed statuses.
4. Minority group members have a strong sense of group solidarity. William Graham Sumner, writing in 1906, noted that people make distinctions between members of their own group (the in-group) and everyone else (the out-group). When a group is the object of long-term prejudice and discrimination, the feeling of “us versus them” can and often does become extremely intense.
5. Members of a minority group generally marry others from the same group. A member of a dominant group is often unwilling to marry into a supposedly inferior minority group. In addition, the minority group’s sense of solidarity encourages marriage within the group and discourages marriage to outsiders.


Many people think of race as a series of biological classifications. However, research shows that this is not a meaningful way of differentiating people. Genetically, there are no systematic differences between the races that affect people’s social behavior and abilities. Instead, sociologists use the term racial group to refer to those minorities (and the corresponding dominant groups) who are set apart from others by obvious physical differences. But what is an “obvious” physical difference? Each society labels those differences that people consider important, while ignoring other characteristics that could serve as a basis for social differentiation.

Social Construction of Race

Because race is a social construction, the process of defining races typically benefits those who have more power and privilege than others. In the United States, we see differences in both skin color and hair color. Yet people learn informally that differences in skin color have a dramatic social and political meaning, whereas differences in hair color do not.
When observing skin color, many people in the United States tend to lump others rather casually into the traditional categories of “Black,” “White,” and “Asian.” Subtle differences in skin color often go unnoticed. In many nations of Central America and South America, in contrast, people recognize color gradients on a continuum from light to dark skin color. Brazil has approximately 40 color groupings, while in other countries people may be described as “Mestizo Hondurans,” “Mulatto Colombians,” or “African Panamanians.” What we see as “obvious” differences, then, are subject to each society’s social definitions.
The largest racial minorities in the United States are African Americans (or Blacks), Native Americans (or American Indians), and Asian Americans (Japanese Americans, Chinese Americans, and other Asian peoples). Figure 10-1 provides information about the population of racial and ethnic groups in the United States over the past five centuries.
Given current population patterns, it is clear that the nation’s diversity will continue to increase. In 2011, census data revealed for the first time that the majority of all children ages three and under were either Hispanic or non-White. This turning point marks the beginning of a pattern in which the nation’s minority population will slowly become the majority. By 2050, if not sooner, the majority of all school-age children in the United States will belong to racial or ethnic minority groups (Frey 2011).
Racial definitions are crystallized through what Michael Omi and Howard Winant (1994) have called racial formation, a sociohistorical process in which racial categories are created, inhabited, transformed, and destroyed. In this process, those who have power define groups of people according to a racist social structure. The creation of a reservation system for Native Americans in the late 1800s is one example of racial formation. Federal officials combined what were distinctive tribes into a single racial group, which we refer to today as Native Americans. Racial formation is so extensive and so frequent that no one escapes it.
Another example of racial formation from the 1800s was known as the “one-drop rule.” If a person had even a single drop of “Black blood,” that person was defined and viewed as Black, even if he or she appeared to be White. Clearly, race had social significance, enough so that White legislators established official standards about who was “Black” and who was “White.”
The one-drop rule was a vivid example of the social construction of race —the process by which people come to define a group as a race based in part on physical characteristics, but also on historical, cultural, and economic factors. For example, in the 1800s, immigrant groups such as Italian and Irish Americans were not at first seen as being “White,” but as foreigners who were not necessarily trustworthy. The social construction of race is an ongoing process that is subject to debate, especially in a diverse society such as the United States, where each year increasing numbers of children are born to parents of different racial backgrounds.
Recognition of Multiple Identities

In 1900, in an address to the Anti-Slavery Union in London, scholar W. E. B. DuBois predicted that “the color line” would become the foremost problem of the 20th century. DuBois, born a free Black man in 1868, had witnessed prejudice and discrimination throughout the United States. His comment was prophetic. Today, over a century later, race and ethnicity still carry enormous weight in the United States (DuBois [1900] 1969).
The color line has blurred significantly since 1900, however. Interracial marriage is no longer forbidden by law and custom. Thus, Geetha Lakshminarayanan, a native of Ann Arbor, Michigan, is both White and Asian Indian. Often mistaken for a Filipina or Latina, she has grown accustomed to the blunt question “What are you?” (Navarro 2005).
In the late 20th century, with immigration from Latin America rising, the fluid nature of racial formation became evident. Suddenly, people were speaking about the “Latin Americanization” of the United States, or about a biracial, Black/White society being replaced by a triracial one. In the 2010 Census, over 9 million people in the United States (or about 2.9 percent of the population) reported that they were of two or more races. Half the people classified as multiracial were under age 18, suggesting that this segment of the population will grow in the years to come. People who claimed both White and American Indian ancestry were the largest group of multiracial residents (Bonilla-Silva 2004; Humes et al. 2011).
This statistical finding of millions of multiracial people obscures how individuals are often asked to handle their identity. For example, the enrollment forms for government programs typically include only a few broad racial-ethnic categories. This approach to racial categorization is part of a long history that dictates single-race identities. Still, many individuals, especially young adults, struggle against social pressure to choose a single identity, and instead openly embrace multiple heritages. Public figures, rather than hide their mixed ancestry, now flaunt it. Singer Mariah Carey celebrates her Irish American background, and President Barack Obama speaks of being born in Hawaii to a Kenyan father and a White mother from Kansas. Tiger Woods, the world’s best-known professional golfer, considers himself both Asian and African American.

FIG 10-1


An ethnic group, unlike a racial group, is set apart from others because of its national origin or distinctive cultural patterns. Among the ethnic groups in the United States are peoples with a Spanish-speaking background, referred to collectively as Latinos or Hispanics, such as Puerto Ricans, Mexican Americans, Cuban Americans, and other Latin Americans. Other ethnic groups in this country include Jewish, Irish, Italian, and Norwegian Americans. Although these groupings are convenient, they serve to obscure differences within ethnic categories (as in the case of Hispanics), as well as to overlook the mixed ancestry of so many people in the United States.
The distinction between racial and ethnic minorities is not always clear-cut. Some members of racial minorities, such as Asian Americans, may have significant cultural differences from other racial groups. At the same time, certain ethnic minorities, such as Latinos, may have obvious physical differences that set them apart from other ethnic groups in the United States.
Despite categorization problems, sociologists continue to feel that the distinction between racial groups and ethnic groups is socially significant. In most societies, including the United States, socially constructed physical differences tend to be more visible than ethnic differences. Partly as a result of this fact, stratification along racial lines is more resistant to change than stratification along ethnic lines. Over time, members of an ethnic minority can sometimes become indistinguishable from the majority—although the process may take generations and may never include all members of the group. In contrast, members of a racial minority find it much more difficult to blend in with the larger society and gain acceptance from the majority.


Prejudice is a negative attitude toward an entire category of people, often an ethnic or racial minority. If you resent your roommate because he or she is sloppy, you are not necessarily guilty of prejudice. However, if you immediately stereotype your roommate on the basis of such characteristics as race, ethnicity, or religion, that is a form of prejudice. Prejudice tends to perpetuate false definitions of individuals and groups.
Sometimes prejudice results from ethnocentrism —the tendency to assume that one’s own culture and way of life represent the norm or are superior to all others. Ethnocentric people judge other cultures by the standards of their group, which leads quite easily to prejudice against cultures they view as inferior.
One important and widespread ideology that reinforces prejudice is racism, the belief that one race is supreme and all others are innately inferior. When racism prevails in a society, members of subordinate groups generally experience prejudice, discrimination, and exploitation. In 1990, as concern mounted about racist attacks in the United States, Congress passed the Hate Crimes Statistics Act. As a result, hate crimes are now beginning to be reported and investigated in much the same way as conventional crimes against property and people.
Prejudice is also rooted in racial and ethnic stereotypes—unreliable generalizations about all members of a group that do not recognize individual differences within the group. The dominant or majority group creates these stereotypes through the process of racial formation. As the interactionist William I. Thomas (1923) noted, the dominant group’s “definition of the situation” is often so powerful, it can mold the individual personality. That is, people respond not only to the objective features of a situation or person, but to the social meaning that situation or person carries. Thus, the false images or stereotypes created by the dominant group can become real in their consequences.


Over the past three generations, nationwide surveys have consistently shown growing support among Whites for integration, interracial dating, and the election of minority group members to public office—including the presidency of the United States. How can this trend be explained, given the persistence of residential segregation and the commission of thousands of hate crimes every year? The answer, to some extent, is that prejudice and discriminatory attitudes are no longer expressed as freely as they once were. Often, they are couched in terms of equal opportunity.
Color-blind racism is the use of the principle of race neutrality to defend a racially unequal status quo. Proponents of race neutrality claim they believe that everyone should be treated equally. However, the way they apply the principle to government policy is anything but neutral. Proponents of this approach oppose affirmative action (see pages 237–238), public welfare assistance, and to a large extent, government-funded health insurance, all of which they see largely as favors to minority groups. Yet they do not object to practices that privilege Whites, such as college admissions criteria that give preference to the relatives of alumni. Nor do they oppose tax breaks for homeowners, most of whom are White, or government financial aid to college students, who are also disproportionately White. Though race neutrality is not based on theories of racial superiority or inferiority, then, the idea that society should be color-blind only perpetuates racial inequality.
Color-blind racism has also been referred to as “covert racism.” Although its proponents rarely speak of racism, other indicators of social status, such as social class or citizenship, tend to become proxies for race. Thus, many White people can convince themselves that they are not racist—nor do they know anyone who is—and yet remain prejudiced against “welfare mothers” and “immigrants.” They can conclude, mistakenly, that racial tolerance, or even racial and ethnic equality, has been achieved.
Researchers who have surveyed White attitudes toward African Americans over the past several decades have reached two inescapable conclusions. First, people’s attitudes do change. In periods of social upheaval, dramatic attitudinal shifts can occur within a single generation. Second, less racial progress was made in the late 20th and early 21st centuries than in the relatively brief period of the 1950s and 1960s. Today, economically disadvantaged groups such as African Americans and Latinos have become so closely associated with urban decay, homelessness, welfare, and crime that those problems are now viewed as racial issues, even if they are not labeled as such. The tendency to blame the victims of these social ills complicates their resolution, especially at a time when government’s ability to address social problems is limited by recession, antitax initiatives, and concern over terrorism. In short, the color line is still in place, even if more and more people refuse to acknowledge its existence (Ansell 2008; Bonilla-Silva 2006; Coates 2008; M. King 2007:3–4; Quillian 2006; Winant 1994:106–108).


Prejudice often leads to discrimination, the denial of opportunities and equal rights to individuals and groups because of prejudice or other arbitrary reasons. Say that a White corporate president with a prejudice against Asian Americans has to fill an executive position. The most qualified candidate for the job is a Vietnamese American. If the president refuses to hire this candidate and instead selects a less qualified White candidate, he or she is engaging in an act of racial discrimination.
Prejudiced attitudes should not be equated with discriminatory behavior. Although the two are generally related, they are not identical; either condition can be present without the other. A prejudiced person does not always act on his or her biases. The White corporate president, for example, might choose—despite his or her stereotypes—to hire the Vietnamese American. That would be prejudice without discrimination. On the other hand, a White corporate president with a completely respectful view of Vietnamese Americans might refuse to hire them for executive posts out of fear that biased clients would take their business elsewhere. In that case, the president’s action would constitute discrimination without prejudice.
A field experiment by sociologist Devah Pager, then a doctoral candidate at the University of Wisconsin–Madison, documented racial discrimination in hiring. Pager sent four polite, well-dressed young men out to look for an entry-level job in Milwaukee, Wisconsin. All were 23-year-old college students, but they presented themselves as high school graduates with similar job histories. Two of the men were Black and two were White. One Black applicant and one White applicant claimed to have served 18 months in jail for a felony conviction—possession of cocaine with intent to distribute.
As one might expect, the four men’s experiences with 350 potential employers were vastly different. The White applicant with a purported prison record received only half as many callbacks as the other White applicant—17 percent compared to 34 percent. But as dramatic as the effect of his criminal record was, the effect of his race was more significant. Despite his prison record, he received slightly more callbacks than the Black applicant with no criminal record (17 percent compared to 14 percent). Race, it seems, was more of a concern to potential employers than a criminal background. The implications of this research are not limited to any one city, such as Milwaukee. Similar studies have confirmed discriminatory handling of job applications in Chicago; New York City; Long Island, New York; San Diego; and Washington, D.C. Over time, the cumulative effect of such differential behavior by employers contributes to significant differences in income. Figure 10-2 vividly illustrates the income inequality between White men and almost everyone else (Pager 2007; Pager et al. 2009).
If race serves as a barrier, why do Asian American men earn slightly more income than White men (see Figure 10-2)? Not all Asian American men earn high incomes; indeed, some Asian American groups, such as Laotians and Vietnamese, have high levels of poverty. Nevertheless, a significant number of Asian Americans have advanced degrees that qualify them for highly paid jobs, which raise the median income for the group as a whole. Even so, these highly educated Asian Americans do not earn quite as much as their White counterparts. With a doctorate holder in the family, the typical Asian American household earns an estimated $130,000, compared to $140,000 for a White household.
Sometimes racial and ethnic discrimination is overt. Internet forums like Craigslist.org or Roommate.com feature classified ads that state “African Americans and Arabians tend to clash with me” or “Clean, Godly Christian men only.” While anti-discrimination laws prevent such notices from being published in the newspapers, existing law has not caught up with online bigotry in hiring and renting (Liptak 2006).
Discrimination persists even for the most educated and qualified minority group members from the best family backgrounds. Despite their talents and experiences, they sometimes encounter attitudinal or organizational bias that prevents them from reaching their full potential. The term glass ceiling refers to an invisible barrier that blocks the promotion of a qualified individual in a work environment because of the individual’s gender, race, or ethnicity (R. Schaefer 2012; Yamagata et al. 1997).
In early 1995, the federal Glass Ceiling Commission issued the first comprehensive study of barriers to promotion in the United States. The commission found that glass ceilings continue to block women and minority group men from top management positions in the nation’s industries.


One aspect of discrimination that is often overlooked is the privileges that dominant groups enjoy at the expense of others. For instance, we tend to focus more on the difficulty women have getting ahead at work and getting help at home than on the ease with which men manage to make their way in the world and avoid household chores. Similarly, we concentrate more on discrimination against racial and ethnic minorities than on the advantages members of the White majority enjoy. Indeed, most White people rarely think about their “Whiteness,” taking their status for granted.
Sociologists and other social scientists are becoming increasingly interested in what it means to be “White,” for White privilege is the other side of the proverbial coin of racial discrimination. In this context, White privilege refers to rights or immunities granted to people as a particular benefit or favor simply because they are White (Ferber and Kimmel 2008). This view of whiteness as a privilege echoes an observation by W. E. B. DuBois, that rather than wanting fair working conditions for all laborers, White workers had accepted the “public and psychological wage” of whiteness ([1935] 1962:700).
The feminist scholar Peggy McIntosh (1988) became interested in White privilege after noticing that most men would not acknowledge that there were privileges attached to being male—even if they would agree that being female had its disadvantages. Did White people suffer from a similar blind spot regarding their racial privilege? she wondered. Intrigued, McIntosh began to list all the ways in which she benefited from her Whiteness. She soon realized that the list of unspoken advantages was long and significant.
McIntosh found that as a White person, she rarely needed to step out of her comfort zone, no matter where she went. If she wished to, she could spend most of her time with people of her race. She could find a good place to live in a pleasant neighborhood, buy the foods she liked to eat from almost any grocery store, and get her hair styled in almost any salon. She could attend a public meeting without feeling that she did not belong, that she was different from everyone else.
McIntosh discovered, too, that her skin color opened doors for her. She could cash checks and use credit cards without suspicion, browse through stores without being shadowed by security guards. She could be seated without difficulty in a restaurant. If she asked to see the manager, she could assume he or she would be of her race. If she needed help from a doctor or a lawyer, she could get it.
McIntosh also realized that her Whiteness made the job of parenting easier. She did not need to worry about protecting her children from people who didn’t like them. She could be sure that their schoolbooks would show pictures of people who looked like them, and that their history texts would describe White people’s achievements. She knew that the television programs they watched would include White characters.
Finally, McIntosh had to admit that others did not constantly evaluate her in racial terms. When she appeared in public, she didn’t need to worry that her clothing or behavior might reflect poorly on White people. If she was recognized for an achievement, it was seen as her achievement, not that of an entire race. And no one ever assumed that the personal opinions she voiced should be those of all White people. Because McIntosh blended in with the people around her, she wasn’t always onstage.
These are not all the privileges White people take for granted as a result of their membership in the dominant racial group in the United States. As Devah Pager’s study showed (see page 234), White job seekers enjoy a tremendous advantage over equally well-qualified—even better-qualified—Blacks. Whiteness does carry privileges—to a much greater extent than most White people realize (Fitzgerald 2008; Picca and Feagin 2007).


Discrimination is practiced not only by individuals in one-to-one encounters but also by institutions in their daily operations. Social scientists are particularly concerned with the ways in which structural factors such as employment, housing, health care, and government operations maintain the social significance of race and ethnicity. Institutional discrimination refers to the denial of opportunities and equal rights to individuals and groups that results from the normal operations of a society. This kind of discrimination consistently affects certain racial and ethnic groups more than others.
The Commission on Civil Rights (1981:9–10) has identified various forms of institutional discrimination:

• Rules requiring that only English be spoken at a place of work, even when it is not a business necessity to restrict the use of other languages.
• Preferences shown by law and medical schools in the admission of children of wealthy and influential alumni, nearly all of whom are White.
• Restrictive employment-leave policies, coupled with prohibitions on part-time work, that make it difficult for the heads of single-parent families (most of whom are women) to obtain and keep jobs.

In some cases, even seemingly neutral institutional standards can have discriminatory effects. African American students at a midwestern state university protested a policy under which fraternities and sororities that wished to use campus facilities for a dance were required to pay a $150 security deposit to cover possible damages. They complained that the policy had a discriminatory impact on minority student organizations. Campus police countered that the university’s policy applied to all student groups interested in using the facilities. However, since the overwhelmingly White fraternities and sororities at the school had their own houses, which they used for dances, the policy indeed affected only the African American and other minority organizations.
Significant potential for institutional discrimination exists in local laws that govern voting. States have begun requiring voters to show a photo ID (see Figure 10-3), presumably to prevent voter fraud. However, there is little evidence that people have been impersonating eligible voters at the polls. Such laws disproportionately disenfranchise members of minority groups simply because they do not have a driver’s license. According to national surveys, 25 percent of African Americans and 16 percent of Latino citizens do not have a valid government-issued photo ID, compared to 8 percent of White citizens (Brennan Center 2006; Dade 2012).
Attempts have also been made to eradicate or compensate for other forms of discrimination in the United States. The 1960s saw the passage of many pioneering civil rights laws, including the landmark 1964 Civil Rights Act (which prohibits discrimination in public accommodations and publicly owned facilities on the basis of race, color, creed, national origin, and gender).
For more than 40 years, affirmative action programs have been instituted to overcome past discrimination. Affirmative action refers to positive efforts to recruit minority group members or women for jobs, promotions, and educational opportunities. Many people resent these programs, arguing that advancing one group’s cause merely shifts the discrimination to another group. By giving priority to African Americans in admissions, for example, schools may overlook more qualified White candidates. In many parts of the country and many sectors of the economy, affirmative action is being rolled back, even though it was never fully implemented.

Discriminatory practices continue to pervade nearly all areas of life in the United States today. In part, that is because various individuals and groups actually benefit from racial and ethnic discrimination in terms of money, status, and influence. Discrimination permits members of the majority to enhance their wealth, power, and prestige at the expense of others. Less qualified people get jobs and promotions simply because they are members of the dominant group. Such individuals and groups will not surrender these advantages easily. We’ll turn now to a closer look at this functionalist analysis, as well as the conflict, labeling, and interactionist perspectives on race and ethnicity.

FIG 10-3


What possible use could racial bigotry have? Functionalist theorists, while agreeing that racial hostility is hardly to be admired, point out that it serves positive functions for those who practice discrimination.
Anthropologist Manning Nash (1962) has identified three functions of racially prejudiced beliefs for the dominant group:

1. Racist views provide a moral justification for maintaining an unequal society that routinely deprives a minority group of its rights and privileges. Southern Whites justified slavery by believing that Africans were physically and spiritually subhuman and devoid of souls.
2. Racist beliefs discourage the subordinate minority from attempting to question its lowly status, which would be to question the very foundations of society.
3. Racial myths suggest that any major societal change (such as an end to discrimination) would only bring greater poverty to the minority and lower the majority’s standard of living. As a result, racial prejudice grows when a society’s value system (one underlying a colonial empire or slavery, for example) is threatened.

Although racial prejudice and discrimination may serve the powerful, such unequal treatment can also be dysfunctional for a society, and even for the dominant group. Sociologist Arnold Rose (1951) has outlined four dysfunctions that are associated with racism:

1. A society that practices discrimination fails to use the resources of all individuals. Discrimination limits the search for talent and leadership to the dominant group.
2. Discrimination aggravates social problems such as poverty, delinquency, and crime, and places the financial burden of alleviating those problems on the dominant group.
3. Society must invest a good deal of time and money to defend its barriers to the full participation of all members.
4. Racial prejudice and discrimination often undercut goodwill and friendly diplomatic relations between nations.


Conflict theorists would certainly agree with Arnold Rose that racial prejudice and discrimination have many harmful consequences for society. Sociologists such as Oliver Cox (1948), Robert Blauner (1972), and Herbert M. Hunter (2000) have used the exploitation theory (or Marxist class theory) to explain the basis of racial subordination in the United States. Karl Marx viewed the exploitation of the lower class as a basic part of the capitalist economic system. From a Marxist point of view, racism keeps minorities in low-paying jobs, thereby supplying the capitalist ruling class with a pool of cheap labor. Moreover, by forcing racial minorities to accept low wages, capitalists can restrict the wages of all members of the proletariat. Workers from the dominant group who demand higher wages can always be replaced by minorities who have no choice but to accept low-paying jobs.
The conflict view of race relations seems persuasive in a number of instances. Japanese Americans were the object of little prejudice until they began to enter jobs that brought them into competition with Whites. The movement to keep Chinese immigrants out of the United States became most fervent during the latter half of the 19th century, when Chinese and Whites fought over dwindling work opportunities. Both the enslavement of Blacks and the extermination and removal westward of Native Americans were economically motivated.
However, the exploitation theory is too limited to explain prejudice in its many forms. Not all minority groups have been exploited to the same extent. In addition, many groups (such as the Quakers and the Mormons) have been victimized by prejudice for other than economic reasons. Still, as Gordon Allport (1979:210) concludes, the exploitation theory correctly “points a sure finger at one of the factors involved in prejudice, . . . rationalized self-interest of the upper classes.”


One practice that fits both the conflict perspective and labeling theory is racial profiling. Racial profiling is any arbitrary action initiated by an authority based on race, ethnicity, or national origin rather than on a person’s behavior. Generally, racial profiling occurs when law enforcement officers, including customs officials, airport security, and police, assume that people who fit a certain description are likely to be engaged in illegal activities. Beginning in the 1980s with the emergence of the crack cocaine market, skin color became a key characteristic in racial profiling. This practice is often based on very explicit stereotypes. For example, one federal antidrug initiative encouraged officers to look specifically for people with dreadlocks and for Latino men traveling together.
Today, authorities continue to rely on racial profiling, despite overwhelming evidence that it is misleading. A recent study showed that Blacks are still more likely than Whites to be frisked and handled with force when they are stopped. Yet Whites are more likely than Blacks to possess weapons, illegal drugs, or stolen property (A. Farrell and McDevitt 2010).
Research on the ineffectiveness of racial profiling, coupled with calls by minority communities to end the stigmatization, has led to growing demands to end the practice. But these efforts came to an abrupt halt after the September 11, 2001, terrorist attacks on the United States, when suspicions arose about Muslim and Arab immigrants. Foreign students from Arab countries were summoned for special questioning by authorities. Legal immigrants who were identified as Arab or Muslim were scrutinized for possible illegal activity and prosecuted for violations that authorities routinely ignored among immigrants of other ethnicities and faiths. National surveys have found little change since 2001 in public support for profiling of Arab Americans at airports. In 2010, 53 percent of Americans favored “ethnic and religious profiling” of air travelers—even those who are U.S. citizens—together with more intensive security checks of passengers who fit certain profiles (Zogby 2010).


A Hispanic woman is transferred from a job on an assembly line to a similar position working next to a White man. At first, the White man is patronizing, assuming that she must be incompetent. She is cold and resentful; even when she needs assistance, she refuses to admit it. After a week, the growing tension between the two leads to a bitter quarrel. Yet over time, each slowly comes to appreciate the other’s strengths and talents. A year after they begin working together, these two workers become respectful friends. This story is an example of what interactionists call the contact hypothesis in action.
The contact hypothesis states that in cooperative circumstances, interracial contact between people of equal status will cause them to become less prejudiced and to abandon old stereotypes. People begin to see one another as individuals and discard the broad generalizations characteristic of stereotyping. Note the phrases equal status and cooperative circumstances. In the story just told, if the two workers had been competing for one vacancy as a supervisor, the racial hostility between them might have worsened (Allport 1979; Fine 2008).
As Latinos and other minorities slowly gain access to better-paying and more responsible jobs, the contact hypothesis may take on even greater significance. The trend in our society is toward increasing contact between individuals from dominant and subordinate groups. That may be one way of eliminating—or at least reducing—racial and ethnic stereotyping and prejudice. Another may be the establishment of interracial coalitions, an idea suggested by sociologist William Julius Wilson (1999). To work, such coalitions would obviously need to be built on an equal role for all members.
Table 10-2 summarizes the four major sociological perspectives on race. No matter what the explanation for racial and ethnic distinctions—functionalist, conflict, labeling, or interactionist—these socially constructed inequalities can have powerful consequences in the form of prejudice and discrimination. We will see how inequality based on the ascribed characteristics of race and ethnicity can poison people’s interpersonal relations, depriving whole groups of opportunities others take for granted.

TABLE 10-2


Separate schools, separate seating on buses and in restaurants, separate washrooms, even separate drinking fountains—these were all part of the lives of African Americans in the South when segregation ruled early in the 20th century. Segregation refers to the physical separation of two groups of people in terms of residence, workplace, and social events. Generally, a dominant group imposes this pattern on a minority group. Segregation is rarely complete, however. Intergroup contact inevitably occurs, even in the most segregated societies.
From 1948 (when the National Party came to power) to 1990, the Republic of South Africa severely restricted the movement of Blacks and other non-Whites by means of a wide-ranging system of segregation known as apartheid. Apartheid even included the creation of separate homelands where Blacks were expected to live. However, decades of local resistance to apartheid, combined with international pressure, led to marked political changes in the 1990s. In 1994, a prominent Black activist, Nelson Mandela, was elected South Africa’s president in the first election in which Blacks (the majority of the nation’s population) were allowed to vote. Mandela had spent almost 28 years in South African prisons for his anti-apartheid activities. His election was widely viewed as the final blow to South Africa’s oppressive policy of segregation.
In contrast to the enforced segregation in South Africa, the United States exemplifies an unmandated but nevertheless persistent separation of the races. In their book, American Apartheid, sociologists Douglas Massey and Nancy Denton (1993) described segregation in U.S. cities using 1990 census data. As the book’s title suggests, the racial makeup of U.S. neighborhoods resembles the rigid government-imposed segregation that prevailed for so long in South Africa.
Analysis of more recent census data shows the continuing segregation of U.S. cities, despite the nation’s growing racial and ethnic diversity. Across the board, the residential segregation of African Americans has declined only modestly since 2000; for Asian Americans and Latinos it has increased. Even allowing for social class, these patterns of residential segregation persist. One implication of this persistent segregation is that affluent Black and Latino households are much more likely than affluent White households to be located in or near poor neighborhoods. As a result, well-to-do minority group members have less immediate access than other groups to high-quality retail stores, restaurants, and services (Bureau of the Census 2010b; Farley 2004; Frey 2011; Logan 2011; Wilkes and Iceland 2004).


Amalgamation occurs when a majority group and a minority group combine to form a new group, typically through intermarriage over several generations. This pattern can be expressed as A + B + C → D, where A, B, and C represent different groups in a society, and D signifies the end result, a unique cultural-racial group unlike any of the initial groups (Newman 1973).
The belief in the United States as a “melting pot” became compelling in the first part of the 20th century, particularly since that image suggested that the nation had an almost divine mission to amalgamate various groups into one people. However, in actuality, many residents were not willing to include Native Americans, Jews, African Americans, Asian Americans, and Irish Roman Catholics in the melting pot. Therefore, this pattern does not adequately describe dominant–subordinate relations in the United States. There has been a significant increase in interracial marriage among Whites, Blacks, Asians, and Hispanics in recent years.


In India, many Hindus complain about Indian citizens who copy the traditions and customs of the British. In France, people of Arab and African origin, many of them Muslim, complain they are treated as second-class citizens—a charge that provoked riots in 2005. In Australia, Aborigines who have become part of the dominant society refuse to acknowledge their darker-skinned grandparents on the street. And in the United States, some Italian Americans, Polish Americans, Hispanics, and Jews have changed their ethnic-sounding family names to names that are typically found among White Protestant families.
Assimilation is the process through which a person forsakes his or her cultural tradition to become part of a different culture. Generally, it is practiced by a minority group member who wants to conform to the standards of the dominant group. Assimilation can be described as a pattern in which A + B + C → A. The majority, A, dominates in such a way that members of minorities B and C imitate it and attempt to become indistinguishable from it (Newman 1973).
A recent comparison study of immigrant groups in the United States, Canada, and Europe found that for the most part, assimilation has progressed further in the United States than in Europe, although more slowly than in Canada. In the United States, the rate of assimilation has generally been constant across groups. However, the recent recession has hampered new groups’ ability to move into a broad range of jobs (Myers and Pitkin 2011; Vigdor 2011).


In a pluralistic society, a subordinate group does not have to forsake its lifestyle and traditions to avoid prejudice or discrimination. Pluralism is based on mutual respect for one another’s cultures among the various groups in a society. This pattern allows a minority group to express its own culture and still participate without prejudice in the larger society. Earlier, we described amalgamation as A + B + C → D, and assimilation as A + B + C → A. Using this same approach, we can conceive of pluralism as A + B + C → A + B + C. All the groups coexist in the same society (Newman 1973).
In the United States, pluralism is more of an ideal than a reality. There are distinct instances of pluralism—the ethnic neighborhoods in major cities, such as Koreatown, Little Tokyo, Andersonville (Swedish Americans), and Spanish Harlem—yet there are also limits to cultural freedom. To survive, a society must promote a certain consensus among its members regarding basic ideals, values, and beliefs. Thus, if a Hungarian immigrant to the United States wants to move up the occupational ladder, he or she cannot avoid learning the English language. Switzerland exemplifies the modern pluralistic state. There, the absence of both a national language and a dominant religious faith leads to a tolerance for cultural diversity. In addition, various political devices safeguard the interests of ethnic groups in a way that has no parallel in the United States. In contrast, Great Britain has had difficulty achieving cultural pluralism in a multiracial society. East Indians, Pakistanis, and Blacks from the Caribbean and Africa experience prejudice and discrimination within the dominant White society there. Some British advocate cutting off all Asian and Black immigration, and a few even call for expulsion of those non-Whites currently living in Britain.


“I am an invisible man,” wrote Black author Ralph Ellison in his novel Invisible Man (1952:3). “I am a man of substance, of flesh and bone, fiber and liquids—and I might even be said to possess a mind. I am invisible, understand, simply because people refuse to see me.”
Over five decades later, many African Americans still feel invisible. Despite their large numbers, they have long been treated as second-class citizens. Currently, by the standards of the federal government, more than 1 out of every 4 African Americans—as opposed to 1 out of every 11 White non-Hispanics—is poor (DeNavas-Walt et al. 2011:15).
Contemporary institutional discrimination and individual prejudice against African Americans are rooted in the history of slavery in the United States. Many other subordinate groups had little wealth and income, but as sociologist W. E. B. DuBois ([1909] 1970) and others have noted, enslaved African Americans were in an even more oppressive situation, because by law they could not own property and could not pass on the benefits of their labor to their children. Today, increasing numbers of African Americans and sympathetic Whites are calling for slave reparations to compensate for the injustices of forced servitude. Reparations could include official expressions of apology from governments such as the United States, ambitious programs to improve African Americans’ economic status, or even direct payments to descendants of slaves (D. Williams and Collins 2004).
The end of the Civil War did not bring genuine freedom and equality for Blacks. The Southern states passed Jim Crow laws to enforce official segregation, and the Supreme Court upheld them as constitutional in 1896. In addition, Blacks faced the danger of lynching campaigns, often led by the Ku Klux Klan, during the late 1800s and early 1900s. From a conflict perspective, Whites maintained their dominance formally through legalized segregation and informally by means of vigilante terror and violence (Franklin and Higginbotham 2011).
During the 1960s, a vast civil rights movement emerged, with many competing factions and strategies for change. The Southern Christian Leadership Conference (SCLC), founded by Dr. Martin Luther King Jr., used nonviolent civil disobedience to oppose segregation. The National Association for the Advancement of Colored People (NAACP) favored use of the courts to press for equality for African Americans. But many younger Black leaders, most notably Malcolm X, turned toward an ideology of Black power. Proponents of Black power rejected the goal of assimilation into White middle-class society. They defended the beauty and dignity of Black and African cultures and supported the creation of Black-controlled political and economic institutions (Ture and Hamilton 1992).
Despite numerous courageous actions to achieve Black civil rights, Black and White citizens are still separate, still unequal. From birth to death, Blacks suffer in terms of their life chances. Life remains difficult for millions of poor Blacks, who must attempt to survive in ghetto areas shattered by high unemployment and abandoned housing. Today the median household income of Blacks is still 60 percent that of Whites, and the unemployment rate among Blacks is more than twice that of Whites.
Some African Americans—especially middle-class men and women—have made economic gains over the past 50 years. For example, data show that the number of African Americans in management increased nationally from 2.4 percent of the total in 1958 to 6.4 percent in 2010. Yet Blacks still represent only 7 percent or less of all physicians, engineers, scientists, lawyers, judges, and marketing managers.


Today, about 2.5 million Native Americans represent a diverse array of cultures distinguishable by language, family organization, religion, and livelihood. The outsiders who came to the United States—European settlers—and their descendants came to know these native peoples’ forefathers as “American Indians.” By the time the Bureau of Indian Affairs (BIA) was organized as part of the War Department in 1824, Indian–White relations had already included more than two centuries of hostile actions that had led to the virtual elimination of native peoples (see Figure 10-1). During the 19th century, many bloody wars wiped out a significant part of the Indian population. By the end of the century, schools for Indians—operated by the BIA or by church missions—prohibited the practice of Native American cultures. Yet at the same time, such schools did little to make the children effective members of White society.
Today, life remains difficult for members of the 554 tribal groups in the United States, whether they live in cities or on reservations. For example, one Native American teenager in six has attempted suicide—a rate four times higher than the rate for other teenagers. Traditionally, some Native Americans have chosen to assimilate and abandon all vestiges of their tribal cultures to escape certain forms of prejudice. However, by the 1990s, an increasing number of people in the United States were openly claiming a Native American identity. Since 1960, the federal government’s count of Native Americans has tripled.
Native Americans have made some progress in redressing their past mistreatment. In 2009, the federal government settled a 13-year-old lawsuit for the recovery of lease payments due on tribal lands used by the government for oil and gas exploration and grazing. Although the $3.4 billion settlement was large, it was long overdue—some of the government’s debts dated back to 1887—and from the perspective of tribal leaders, it was too little, too late. The United States is not the only country that has tried to redress the government’s past actions toward indigenous peoples (Box 10-1, page 244).
The introduction of gambling on Indian reservations has transformed the lives of some Native Americans. Native Americans got into the gaming industry in 1988, when Congress passed the Indian Gaming Regulatory Act. The law stipulates that states must negotiate agreements with tribes interested in commercial gaming; they cannot prevent tribes from engaging in gambling operations, even if state law prohibits such ventures. The income from these lucrative operations is not evenly distributed, however. About two-thirds of recognized Indian tribes are not involved in gambling ventures. Those tribes that earn substantial revenues from gambling constitute a small fraction of Native Americans (Conner and Taggart 2009).
Like Native Americans, other native peoples living in the United States have successfully established their autonomy, gaining control over their resources and business enterprises.


Asian Americans are a diverse group, one of the fastest-growing segments of the U.S. population (up 43 percent between 2000 and 2010). Among the many groups of Americans of Asian descent are Vietnamese Americans, Chinese Americans, Japanese Americans, and Korean Americans (Figure 10-6).
Asian Americans are also economically diverse. There are rich and poor Japanese Americans, rich and poor Filipino Americans, and so forth. In fact, Southeast Asians living in the United States have the highest rate of welfare dependency of any racial or ethnic group. According to a study published in 2011, poverty rates are particularly high among the adult children of Cambodian, Hmong, and Thai immigrants to the United States. Though Asian Americans have substantially more schooling than other ethnic groups, their median income is only slightly higher than Whites’ income, and their poverty rate is higher. In 2010, for every Asian American household with an annual income of $150,000 or more, there was another earning less than $20,000 a year (DeNavas-Walt et al. 2011:38; Takei and Sakamoto 2011).
The fact that as a group, Asian Americans work in the same occupations as Whites suggests that they have been successful—and many have. However, there are some differences between the two groups. Asian immigrants, like other minorities and immigrants before them, are found disproportionately in low-paying service occupations. At the same time, better-educated Asian Americans are concentrated near the top in professional and managerial positions, although they rarely reach the pinnacle. Instead, they hit the glass ceiling, or try to “climb a broken ladder,” as some put it.
Ironically, Asian Americans are often held up as an unqualified success story. According to popular belief, they have succeeded in adapting to mainstream U.S. culture despite past prejudice and discrimination, and without resorting to confrontations with Whites. Box 10-2 compares this common conception with Asian Americans’ actual experience.

Chinese Americans Unlike African slaves and Native Americans, the Chinese were initially encouraged to immigrate to the United States. From 1850 to 1880, thousands of Chinese immigrated to this country, lured by job opportunities created by the discovery of gold. However, as employment possibilities decreased and competition for mining jobs grew, the Chinese became the target of a bitter campaign to limit their numbers and restrict their rights. Chinese laborers were exploited, then discarded.
In 1882, Congress enacted the Chinese Exclusion Act, which prevented Chinese immigration and even forbade Chinese in the United States to send for their families. As a result, the Chinese population declined steadily until after World War II. More recently, the descendants of the 19th-century immigrants have been joined by a new influx from Hong Kong and Taiwan. These groups may contrast sharply in their degree of assimilation, desire to live in Chinatowns, and feelings about this country’s relations with the People’s Republic of China.
Currently, over 3 million Chinese Americans live in the United States. Some Chinese Americans have entered lucrative occupations, yet many immigrants struggle to survive under living and working conditions that belie the model-minority stereotype. New York City’s Chinatown district is filled with illegal sweatshops in which recent immigrants—many of them Chinese women—work for minimal wages. Outside of Chinatown, 23 percent of Asian Americans fall into the low-income category. At the other end of the income distribution, barely 5 percent of Chinatown’s residents earn more than $100,000 a year, compared to 25 percent of Asian Americans who live elsewhere in New York City (Logan et al. 2002; Wong 2006).
Asian Indians Immigrants from India and their descendants, the second-largest Asian American group after Chinese Americans, number over 2.9 million. It is difficult to generalize about Asian Indian Americans because Asian Indians are such a diverse population. India, a country of more than 1.2 billion people that is fast becoming the most populous nation in the world, is multiethnic. Perhaps because Asian Indian immigrants feel threatened by mainstream U.S. culture, religious orthodoxy is often stronger among first-generation immigrants to the United States than it is in India. New immigrants try to practice their religion just as they did in India rather than join congregations already established by other immigrant groups.
Maintaining family traditions is a major challenge for Asian Indian immigrants to the United States. Family ties remain strong despite their immigration—so much so that many Asian Indians feel more connected to their relatives in India than Americans do to relatives nearby. These Desi (pronounced day-see, colloquial for people who trace their ancestry to South Asia, especially India) are particularly concerned about the erosion of traditional family authority. Indian American children dress like their peers, go to fast-food restaurants, and even eat hamburgers, rejecting the vegetarian diet typical of both Hindus and many Asian Indian Muslims. Sons do not feel the extent of responsibility to the family that tradition dictates. Daughters, whose occupations and marriage partners the family could control in India, assert their right to choose their careers, and even their husbands (Rangaswamy 2005).

Filipino Americans Filipinos are the third-largest Asian American group in the United States, with over 2.6 million people. For geographic reasons, social scientists consider them to be of Asian extraction, but physically and culturally this group also reflects centuries of Spanish and U.S. colonial rule, as well as the more recent U.S. military occupation.
Filipinos began immigrating to the United States as American nationals when the U.S. government gained possession of the Philippine Islands at the end of the Spanish–American War (1899). When the Philippines gained their independence in 1946, Filipinos lost their unrestricted immigration rights, although farmworkers were welcome to work in Hawai’i’s pineapple groves. Aside from this exception, immigration was restricted to 50 to 100 Filipinos a year until 1965, when the Immigration Act lifted the strict quotas.
Today, a significant percentage of Filipino immigrants are well-educated professionals who work in the field of health care. Although they are a valuable human resource in the United States, their immigration has long drained the medical establishment in the Philippines. When the U.S. Immigration and Naturalization Service stopped giving preference to physicians, Filipino doctors began entering the country as nurses—a dramatic illustration of the incredible income differences between the two countries. Like other immigrant groups, Filipino Americans save much of their income and send a significant amount of money, called remittances, back to their extended families (Zarembo 2004b).
For several reasons, Filipino Americans have not coalesced in a single formal social organization, despite their numbers. Their strong loyalty to the family (sa pamilya) and to the church—particularly Roman Catholicism—reduces their need for a separate organization. Moreover, their diversity complicates the task of uniting the Filipino American community, which reflects the same regional, religious, and linguistic distinctions that divide their homeland. Thus, the many groups that Filipino Americans have organized tend to be club-like or fraternal in nature. Because those groups do not represent the general population of Filipino Americans, they remain largely invisible to Anglos. Although Filipinos remain interested in events in their homeland, they also seek to become involved in broader, non-Filipino organizations and to avoid exclusive activities (Bonus 2000; Kang 1996; Lau 2006; Padilla 2008).
Vietnamese Americans Vietnamese Americans came to the United States primarily during and after the Vietnam War—especially after U.S. withdrawal from the conflict in 1975. Refugees from the communist government in Vietnam, assisted by local agencies, settled throughout the United States, tens of thousands of them in small towns. Over time, however, Vietnamese Americans have gravitated toward the larger urban areas, establishing Vietnamese restaurants and grocery stores in their ethnic enclaves there.
In 1995, the United States resumed normal diplomatic relations with Vietnam. Gradually, the Viet Kieu, or Vietnamese living abroad, began to return to their old country to visit, but usually not to take up permanent residence. Today, more than 35 years after the end of the Vietnam War, sharp differences of opinion remain among Vietnamese Americans, especially the older ones, concerning the war and the present government of Vietnam (Pfeifer 2008).

Korean Americans At over 1.4 million, the population of Korean Americans now exceeds that of Japanese Americans. Yet Korean Americans are often overshadowed by other groups from Asia.
Today’s Korean American community is the result of three waves of immigration. The initial wave arrived between 1903 and 1910, when Korean laborers migrated to Hawai’i. The second wave followed the end of the Korean War in 1953; most of those immigrants were wives of U.S. servicemen and war orphans. The third wave, continuing to the present, has reflected the admissions priorities set up in the 1965 Immigration Act. These well-educated immigrants arrive in the United States with professional skills. Yet because of language difficulties and discrimination, many must settle at least initially for positions of lower responsibility than those they held in Korea and must suffer through a period of disenchantment. Stress, loneliness, and family strife may accompany the pain of adjustment.
In the early 1990s, the apparent friction between Korean Americans and another subordinate racial group, African Americans, attracted nationwide attention. Conflict between the two groups was dramatized in Spike Lee’s 1989 movie Do the Right Thing. The situation stemmed from Korean Americans’ position as the latest immigrant group to cater to the needs of inner-city populations abandoned by those who have moved up the economic ladder. This type of friction is not new; generations of Jewish, Italian, and Arab merchants have encountered similar hostility from what to outsiders seems an unlikely source—another oppressed minority (Kim 1999).

Japanese Americans Approximately 842,000 Japanese Americans live in the United States. As a people, they are relatively recent arrivals. In 1880, only 148 Japanese lived in the United States, but by 1920 there were more than 110,000. Japanese immigrants—called the Issei (pronounced ee-say), or first generation—were usually males seeking employment opportunities. Many Whites saw them (along with Chinese immigrants) as a “yellow peril” and subjected them to prejudice and discrimination.
In 1941, the attack on Hawai’i’s Pearl Harbor by Japan had severe repercussions for Japanese Americans. The federal government decreed that all Japanese Americans on the West Coast must leave their homes and report to “evacuation camps.” In effect, Japanese Americans became scapegoats for the anger that other people in the United States felt concerning Japan’s role in World War II. By August 1943, in an unprecedented application of guilt by virtue of ancestry, 113,000 Japanese Americans had been forced into hastily built camps. In striking contrast, only a few German Americans and Italian Americans were sent to evacuation camps (Hosokawa 1969).
In 1983, a federal commission recommended government payments to all surviving Japanese Americans who had been held in detention camps. The commission reported that the detention was motivated by “race prejudice, war hysteria, and a failure of political leadership.” It added that “no documented acts of espionage, sabotage, or fifth-column activity were shown to have been committed” by Japanese Americans. In 1988, President Ronald Reagan signed the Civil Liberties Act, which required the federal government to issue individual apologies for all violations of Japanese Americans’ constitutional rights, and established a $1.25 billion trust fund to pay reparations to the approximately 77,500 surviving Japanese Americans who had been interned (Department of Justice 2000).

FIG 10-6


Arab Americans are immigrants from the 22 nations of the Arab world, and their descendants. As defined by the League of Arab States, these are the nations of North Africa and what is popularly known as the Middle East, including Lebanon, Syria, Palestine, Morocco, Iraq, Saudi Arabia, and Somalia. Not all residents of those countries are Arab; for example, the Kurds, who live in northern Iraq, are not Arab. And some Arab Americans may have immigrated to the United States from non-Arab countries such as Great Britain or France, where their families have lived for generations.
The Arabic language is the single most unifying force among Arabs, although not all Arabs, and certainly not all Arab Americans, can read and speak Arabic. Moreover, the language has evolved over the centuries so that people in different parts of the Arab world speak different dialects. Still, the fact that the Koran (or Qur’an) was originally written in Arabic gives the language special importance to Muslims, just as the Torah’s compilation in Hebrew gives that language special significance to Jews.
Estimates of the size of the Arab American community differ widely. By some estimates, up to 3 million people of Arab ancestry reside in the United States. Among those who identify themselves as Arab Americans, the most common country of origin is Lebanon, followed by Syria, Egypt, and Palestine. In 2000, these four countries of origin accounted for two-thirds of all Arab Americans. Their rising numbers have led to the development of Arab retail centers in several cities, including Dearborn and Detroit, Michigan; Los Angeles; Chicago; New York City; and Washington, D.C.
As a group, Arab Americans are extremely diverse. Many families have lived in the United States for several generations; others are foreign born. Their points of origin range from the metropolis of Cairo, Egypt, to the rural villages of Morocco. Despite the stereotype, most Arab Americans are not Muslim ( Figure 10-7 ). Nor can Arab Americans be characterized as having a specific family type, gender role, or occupational pattern (David 2004, 2008).
In spite of this great diversity, profiling of potential terrorists at airports has put Arab and Muslim Americans under special surveillance. For years, a number of airlines and law enforcement authorities have used appearance and ethnic-sounding names to identify and take aside Arab Americans and search their belongings. After the terrorist attacks of September 2001, criticism of this practice declined as concern for the public’s safety mounted.

FIG 10-7



Together, the various groups included under the general category Latinos represent the largest minority in the United States. There are more than 50 million Hispanics in this country, including 30 million Mexican Americans, more than 4 million Puerto Ricans, and smaller numbers of Cuban Americans and people of Central and South American origin ( Figure 10-8 ). The latter group represents the fastest-growing and most diverse segment of the Hispanic community.
According to Census Bureau data, the Latino population now outnumbers the African American population in 6 of the 10 largest metropolitan areas of the United States: New York City, Los Angeles, Chicago, Dallas–Fort Worth, Houston, and Miami–Fort Lauderdale. The rise of the Hispanic population of the United States—fueled by comparatively high birthrates and immigration levels—has Latinos beginning to flex their muscles as voters. In the 2012 presidential election, Latinos accounted for more than 10 percent of eligible voters. As Hispanics age and immigrants become citizens, their share of the electorate will only continue to grow.
The various Latino groups share a heritage of Spanish language and culture, which can cause serious problems in their assimilation. An intelligent student whose first language is Spanish may be presumed slow or even unruly by English-speaking schoolchildren, and frequently by English-speaking teachers as well. The labeling of Latino children as underachievers, as learning disabled, or as emotionally disturbed can act as a self-fulfilling prophecy for some children. Bilingual education aims at easing the educational difficulties experienced by Hispanic children and others whose first language is not English.
The educational difficulties of Latino students certainly contribute to Hispanics’ generally low economic status. In 2008, about 17 percent of all Hispanic households earned less than $15,000, compared to 11 percent of White non-Hispanic households; the poverty rate was 25.3 percent for Hispanics, compared to 9.4 percent for White non-Hispanics. Although Latinos are not as affluent as White non-Hispanics, a middle class is beginning to emerge (DeNavas-Walt et al. 2010:15, 35, 38).

Mexican Americans The largest Latino population is Mexican Americans, who can be further subdivided into those descended from residents of the territories annexed after the Mexican American War of 1848 and those who have immigrated from Mexico to the United States. The opportunity for a Mexican to earn in one hour what it would take an entire day to earn in Mexico has pushed millions of legal and illegal immigrants north.
Many people view Mexican Americans as primarily an immigrant group. Since at least 2000, however, the number of Mexican Americans who were born in the United States has far exceeded those who immigrated here. Overall, Mexican Americans accounted for 42 percent of the nation’s population growth in the decade 2000–2010. Two-thirds of them were born here; the other third were new arrivals (Bureau of the Census 2011a:8; Pew Hispanic Center 2011).

Puerto Ricans The second-largest segment of Latinos in the United States is Puerto Ricans. Since 1917, residents of Puerto Rico have held the status of American citizens; many have migrated to New York and other eastern cities. Unfortunately, Puerto Ricans have experienced serious poverty both in the United States and on the island. Those who live in the continental United States earn barely half the family income of Whites. As a result, a reverse migration began in the 1970s, when more Puerto Ricans were leaving for the island than were coming to the mainland (Torres 2008).
Politically, Puerto Ricans in the United States have not been as successful as Mexican Americans in organizing for their rights. For many mainland Puerto Ricans—as for many residents of the island—the paramount political issue is the destiny of Puerto Rico itself: should it continue in its present commonwealth status, petition for admission to the United States as the 51st state, or attempt to become an independent nation? This question has divided Puerto Rico for decades and remains a central issue in Puerto Rican elections. In a 1998 referendum, voters supported a “none of the above” option, effectively favoring continuation of the commonwealth status over statehood or independence.

Cuban Americans Cuban immigration to the United States dates back as far as 1831, but it began in earnest following Fidel Castro’s assumption of power in the Cuban revolution (1959). The first wave of 200,000 Cubans included many professionals with relatively high levels of schooling; these men and women were largely welcomed as refugees from communist tyranny. However, more recent waves of immigrants have aroused growing concern, partly because they were less likely to be skilled professionals. Throughout these waves of immigration, Cuban Americans have been encouraged to locate around the United States. Nevertheless, many continue to settle in (or return to) metropolitan Miami, Florida, with its warm climate and proximity to Cuba.
The Cuban experience in the United States has been mixed. Some detractors worry about the vehement anticommunism of Cuban Americans and the apparent growth of an organized crime syndicate that engages in the drug trade and ganglike violence. Recently, Cuban Americans in Miami have expressed concern over what they view as the indifference of the city’s Roman Catholic hierarchy. Like other Hispanics, Cuban Americans are underrepresented in leadership positions within the church. Also—despite many individual success stories—as a group, Cuban Americans in Miami remain behind Whites in income, rate of employment, and proportion of professionals (Masud-Piloto 2008).

Central and South Americans Immigrants from Central and South America are a diverse population that has not been closely studied. Indeed, most government statistics treat members of this group collectively as “other,” rarely differentiating among them by nationality. Yet people from Chile and Costa Rica have little in common other than their hemisphere of origin and the Spanish language—if that. The fact is, not all Central and South Americans speak Spanish. Immigrants from Brazil, for example, speak Portuguese; immigrants from French Guiana speak French; and immigrants from Suriname speak Dutch.
Racially, many of the nations of Central and South America follow a complex classification system that recognizes a multitude of color gradients. Experience with this multiracial system does not prepare immigrants to the United States for the stark Black–White racial divide that characterizes U.S. society. Beyond their diversity in color and language, immigrants from Central and South America are differentiated by social class distinctions, religious differences, urban or rural upbringings, and dialects. Some of them may come from indigenous populations, especially in Guatemala and Belize. If so, their social identity would be separate from any national allegiance.
In short, social relations among Central and South Americans, who collectively number nearly 7 million people, defy generalization. The same can be said about their relations with other Latinos and with non-Latinos. Central and South Americans do not form, nor should they be expected to form, a cohesive group. Nor do they easily form coalitions with Cuban Americans, Mexican Americans, or Puerto Ricans.

FIG 10-8



Jews constitute about 2 percent of the population of the United States. They play a prominent role in the worldwide Jewish community, because the United States has the world’s largest concentration of Jews. Like the Japanese, many Jewish immigrants came to this country and became white-collar professionals in spite of prejudice and discrimination.

Anti-Semitism —that is, anti-Jewish prejudice—has often been vicious in the United States, although rarely so widespread and never so formalized as in Europe. In many cases, Jews have been used as scapegoats for other people’s failures. Not surprisingly, Jews have not achieved equality in the United States. Despite high levels of education and professional training, they are still conspicuously absent from the top management of large corporations (except for the few firms founded by Jews). Nonetheless, a national survey in 2009 showed that one out of four people in the United States blames “the Jews” for the financial crisis. In addition, private social clubs and fraternal groups frequently continue to limit membership to Gentiles (non-Jews), a practice upheld by the Supreme Court in the 1964 case Bell v. Maryland (Malhotra and Margalit 2009).
The Anti-Defamation League (ADL) of B’nai B’rith coordinates an annual tally of reported anti-Semitic incidents. Although the number has fluctuated, in 2009 the tabulation of the total reported incidents of harassment, threats, vandalism, and assaults came to 1,211. Some incidents were inspired and carried out by neo-Nazi skinheads—groups of young people who champion racist and anti-Semitic ideologies. Such threatening behavior only intensifies the fears of many Jewish Americans, who remember the Holocaust—the extermination of 6 million Jews by the Nazi Third Reich during World War II (Anti-Defamation League 2010).
As is true for other minorities discussed in this article, Jewish Americans face the choice of maintaining ties to their long religious and cultural heritage or becoming as indistinguishable as possible from Gentiles. Many Jews have tended to assimilate, as is evident from the rise in marriages between Jews and Christians. In marriages that occurred in the 1970s, more than 70 percent of Jews married Jews or people who converted to Judaism. In marriages since 1996, that proportion has dropped to 53 percent. This trend means that today, American Jews are almost as likely to marry a Gentile as a Jew. For many, religion is a nonissue—neither parent practices religious rituals. Two-thirds of the children of these Jewish–Gentile marriages are not raised as Jews. Finally, in 2005, two-thirds of Jews felt that the biggest threat to Jewish life was anti-Semitism; only one-third named intermarriage as the biggest threat (American Jewish Committee 2005; Sanua 2007).


A significant segment of the population of the United States is made up of White ethnics whose ancestors arrived from Europe within the past century. The nation’s White ethnic population includes about 49 million people who claim at least partial German ancestry, 36 million Irish Americans, 17 million Italian Americans, and 10 million Polish Americans, as well as immigrants from other European nations. Some of these people continue to live in close-knit ethnic neighborhoods, whereas others have largely assimilated and left the “old ways” behind.
Many White ethnics today identify only sporadically with their heritage. Symbolic ethnicity refers to an emphasis on concerns such as ethnic food or political issues rather than on deeper ties to one’s ethnic heritage. It is reflected in the occasional family trip to an ethnic bakery, the celebration of a ceremonial event such as St. Joseph’s Day among Italian Americans, or concern about the future of Northern Ireland among Irish Americans. Such practices are another example of the social construction of race and ethnicity. Except in cases in which new immigration reinforces old traditions, symbolic ethnicity tends to decline with each passing generation (Alba 1990; Winter 2008).
Although the White ethnic identity may be a point of pride to those who share it, they do not necessarily celebrate it at the expense of disadvantaged minorities. It is all too easy to assume that race relations are a zero-sum game in which one group gains at the expense of the other. Rather, the histories of several White ethnic groups, such as the Irish and the Italians, show that once marginalized people can rise to positions of prestige and influence (Alba 2009).
That is not to say that White ethnics and racial minorities have not been antagonistic toward one another because of economic competition—an interpretation that agrees with the conflict approach to sociology. As Blacks, Latinos, and Native Americans emerge from the lower class, they must compete with working-class Whites for jobs, housing, and educational opportunities. In times of high unemployment or inflation, any such competition can easily generate intense intergroup conflict.
In many respects, the plight of White ethnics raises the same basic issues as that of other subordinate people in the United States. How ethnic can people be—how much can they deviate from an essentially White, Anglo-Saxon, Protestant norm—before society punishes them for their willingness to be different? Our society does seem to reward people for assimilating, yet as we have seen, assimilation is no easy process. In the years to come, more and more people will face the challenge of fitting in, not only in the United States but around the world, as the flow of immigrants from one country to another continues to increase. In the Social Policy section, we focus on global immigration and its implications for the future.