Burial of the dead after the massacre of Wounded Knee, 1891. (Photo: Northwestern Photo Co.)

This paper, written under the title, “U.S. Settler-Colonialism and Genocide Policies,” was delivered at the Organization of American Historians 2015 Annual Meeting in St. Louis, MO on April 18, 2015.

US policies and actions related to Indigenous peoples, though often termed “racist” or “discriminatory,” are rarely depicted as what they are: classic cases of imperialism and a particular form of colonialism — settler colonialism. As anthropologist Patrick Wolfe writes, “The question of genocide is never far from discussions of settler colonialism. Land is life — or, at least, land is necessary for life.”[1] The history of the United States is a history of settler colonialism.

The extension of the United States from sea to shining sea was the intention and design of the country’s founders. “Free” land was the magnet that attracted European settlers. After the war for independence but preceding the writing of the US Constitution, the Continental Congress produced the Northwest Ordinance. This was the first law of the incipient republic, revealing the motive for those desiring independence. It was the blueprint for gobbling up the British-protected Indian Territory (“Ohio Country”) on the other side of the Appalachians and Alleghenies. Britain had made settlement there illegal with the Proclamation of 1763.

In 1801, President Jefferson aptly described the new settler state’s intentions for horizontal and vertical continental expansion, stating: “However our present interests may restrain us within our own limits, it is impossible not to look forward to distant times, when our rapid multiplication will expand itself beyond those limits and cover the whole northern, if not the southern continent, with a people speaking the same language, governed in similar form by similar laws.” This vision of manifest destiny found form a few years later in the Monroe Doctrine, signaling the intention of annexing or dominating former Spanish colonial territories in the Americas and the Pacific, which would be put into practice during the rest of the century.

The form of colonialism that the Indigenous peoples of North America have experienced was modern from the beginning: the expansion of European corporations, backed by government armies, into foreign areas, with subsequent expropriation of lands and resources. Settler colonialism requires a genocidal policy. Native nations and communities, while struggling to maintain fundamental values and collectivity, have from the beginning resisted modern colonialism using both defensive and offensive techniques, including the modern forms of armed resistance of national liberation movements and what now is called terrorism. In every instance they have fought and continue to fight for survival as peoples.

The objective of US colonialist authorities was to terminate their existence as peoples — not as random individuals. This is the very definition of modern genocide as contrasted with premodern instances of extreme violence that did not have the goal of extinction. The United States as a socioeconomic and political entity is a result of this centuries-long and ongoing colonial process. Modern Indigenous nations and communities are societies formed by their resistance to colonialism, through which they have carried their practices and histories. It is breathtaking, but no miracle, that they have survived as peoples.

Settler colonialism requires violence or the threat of violence to attain its goals, and that violence forms the foundation of the United States’ system. People do not hand over their land, resources, children, and futures without a fight, and that fight is met with violence. In employing the force necessary to accomplish its expansionist goals, a colonizing regime institutionalizes violence. The notion that settler-Indigenous conflict is an inevitable product of cultural differences and misunderstandings, or that violence was committed equally by the colonized and the colonizer, blurs the nature of the historical processes. Euro-American colonialism, an aspect of capitalist economic globalization, had from its beginnings a genocidal tendency.

So, what constitutes genocide? My colleague on the panel, Gary Clayton Anderson, in his recent book, “Ethnic Cleansing and the Indian,” argues: “Genocide will never become a widely accepted characterization for what happened in North America, because large numbers of Indians survived and because policies of mass murder on a scale similar to events in central Europe, Cambodia, or Rwanda were never implemented.”[2] There are fatal errors in this assessment.

The term “genocide” was coined following the Shoah, or Holocaust, and its prohibition was enshrined in the UN Convention on the Prevention and Punishment of the Crime of Genocide, adopted in 1948 and entered into force in 1951. The convention is not retroactive, but it has been applicable to US-Indigenous relations since 1988, when the US Senate ratified it. The Genocide Convention is also an essential tool for historical analysis of the effects of colonialism in any era, and particularly in US history.

In the convention, any one of five acts is considered genocide if “committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group”:

(a) killing members of the group; (b) causing serious bodily or mental harm to members of the group; (c) deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part; (d) imposing measures intended to prevent births within the group; (e) forcibly transferring children of the group to another group.[3]

The following acts are punishable:

(a) Genocide; (b) Conspiracy to commit genocide; (c) Direct and public incitement to commit genocide; (d) Attempt to commit genocide; (e) Complicity in genocide.

The term “genocide” is often used incorrectly, as in Dr. Anderson’s assessment, to describe extreme examples of mass murder, the death of vast numbers of people, as, for instance, in Cambodia. What took place in Cambodia was horrific, but it does not fall under the terms of the Genocide Convention. The convention refers specifically to a national, ethnical, racial, or religious group, whose members are targeted by a government or its agents because they belong to that group, or whose underpinnings of existence as a group are attacked, with the intent to destroy the group in whole or in part. The Cambodian government committed crimes against humanity, but not genocide. Genocide is not simply an act worse than any other; it is a specific kind of act. The term “ethnic cleansing,” by contrast, was created by humanitarian interventionists to describe what was said to be happening in the 1990s wars among the republics of Yugoslavia. It is a descriptive term, not a term of international humanitarian law.

Although the Holocaust was clearly the most extreme of all genocides, the bar set by the Nazis is not the bar required for a crime to be considered genocide. The full title of the convention, the “Convention on the Prevention and Punishment of the Crime of Genocide,” makes clear that the law is about preventing genocide by identifying the elements of government policy, rather than only punishing after the fact. Most importantly, genocide does not have to be complete to be considered genocide.

US history, as well as inherited Indigenous trauma, cannot be understood without dealing with the genocide that the United States committed against Indigenous peoples. From the colonial period through the founding of the United States and continuing in the twentieth century, this has entailed torture, terror, sexual abuse, massacres, systematic military occupations, removals of Indigenous peoples from their ancestral territories, forced removal of Native American children to military-like boarding schools, allotment, and a policy of termination.

Within the logic of settler colonialism, genocide was the inherent overall policy of the United States from its founding, but specific, documented policies of genocide on the part of US administrations can also be identified in at least four distinct periods: the Jacksonian era of forced removal; the California gold rush in Northern California; the Civil War and post-Civil War era of the so-called Indian Wars in the Southwest and the Great Plains; and the termination period of the 1950s. Overlapping these is the period of compulsory boarding schools, from the 1870s to the 1960s. The Carlisle boarding school, founded by US Army officer Richard Henry Pratt in 1879, became a model for others established by the Bureau of Indian Affairs (BIA). Pratt said in a speech in 1892, “A great general has said that the only good Indian is a dead one. In a sense, I agree with the sentiment, but only in this: that all the Indian there is in the race should be dead. Kill the Indian in him and save the man.”

Cases of genocide carried out as policy may be found in historical documents as well as in the oral histories of Indigenous communities. An example from 1873 is typical, with General William T. Sherman writing, “We must act with vindictive earnestness against the Sioux, even to their extermination, men, women and children . . . during an assault, the soldiers can not pause to distinguish between male and female, or even discriminate as to age.”[4]



The so-called “Indian Wars” technically ended around 1880, although the Wounded Knee massacre occurred a decade later. Clearly an act with genocidal intent, it is still officially considered a “battle” in the annals of US military genealogy. Congressional Medals of Honor were bestowed on twenty of the soldiers involved. A monument was built at Fort Riley, Kansas, to honor the soldiers killed by friendly fire. A battle streamer was created to honor the event and added to the other streamers displayed at the Pentagon, West Point, and army bases throughout the world. L. Frank Baum, a Dakota Territory settler later famous for writing The Wonderful Wizard of Oz, edited the Aberdeen Saturday Pioneer at the time. Five days after the sickening event at Wounded Knee, on January 3, 1891, he wrote: “The Pioneer has before declared that our only safety depends upon the total extermination of the Indians. Having wronged them for centuries we had better, in order to protect our civilization, follow it up by one more wrong and wipe these untamed and untamable creatures from the face of the earth.”

Whether one dates the end of the Indian Wars at 1880 or 1890, most of the collective land base that Native nations had secured through hard-fought treaties with the United States was lost after that point.

After the end of the Indian Wars came allotment, another policy of genocide against Native nations as nations, as peoples: the dissolution of the group. Taking the Sioux Nation as an example: even before the Dawes Allotment Act of 1887 was implemented, and with the Black Hills already illegally confiscated by the federal government, a government commission arrived in Sioux territory from Washington, DC, in 1888 with a proposal to reduce the Sioux Nation to six small reservations, a scheme that would leave nine million acres open for Euro-American settlement. The commission found it impossible to obtain the signatures of three-fourths of the nation, as required under the 1868 treaty, and so returned to Washington with a recommendation that the government ignore the treaty and take the land without Sioux consent. The only means to accomplish that goal was legislation, Congress having relieved the government of the obligation to negotiate a treaty. Congress commissioned General George Crook to head a delegation to try again, this time with an offer of $1.50 per acre. In a series of manipulations and dealings with leaders whose people were now starving, the commission garnered the needed signatures. The great Sioux Nation was broken into small islands soon surrounded on all sides by European immigrants, with much of the reservation land a checkerboard of settlers on allotments or leased land.[5] Creating these isolated reservations broke the historical relationships between clans and communities of the Sioux Nation and opened areas where Europeans settled. It also allowed the Bureau of Indian Affairs to exercise tighter control, buttressed by the bureau’s boarding school system. The Sun Dance, the annual ceremony that had brought the Sioux together and reinforced national unity, was outlawed, along with other religious ceremonies.
Despite the Sioux people’s weak position under late-nineteenth-century colonial domination, they managed to begin building a modest cattle-ranching business to replace their former bison-hunting economy. In 1903, the US Supreme Court ruled, in Lone Wolf v. Hitchcock, that a March 3, 1871, appropriations rider was constitutional and that Congress had “plenary” power to manage Indian property. The Office of Indian Affairs could thus dispose of Indian lands and resources regardless of the terms of previous treaty provisions. Legislation followed that opened the reservations to settlement through leasing and even sale of allotments taken out of trust. Nearly all prime grazing lands came to be occupied by non-Indian ranchers by the 1920s.

By the time of the New Deal–Collier era and the nullification of Indian land allotment under the Indian Reorganization Act, non-Indians outnumbered Indians on the Sioux reservations three to one. However, the “tribal governments” imposed in the wake of the Indian Reorganization Act proved particularly harmful and divisive for the Sioux.[6] Concerning this measure, the late Mathew King, elder traditional historian of the Oglala Sioux (Pine Ridge), observed: “The Bureau of Indian Affairs drew up the constitution and by-laws of this organization with the Indian Reorganization Act of 1934. This was the introduction of home rule. . . . The traditional people still hang on to their Treaty, for we are a sovereign nation. We have our own government.”[7] “Home rule,” or neocolonialism, proved a short-lived policy, however, for in the early 1950s the United States developed its termination policy, with legislation ordering the gradual eradication of every reservation and even of the tribal governments.[8] At the time of termination and relocation, per capita annual income on the Sioux reservations stood at $355, while that in nearby South Dakota towns was $2,500. Despite these circumstances, in pursuing its termination policy, the Bureau of Indian Affairs advocated the reduction of services and introduced its program to relocate Indians to urban industrial centers, with a high percentage of Sioux moving to San Francisco and Denver in search of jobs.[9]



The situations of other Indigenous Nations were similar.

Pawnee Attorney Walter R. Echo-Hawk writes:

In 1881, Indian landholdings in the United States had plummeted to 156 million acres. By 1934, only about 50 million acres remained (an area the size of Idaho and Washington) as a result of the General Allotment Act of 1887. During World War II, the government took 500,000 more acres for military use. Over one hundred tribes, bands, and Rancherias relinquished their lands under various acts of Congress during the termination era of the 1950s. By 1955, the indigenous land base had shrunk to just 2.3 percent of its [size at the end of the Indian wars].[10]



According to the current consensus among historians, the wholesale transfer of land from Indigenous to Euro-American hands that occurred in the Americas after 1492 was due less to British and US American invasion, warfare, refugee conditions, and genocidal policies in North America than to the bacteria that the invaders unwittingly brought with them. Historian Colin Calloway is among the proponents of this theory, writing, “Epidemic diseases would have caused massive depopulation in the Americas whether brought by European invaders or brought home by Native American traders.”[11] Such an absolutist assertion renders any other fate for the Indigenous peoples improbable. This is what anthropologist Michael Wilcox has dubbed “the terminal narrative.” Professor Calloway is a careful and widely respected historian of Indigenous North America, but his conclusion articulates a default assumption. The thinking behind the assumption is both ahistorical and illogical, in that Europe itself lost a third to one-half of its population to infectious disease during medieval pandemics. The principal reason the consensus view is wrong and ahistorical is that it erases the effects of settler colonialism, with its antecedents in the Spanish “Reconquest” and the English conquest of Scotland, Ireland, and Wales. By the time Spain, Portugal, and Britain arrived to colonize the Americas, their methods of eradicating peoples or forcing them into dependency and servitude were ingrained, streamlined, and effective.

Whatever disagreement may exist about the size of precolonial Indigenous populations, no one doubts that a rapid demographic decline occurred in the sixteenth and seventeenth centuries, its timing from region to region depending on when conquest and colonization began. Nearly all the population areas of the Americas were reduced by 90 percent following the onset of colonizing projects, decreasing the targeted Indigenous populations of the Americas from one hundred million to ten million. Commonly referred to as the most extreme demographic disaster in human history, and framed as natural, it was rarely called genocide until the rise of Indigenous movements in the mid-twentieth century forged new questions.

US scholar Benjamin Keen acknowledges that historians “accept uncritically a fatalistic ‘epidemic plus lack of acquired immunity’ explanation for the shrinkage of Indian populations, without sufficient attention to the socioeconomic factors . . . which predisposed the natives to succumb to even slight infections.”[12] Other scholars agree. Geographer William M. Denevan, while not ignoring the existence of widespread epidemic diseases, has emphasized the role of warfare, which reinforced the lethal impact of disease. There were military engagements directly between European and Indigenous nations, but many more saw European powers pitting one Indigenous nation against another or factions within nations, with European allies aiding one or both sides, as was the case in the colonization of the peoples of Ireland, Africa and Asia, and was also a factor in the Holocaust. Other killers cited by Denevan are overwork in mines, frequent outright butchery, malnutrition and starvation resulting from the breakdown of Indigenous trade networks, subsistence food production and loss of land, loss of will to live or reproduce (and thus suicide, abortion, and infanticide), and deportation and enslavement.[13] Anthropologist Henry Dobyns has pointed to the interruption of Indigenous peoples’ trade networks. When colonizing powers seized Indigenous trade routes, the ensuing acute shortages, including food products, weakened populations and forced them into dependency on the colonizers, with European manufactured goods replacing Indigenous ones. Dobyns has estimated that all Indigenous groups suffered serious food shortages one year in four. In these circumstances, the introduction and promotion of alcohol proved addictive and deadly, adding to the breakdown of social order and responsibility.[14] These realities render the myth of “lack of immunity,” including to alcohol, pernicious.

Historian Woodrow Wilson Borah focused on the broader arena of European colonization, which also brought severely reduced populations in the Pacific Islands, Australia, Western Central America, and West Africa.[15] Sherburne Cook — associated with Borah in the revisionist Berkeley School, as it was called — studied the attempted destruction of the California Indians. Cook estimated 2,245 deaths among peoples in Northern California — the Wintu, Maidu, Miwok, Omo, Wappo, and Yokuts nations — in late eighteenth-century armed conflicts with the Spanish, while some 5,000 died from disease and another 4,000 were relocated to missions. Among the same peoples in the second half of the nineteenth century, US armed forces killed 4,000, and disease killed another 6,000. Between 1852 and 1867, US citizens kidnapped 4,000 Indian children from these groups in California. Disruption of Indigenous social structures under these conditions and dire economic necessity forced many of the women into prostitution in goldfield camps, further wrecking what vestiges of family life remained in these matriarchal societies.

Historians and others who deny genocide emphasize population attrition by disease, which weakened Indigenous peoples’ ability to resist. In doing so they refuse to accept that the colonization of the Americas was genocidal by plan, not simply the tragic fate of populations lacking immunity to disease. If disease could have done the job, it is not clear why the United States found it necessary to wage unrelenting wars against Indigenous communities in order to gain every inch of land it took from them — nearly three hundred years of eliminationist warfare, counting the prior period of British colonization.

In the case of the Jewish Holocaust, no one denies that more Jews died of starvation, overwork, and disease under Nazi incarceration than were killed in gas chambers or murdered by other means, yet the acts of creating and maintaining the conditions that led to those deaths clearly constitute genocide. And no one recites for Jews the kind of terminal narrative attached to Native Americans, or to Armenians or Bosnians.

Not all of the acts enumerated in the Genocide Convention are required to exist to constitute genocide; any one of them suffices. In the case of United States genocidal policies and actions, each of the five can be documented.

First, Killing members of the group: The Genocide Convention does not specify that large numbers of people must be killed in order to constitute genocide, rather that members of the group are killed because they are members of the group. In assessing a situation with a view to preventing genocide, this kind of killing is a marker for intervention.

Second, Causing serious bodily or mental harm to members of the group: This includes starvation, that is, controlling the food supply and withholding food as punishment or as a reward for compliance, for instance, in signing confiscatory treaties. As military historian John Grenier points out in The First Way of War:

For the first 200 years of our military heritage, then, Americans depended on arts of war that contemporary professional soldiers supposedly abhorred: razing and destroying enemy villages and fields; killing enemy women and children; raiding settlements for captives; intimidating and brutalizing enemy noncombatants; and assassinating enemy leaders. . . . In the frontier wars between 1607 and 1814, Americans forged two elements — unlimited war and irregular war — into their first way of war.[16]



Grenier argues that this way of war not only continued throughout the nineteenth century in wars against the Indigenous nations but also persisted into the twentieth century and continues today in counterinsurgent wars against peoples in Latin America, the Caribbean and Pacific, Southeast Asia, Middle and Western Asia, and Africa.

Third, Deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part: The forced removal of all the Indigenous nations east of the Mississippi to Indian Territory during the Jackson administration was a calculated policy intended to destroy those peoples’ ties to their original lands, as was declaring that Native people who did not remove were no longer Muskogee, Sauk, Kickapoo, or Choctaw, thereby destroying the existence of up to half of each removed nation. Mandatory boarding schools, allotment, and termination, all official government policies, also fall under this category of the crime of genocide. The forced removal and four-year incarceration of the Navajo people resulted in the death of half their population.

Fourth, Imposing measures intended to prevent births within the group: Famously, during the termination era, the US government-administered Indian Health Service made the sterilization of Indigenous women its top medical priority. In 1974, an independent study by one of the few Native American physicians, Dr. Connie Pinkerton-Uri, Choctaw/Cherokee, found that one in four Native women had been sterilized without her consent. Pinkerton-Uri’s research indicated that the Indian Health Service had “singled out full-blooded Indian women for sterilization procedures.” The Indian Health Service at first denied the charge, but two years later a study by the US General Accounting Office found that 4 of the 12 Indian Health Service regions had sterilized 3,406 Native women without their permission between 1973 and 1976. The GAO also found that 36 women under age 21 had been forcibly sterilized during this period, despite a court-ordered moratorium on sterilizations of women younger than 21.

Fifth, Forcibly transferring children of the group to another group: Various governmental entities, mostly municipalities, counties, and states, routinely removed Native children from their families and put them up for adoption. In the Native resistance movements of the 1960s and 1970s, the demand to stop the practice was codified in the Indian Child Welfare Act of 1978. The burden of enforcing the legislation lay with tribal governments, however, and the legislation provided no financial resources for Native governments to establish the infrastructure needed to retrieve children from the adoption industry, in which Indian babies were in high demand. Despite these barriers to enforcement, the worst abuses were curbed over the following three decades. But on June 25, 2013, the US Supreme Court, in a 5-4 ruling drafted by Justice Samuel Alito, interpreted provisions of the Indian Child Welfare Act (ICWA) to hold that a child, widely known as Baby Veronica, did not have to live with her biological Cherokee father. The decision paved the way for Matt and Melanie Capobianco, the adoptive parents, to ask the South Carolina courts to have the child returned to them. The court gutted the purpose and intent of the Indian Child Welfare Act, missing the concept behind the ICWA: the protection of the cultural resource and treasure that Native children are. It is not about protecting so-called traditional or nuclear families; it is about recognizing the prevalence of extended families and culture.[17]



So, why does the Genocide Convention matter? Native nations are still here and still vulnerable to genocidal policy; this is not just history that predates the 1948 Genocide Convention. But the history is important and needs to be widely aired, included in public school texts and public service announcements. The Doctrine of Discovery is still the law of the land. From the mid-fifteenth century to the mid-twentieth century, most of the non-European world was colonized under the Doctrine of Discovery, one of the first principles of international law Christian European monarchies promulgated to legitimize investigating, mapping, and claiming lands belonging to peoples outside Europe. It originated in a papal bull issued in 1455 that permitted the Portuguese monarchy to seize West Africa. Following Columbus’s infamous exploratory voyage in 1492, sponsored by the king and queen of the infant Spanish state, another papal bull extended similar permission to Spain. Disputes between the Portuguese and Spanish monarchies led to the papal-initiated Treaty of Tordesillas (1494), which, besides dividing the globe equally between the two Iberian empires, clarified that only non-Christian lands fell under the discovery doctrine.[18] This doctrine, on which all European states subsequently relied, thus originated with the arbitrary and unilateral establishment of the Iberian monarchies’ exclusive rights under Christian canon law to colonize foreign peoples, and this right was later seized by other European monarchical colonizing projects. The French Republic used this legalistic instrument for its nineteenth- and twentieth-century settler-colonialist projects, as did the newly independent United States when it continued the colonization of North America begun by the British.

In 1792, not long after the US founding, Secretary of State Thomas Jefferson claimed that the Doctrine of Discovery developed by European states was international law applicable to the new US government as well. In 1823 the US Supreme Court issued its decision in Johnson v. McIntosh. Writing for the majority, Chief Justice John Marshall held that the Doctrine of Discovery had been an established principle of European law and of English law in effect in Britain’s North American colonies and was also the law of the United States. The Court defined the exclusive property rights that a European country acquired by dint of discovery: “Discovery gave title to the government, by whose subjects, or by whose authority, it was made, against all other European governments, which title might be consummated by possession.” Therefore, European and Euro-American “discoverers” had gained real-property rights in the lands of Indigenous peoples by merely planting a flag. Indigenous rights were, in the Court’s words, “in no instance, entirely disregarded; but were necessarily, to a considerable extent, impaired.” The Court further held that Indigenous “rights to complete sovereignty, as independent nations, were necessarily diminished.” Indigenous people could continue to live on the land, but title resided with the discovering power, the United States. Marshall’s Court extended this reasoning in Cherokee Nation v. Georgia (1831), concluding that Native nations were “domestic dependent nations.”

The Doctrine of Discovery is so taken for granted that it is rarely mentioned in historical or legal texts published in the Americas. The UN Permanent Forum on Indigenous Issues, which meets annually for two weeks, devoted its entire 2012 session to the doctrine.[19] But few US citizens are aware of the precarity of the situation of Indigenous peoples in the United States.

Footnotes:

1. Patrick Wolfe, “Settler Colonialism and the Elimination of the Native,” Journal of Genocide Research 8, no. 4 (December 2006): 387.

2. Gary Clayton Anderson, Ethnic Cleansing and the Indian: The Crime That Should Haunt America (Norman: University of Oklahoma Press, 2014), 4.

3. “Convention on the Prevention and Punishment of the Crime of Genocide, Paris, 9 December 1948,” Audiovisual Library of International Law, https://untreaty.un.org/cod/avl/ha/cppcg/cppcg.html (accessed December 6, 2012). See also Josef L. Kunz, “The United Nations Convention on Genocide,” American Journal of International Law 43, no. 4 (October 1949): 738–46.

4. April 17, 1873, quoted in John F. Marszalek, Sherman: A Soldier’s Passion for Order (New York: Free Press, 1992), 379.

5. See Testimony of Pat McLaughlin, Chairman of the Standing Rock Sioux government, Fort Yates, North Dakota (May 8, 1976), at hearings of the American Indian Policy Review Commission, established by Congress in the Act of January 3, 1975.

6. See Kenneth R. Philp, John Collier’s Crusade for Indian Reform, 1920–1954.

7. King quoted in Roxanne Dunbar-Ortiz, The Great Sioux Nation: Sitting in Judgment on America (Lincoln: University of Nebraska Press, 2013), 156.

8. For a lucid discussion of neocolonialism in relation to American Indians and the reservation system, see Joseph Jorgensen, The Sun Dance Religion: Power for the Powerless (Chicago: University of Chicago Press, 1977), 89–146.

9. There is continuous migration from reservations to cities and border towns and back to the reservations, so that half the Indian population at any time is away from the reservation. Generally, however, relocation is not permanent and resembles migratory labor more than permanent relocation. This conclusion is based on my personal observations and on unpublished studies of the Indigenous populations in the San Francisco Bay Area and Los Angeles.

10. Walter R. Echo-Hawk, In the Courts of the Conqueror (Golden, CO: Fulcrum, 2010), 77–78.

11. Colin G. Calloway, review of Julian Granberry, The Americas That Might Have Been: Native American Social Systems through Time (Tuscaloosa: University of Alabama Press, 2005), Ethnohistory 54, no. 1 (Winter 2007), 196.

12. Benjamin Keen, “The White Legend Revisited,” Hispanic American Historical Review 51 (1971): 353.

13. Denevan, “The Pristine Myth,” 4–5.

14. Henry F. Dobyns, Their Number Become Thinned: Native American Population Dynamics in Eastern North America (Knoxville: University of Tennessee Press in cooperation with the Newberry Library, 1983), 2. See also Dobyns, Native American Historical Demography, and Dobyns, “Estimating Aboriginal American Population: An Appraisal of Techniques with a New Hemispheric Estimate,” Current Anthropology 7 (1966), 295–416, and “Reply,” 440–44.

15. Woodrow Wilson Borah, “America as Model: The Demographic Impact of European Expansion upon the Non-European World,” in Actas y Memorias del XXXV Congreso Internacional de Americanistas, México, 1962, 3 vols. (Mexico City: Editorial Libros de México, 1964), 381.

16. John Grenier, The First Way of War: American War Making on the Frontier, 1607–1814 (New York: Cambridge University Press, 2005), 5, 10.

17. https://indiancountrytodaymedianetwork.com/2013/06/25/supreme-court-thwarts-icwa-intent-baby-veronica-case-150103

18. Robert J. Miller, “The International Law of Colonialism: A Comparative Analysis,” in “Symposium of International Law in Indigenous Affairs: The Doctrine of Discovery, the United Nations, and the Organization of American States,” special issue, Lewis and Clark Law Review 15, no. 4 (Winter 2011): 847–922. See also Vine Deloria Jr., Of Utmost Good Faith (San Francisco: Straight Arrow Books, 1971), 6–39; Steven T. Newcomb, Pagans in the Promised Land: Decoding the Doctrine of Christian Discovery (Golden, CO: Fulcrum, 2008).

19. Eleventh Session, United Nations Permanent Forum on Indigenous Issues, https://social.un.org/index/IndigenousPeoples/UNPFIISessions/Eleventh.aspx (accessed October 3, 2013).