Wednesday, March 21, 2007


Guantanamo Bay, a U.S. military base on the island of Cuba, is a detainment camp in a permanent state of exception: while the United States has “complete jurisdiction” over the base, ultimate sovereignty is reserved to Cuba, a country that has no diplomatic relations with the USA and, under the terms of the 1903 agreement, cannot evict the U.S. military without its consent.

According to the U.S. government, conventional American legal norms do not apply to what is technically foreign soil, and enemy combatants can be held incommunicado until the end of an ongoing war that, in the words of Vice President Dick Cheney, may well last for generations. Critics have argued that the need to find a proper balance between security and freedom does not justify violations of humanitarian law.

These very same questions have been debated for centuries. Alexander Hamilton once remarked in Federalist 8 (1787) that “safety from external danger is the most powerful director of national conduct. Even the ardent love of liberty will, after a time, give way to its dictates. The violent destruction of life and property…a state of continual danger, will compel nations the most attached to liberty to resort…to institutions which have a tendency to destroy their civil and political rights.”

Indeed, during WWI and WWII, the U.S. government imprisoned thousands of dissidents and of citizens whose only fault was to belong to the “wrong” ethnic group. McCarthyism and the McCarran Internal Security Act (1950) in some measure revived the witch-hunting spirit of the provisions included in the notorious Alien and Sedition Acts (1798), allowing the detention, expulsion and de-naturalization of persons suspected of engaging in “un-American” activities.

The Bush Administration’s response to the terrorist attacks of September 11, 2001 is the logical continuation of this approach to the issue of national security, as illustrated by the August 2002 memo issued by the Office of Legal Counsel of the Justice Department, which proclaimed that “the President enjoys complete discretion in the exercise of his Commander-in-Chief authority and in conducting operations against hostile forces.”

The war on terror has already caused a number of patent violations of the writ of habeas corpus, one of the fundamental rights defined by the U.S. Constitution, and of other constitutional guarantees such as due process of law, trial by jury, and right to counsel. Under such conditions, the process of moral inversion, whereby what is normally judged to be deplorable becomes not only provisionally acceptable but desirable, is more likely to set in. The controversy surrounding the Guantanamo Bay detainment camp is a case in point.

Guantanamo Bay is one of the selected locations where, according to the current American administration, international obligations should no longer apply on grounds of national security and the protection of sensitive information. “Unlawful combatants”, so it is argued, should be considered as outside the Geneva Conventions on the treatment of prisoners of war.

Numerous allegations of torture and mistreatment have attracted the attention of international human rights organizations, and several cases have been brought before the U.S. Supreme Court. On June 29, 2006, in Hamdan v. Rumsfeld 548 U.S. _ (2006), the Supreme Court ruled that: (a) the ad hoc military commissions established by President Bush on 13 November 2001 with Military Order No. 1, entitled “Detention, Treatment, and Trial of Certain Non-Citizens in the War against Terrorism”, to try the Guantanamo detainees and other non-nationals suspected of having committed acts of terrorism, were not authorized by Congress and were illegal under both military justice law and the four Geneva Conventions of 1949; (b) the detainees are protected by Common Article 3 of the Geneva Conventions, which prohibits “violence to life and person . . . cruel treatment and torture” and “outrages upon personal dignity, in particular humiliating and degrading treatment”; and (c) reiterating the conclusions of Hamdi v. Rumsfeld 542 U.S. 507 (2004) and Rasul v. Bush 542 U.S. 466 (2004), which held that foreign nationals detained outside US borders could file writs of habeas corpus and contest the lawfulness of their detention before a U.S. federal court, that the detainees’ cases can be heard in federal courts.

Some have pointed out that the fact that the June 29 ruling was generally described as “extraordinary”, rather than simply pertinent and opportune, reflects the nonessential role of international human rights and legal standards in current U.S. foreign policy. Indeed, the White House’s reaction to the ruling – “it requires little more than having Congress put its stamp of approval” – shows a dismaying unconcern with the separation of executive, legislative, and judicial powers, one of the major achievements of constitutional democracy.

Slavery and the Constitution

Chattel slavery is the extension of market relations to the human person, treated as marketable property and human commodity (res commerciabilis) or legal tender, outside the domain of mutual obligation.

In 1550, Spanish emperor Charles V called a halt to military operations in the New World, until the status of Native Americans, together with the morality and legality of the Spanish conquest, had been thoroughly debated. A group of theologians and jurists was convoked in Valladolid, to listen to the arguments of Bartolomé de Las Casas and Juan Ginés de Sepúlveda and settle this question once and for all.

The bull “Sublimis Deus”, issued in 1537 by Pope Paul III, had already clarified the Holy See’s official position on the subject. The Pope condemned slavery and the portrayal of Indians as “dumb brutes created for our service”, incapable of exercising self-government, free will, and rational thinking, and therefore of receiving the message of Christ. Las Casas, elaborating on this bull and on the writings of Francisco de Vitoria, a Dominican professor at the prestigious university of Salamanca, decried the barbarity of Spaniards by contrasting it with the meekness, humbleness and good-heartedness of the Indians. Sustained by an unswerving faith in the essential unity of humankind and by his conviction that a commitment to global justice was a moral imperative, he argued that Indians were fully capable of governing themselves and were entitled to certain basic rights, regardless of the nature of their practices and beliefs, which should anyhow be understood from an indigenous point of view.

Sepúlveda, who knew very little of the Spanish colonial subjects, drew on the doctrine of natural law and on pragmatic realism to marshal most of the arguments that would later be deployed by American anti-abolitionists. He explained that, given their innate physical and intellectual inferiority, Indians should be assimilated to Aristotle’s “natural slaves.” Being ruled by passions rather than reason, Indians were born to be slaves. As men ruled over women, and adults over children, so inferior races should be subordinated to the will of superior races. This reasoning allowed for the enslavement of indigenous people and Africans.

Puritan settlers in North America took the same stance as Sepúlveda, and resorted to natural laws to validate the claim that religious conversion could not change the legal status of African slaves or Indians. Their status would instead be anchored to biological, and thus inalterable, attributes, which justified hereditary slavery and perpetual bondage for African-Americans and Native Americans. Because their status was equivalent to that of domestic animals or infants, slaves enjoyed no legal protection: only their commercial value shielded them from the structural violence of a society that regarded them as less than human. Slavery was described as a necessary evil, not the result of greed, callousness, and “human parasitism.”

In 1700, Indians accounted for one fourth of all South Carolina’s slaves and in 1790, when the Naturalization Act was introduced, which restricted American citizenship to “free white persons,” nearly one third of the people living in the American South were black slaves. The paradox of a society fighting for freedom and for pre-social individual rights, and granting juridical personality to corporations, ships and states, while at the same time tolerating that 20 percent of the population was subject to chattel slavery, is epitomized by John Adams’ protest against the English yoke in 1765: “We won’t be their Negroes.”

Even though they may have expressed doubts about the morality of slave ownership, George Washington, James Madison, and Thomas Jefferson profited from it and were evidently prepared to live with it. When southern delegates refused to support a government and a Constitution hostile to their economic and political interests, northern delegates sought all sorts of compromises to avoid conflict. Abolitionist William Lloyd Garrison called the final draft of the Constitution “a covenant with death, and an agreement with hell.” Indeed, consensus was bought at the highest price: the constitutional sanctioning of slavery, an institution irreconcilable with the libertarian spirit of the Constitution, but not with its protection and promotion of private property and enterprise.

Also, the “Fugitive slave clause” of Article IV of the Constitution prevented fugitive slaves who had reached an abolitionist state from being freed. What is more, the southern delegates managed to persuade their northern counterparts to include slaves in the determination of the apportionment of the members of the House of Representatives. Irrespective of their status as property, each slave would be counted as three-fifths of a free man, thus greatly augmenting the political power of the Southern States. As a result, twelve of the first sixteen American presidents were Southern slaveholders although, by 1804, slavery had been substantially abolished in the Northern states, where an inclusive civic nationalism had taken root.

Thus, for all intents and purposes, it may be said that the American Constitution protected the peculiar institution and the Southern caste society until the outbreak of the Civil War. In Dred Scott v. Sandford 60 U.S. (19 How.) 393 (1857), Roger Brooke Taney (1777-1864), Chief Justice from 1836 until his death, a Catholic and a former slave-owner, played a significant role in furthering the sectional interests of those who interpreted the Constitution as precluding the federal government from applying restrictions to the citizens’ right to property. This would include the property of slaves, which was protected by the due process clause of the Fifth Amendment. Echoing Sepúlveda’s arguments, he maintained that, in 1787, blacks were not expected to become citizens and to exercise the associated rights, for they were considered as “a subordinate and inferior class of beings, who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the government might choose to grant them.” This infamous, though formally and historically not incongruous, decision, which foreshadowed a future Supreme Court ruling nullifying the abolition of slavery in the North, strengthened the resolve of abolitionists. Eight years and over 600,000 casualties later, institutional racial slavery and involuntary servitude officially ended with the passage of the Thirteenth Amendment.

Race and the Constitution

Thomistic philosophy and Enlightenment anthropology were premised on the tenet that reason was the distinguishing feature of humanity. Over time, however, a different theoretical orientation emerged, which stressed the importance of physiological constraints to rationality: not all people and not all races possessed intellectual faculties to the same degree; hence, many could not govern themselves and strive for excellence. This approach reaffirmed the continuity of biological explanations of social inequality, justifying a hierarchical conception of racial differentiation based on naturalistic premises and the subjugation and exploitation of non-European peoples for their own good – Rudyard Kipling’s “white man’s burden”.

Thus, the belief in human biological determinism, which was originally confined to the circles of seventeenth-century European aristocrats, was retained and adapted to serve the political aspirations of the middle class. Furthermore, in the United States, the Founding Fathers were particularly fond of Platonic and Aristotelian political theories and, like their Greek precursors, they feared that racial diversity would lead to political fragmentation and, ultimately, to the dissolution of the Republic.

It is therefore unsurprising that the word “miscegenation” was first introduced into legal discourse in the United States, the only country in the world to employ the “one drop of blood” rule and the hypodescent criterion, whereby people are black if they have some degree of African ancestry, irrespective of their phenotype. Nor is it surprising that the 1935 Nazi Nuremberg laws, the 1937 racial laws of Italian fascism, and the 1949 miscegenation statute of South Africa were all inspired by the U.S. miscegenation laws.

“American exceptionalism” began in 1691, when Virginia legally banned interracial marriage for the first time in human history. By 1776, 12 of the 13 original states had followed Virginia’s example. Throughout the nineteenth century, racial segregation became a national phenomenon. In Roberts v. Boston 59 Mass. 198, 5 Cush. 198 (1849), the Massachusetts Supreme Judicial Court ruled that segregation in Boston’s public schools was permissible. Similar legal provisions were enacted in California and in other states which, in the late nineteenth and early twentieth century, barred South and Far Eastern Asians, as well as Native Americans and African Americans, from public schools. Meanwhile, Indiana, Illinois, Iowa, Michigan, Massachusetts, and New Jersey passed black immigrant exclusion laws.

Following Lincoln’s Emancipation Proclamation, the ratification of the 13th, 14th and 15th amendments to the U.S. Constitution, and the Civil Rights Act of 1875, Southern and border states enacted the Jim Crow statutes, which applied to African Americans, Asian Americans, and mixed-race individuals, prohibited interracial marriages, and led to the segregation of most public places, including railroad passenger seating, and residential districts. These statutes were upheld by the Civil Rights Cases 109 U.S. 3 (1883), which also declared the Civil Rights Act unconstitutional, for its provisions could not apply to private acts, and by Plessy v. Ferguson 163 U.S. 537 (1896), a 7-1 decision which put forth the “separate but equal” doctrine, to the effect that racial separation could be deemed constitutional, provided that facilities for each race were equal. By 1910, the South had established a virtual caste system prescribing separate telephone booths, elevators, Bibles, and water coolers for blacks and whites.

In those same years, fear of immigrants from Ireland, Southern and Eastern Europe, and from Asia reached a high point. These immigrants were believed to pose a genetic threat to the vitality of the nation. It was argued that, due to a menacing differential birth rate, their blood might pollute the American gene pool and their lax morality might deteriorate white Protestant virtues. This called for immigration restrictions to halt the flow of unfit immigrants, as well as enforced institutionalization and compulsory state sterilization to prevent contamination from those already in the United States.

The 1920s and 1930s were the heyday of sterilization programs, mental testing, marriage restrictions, racial segregation, and discriminatory immigration quotas, which were allegedly needed to pre-determine the ingredients of the American melting pot. The anti-miscegenation Racial Integrity Act, based on the dubious claims of racial science, was passed in Virginia in 1924, the same year that the severely restrictive and racially selective Johnson-Reed Immigration Act was promulgated by Congress, following the Chinese Exclusion Act of 1882 and the 1917 Immigration Act, which excluded “all idiots, imbeciles, feeble-minded persons, epileptics, insane persons…professional beggars, vagrants…polygamists, and anarchists.”

Racial laws went largely unquestioned until the Fifties and Sixties, a period of economic expansion and relative prosperity. Then, Brown v. Board of Education 347 U.S. 483 (1954) held that segregated public education denied equal protection. But this decision could not abolish “de facto” segregation: in 1965, only two per cent of blacks attended desegregated schools. With the 1964 Civil Rights Act all public facilities and public education were desegregated, and discrimination in employment on the grounds of race, color, sex, or national origin was made illegal. The Voting Rights Act of 1965 invalidated racially discriminatory measures that, prior to WWI, had restricted voting to less than four percent of Black citizens, with only 150,000 blacks being allowed to vote in 1940. Finally, in Loving v. Virginia 388 U.S. 1 (1967), the concept of “racial purity” was removed from the American legal discourse and, in 1968, the Civil Rights Act ended “de jure” housing segregation.

This necessarily cursory examination of the racial question in the United States shows the extent of the devaluation of constitutionally sanctioned individual liberties for the members of ethnic and racial minorities, “the un-American others” who, from the perspective of the ruling majority, did not appear to act like truly independent moral agents, and therefore were in no position to fully exercise those rights. It is a bitter irony that a nation founded on the centrality of civil liberties has been historically so fearful of certain categories of individuals and their freedoms. In many ways, this is also witnessed by the startling imprisonment rates in the United States, unparalleled in Western countries. Rising incarceration reflects a zero-tolerance trend in law enforcement and sentencing that stands in stark contrast with the corresponding declining rates in Europe.

Over the past thirty years, the number of people incarcerated in the USA increased fourfold. As a result, despite falling crime rates, the current U.S. imprisonment rate is the second highest in the world after post-genocide Rwanda, and seven million American citizens are now in jail, on probation or on parole, that is, 1 in 32 American adults. Furthermore, over four million detainees or former detainees are denied the right to vote. Racial disparities are flagrant. In the 1940s, 77 percent of all detainees were white, and 22 per cent were black. Nowadays, 60 percent of state and federal inmates are black or Hispanic and about one in 13 blacks in the 25-29 age group is in jail. One third of black males born today are likely to serve time in prison during their lifetime. This “great confinement”, namely the over-representation of blacks behind bars in a state of mass imprisonment, has been explained as the result of the stronger emphasis that is placed in American culture on individual responsibility in deviant behavior – the rhetoric of “crime as a moral choice” –, and as a means by which the justice system manages ostensibly “unruly” human groups, neutralizing them by physical removal. In this sense, mass incarceration of racial and ethnic minorities as social cleansing by penal means falls along the same logical continuum from slavery, to segregation, to urban ghettoes, one premised on the belief in the ultimate unassimilability of most African Americans into American society.

Civil Rights

Democracies enable people to best express their differences, dissension and protest, through a system of political checks and balances, and a basic commitment to freedom and justice for all. This is why the struggle for democracy is at the same time a fight for the extension of civil rights.
Civil rights, as part of the social contract, protect the entitlements and personal liberties of those who fall within the legal jurisdiction of a country, regardless of ethnic, sexual, social, religious or other attributes unrelated to individual capacity. They limit the action of the State within the private and public spheres but can only be secured by the laws enacted by the sovereign state itself.
There are more civil rights than fundamental rights because the State and the judicial body overseeing the correct implementation of the principles contained in a country’s constitution have the power to expand the purview of fundamental rights, sometimes in response to the mobilization of civil rights movements, which seek to sensitize the public to existing discrimination and misapplications of the law.
Thus, for instance, today all American citizens are endowed with the right to vote, which was not granted by either the original Constitution or the Bill of Rights. Also, freedom from racial, gender, or physical discrimination was not contemplated by the Founding Fathers. The legal protection of these civil liberties was sanctioned by the Thirteenth (1865), Fourteenth (1868), and Fifteenth (1870) “civil war” amendments, adopted during the Reconstruction period. They abolished slavery, guaranteed the due process of law and equal protection of the laws, and forbade racial discrimination as far as voting rights were concerned.
However, one must also bear in mind that the cultural climate of a country exerts a powerful pressure on the judiciary and on the legislative power. In the United States, as elsewhere, the doctrine, advocated among others by Scottish Enlightenment philosophers, that human beings are driven by their instincts and self-interest, apparently dissipated most of the lingering doubts about the contradictions of the modernity project, which failed to deliver on its promise of bounty and unrestricted opportunities. At the same time, a rather cynical distrust in human nature infiltrated the debate on the Constitution, because it was generally presumed that the masses were ignorant, selfish, and did not possess the required natural virtues. James Madison’s illuminating passage in Federalist No. 10 (1787), which encapsulates the inherent tension between liberty and equality and the fundamental class bias of the American Constitution, clarified that the first object of government would be “the protection of different and unequal faculties of acquiring property” and the securing of this property.
It followed that in the newly independent states only white male property owners, namely those who could be trusted as basically virtuous, were allowed to cast their vote and make their voice heard on crucial issues. Another logical outcome was that social and economic rights like those described by F.D. Roosevelt in his State of the Union address of January 11, 1944, would not be pursued with vigor.
It is also noteworthy that Thomas Jefferson and John Adams maintained that the rise of a natural aristocracy of men, an aristocracy of virtue and talent that would select the best political leadership, was hindered by the existence of an artificial aristocracy of men. Yet this line of reasoning could also be used to sanction an elitist understanding of human nature and human affairs. An entrenched sense of pre-eminence would translate into a propensity to turn a blind eye to social, occupational and economic disparities which undermined those very same fundamental rights that the Founding Fathers themselves so forcefully advocated.
Accordingly, if one interprets the term “equality” rather narrowly, as equality before the law, and regards the principles and laws of market economy as the guiding hand of social and economic growth, the resulting social contract will define the promotion of equal opportunities and the reality of wide disparities as compatible. As German-British sociologist Ralf Dahrendorf discerningly put it, “all men are equal before the law but they are no longer equal after it”. One of the long-term consequences was the early twentieth-century spate of mental testing, which classified people according to ostensibly objective means of assessment and dramatically narrowed the educational and employment opportunities of scores of new American citizens.
The notable omission of political equality and social justice in the drafting of the Constitution cannot be explained simply in terms of class relations. The notion of cosmic retribution was another factor at play. It was held that a correlation existed between virtue and happiness, vice and misery, and that, because there were no insurmountable barriers to hinder upward social mobility, individuals could rise just as far as their skills and determination would enable them to. In other words, wealth was always earned and, with few exceptions, the poor deserved their station in life, owing to chronic, and possibly inheritable, flaws in their character, leading them to moral failing and economic destitution. Thus, following Locke, citizens without property were not as fully entitled to constitutional rights because it was doubtful that they would live up to the moral and rational demands of modern society.
Back in the eighteenth and nineteenth centuries, this rationale urged American citizens to pursue the colonial expansion towards the West and stake claims on Indian land, for Native Americans were thought to be an inferior race incapable of fulfilling the moral mission of harnessing natural resources. As remarked by Justice Jackson in Northwestern Bands of Shoshone Indians v. United States 324 U.S. 335 (1945), “acquisitiveness, which develops a law of real property, is an accomplishment only of the ‘civilized’.”
But if Indians were inferior to Whites, they could not be invested with the same legal rights. These were indeed the conclusions of the Supreme Court in the landmark cases Johnson v. McIntosh 21 U.S. (8 Wheat.) 543 (1823), Cherokee Nation v. the State of Georgia, 30 U.S. 1 (1831), and Worcester v. Georgia, 31 U.S. 515 (1832), known as the Marshall Trilogy, after John Marshall, at that time chief justice of the Supreme Court, which laid out the fundamental principles for current judicial decisions on disputes between Indian tribes and the federal government. They formalized the imperialist and fictional rule of law behind the Doctrine of Discovery, whereby Native Americans, as “domestic dependent nations,” lost the sovereignty of the land in which they had lived “from time immemorial” and could only exercise a right of occupancy, under the American “protection and pupilage”. Europeans had “discovered” the land, and therefore they owned it.
On the other hand, assimilated Indians, who lived a “civilized life” and paid taxes, were denied citizenship by the Supreme Court ruling in Elk v. Wilkins 112 U.S. 94 (1884), despite the Fourteenth Amendment. Eventually, forty years later, the Indian Citizenship Act of 1924 extended US citizenship to all Native Americans, leaving unresolved the question of whether this provision, which applies to a self-governing people and involves an uncalled-for collective naturalization, is constitutional. The fatal incongruence of the Act, which granted US citizenship to individuals who were also citizens of an Indian nation, derived from the tension between two divergent aspirations: on the one hand the government’s intention to reorganize Indian communal land through allotment and to encourage Indians to embrace the American way of life, and on the other hand the unwillingness of most Indians to be assimilated.
It would be another four decades before, in 1968, the Indian Civil Rights Act extended to individuals under the jurisdiction of Indian tribal governments the protections of the Bill of Rights and imposed some US legal standards on the Indian nations.

Fundamental Rights

Fundamental rights are those rights that are explicitly or implicitly sanctioned by a country’s constitution, are guaranteed to all citizens and most residents of that country and, given their importance, are accorded special protection by the judiciary. Historically, they coincide with the natural rights assigned to every human being and with those rights listed in the US Bill of Rights. They prescribe the limits of intervention and interference of the central government in citizens’ lives and can only be partially and transitorily restricted under a state of exception; for instance, in case of national emergencies like a war, a spate of terrorist attacks, or civil unrest, when security concerns prevail over certain civil liberties and personal entitlements. However, even a partial suspension or infringement of these rights must be strictly scrutinized and approved by the highest judicial body of a country which, in the United States, is the Supreme Court.

These rights include the right to life and to generate life, to marry and to raise a family, the right to vote, to personal privacy, and to expect equality of treatment before the law – the “due process of law” of American Constitutional Law –, together with a series of civil liberties such as freedom of thought, speech and press, assembly and association, petition and religious affiliation and practice. Civil liberties are what the English jurist William Blackstone (1723-1780) described as “the great end of all human society and government…the state in which each individual has the power to pursue his own happiness according to his own views and of his interest, and the dictates of his conscience, unrestrained, except by equal, just, and impartial laws.”

This obviously implies that civil liberties may sharply contrast with the self-interest of the majority, which is required to refrain from arbitrary actions. When a majority of citizens strongly believe that their views and reasons are so logically compelling and morally persuasive that everyone in their right mind should agree on a given course of action, the consequences for civil liberties can be devastating, for the provisions contained in the Constitution are subject to interpretation and amendment. This is precisely what occurred in the early twentieth century, with Prohibition and eugenics, at a time when many grew intensely preoccupied with what they believed to be unmistakable indications of moral failure and biological degeneration. For some time, an intransigent and self-righteous moralism and the conviction that what stood in the way of the modernizing process was the result of ignorance, parochialism, and retrograde outlooks, resonated with the beliefs of a “moral majority.” In order to justify the adoption of measures that infringed basic liberties, some reformers who regarded themselves as modern, progressive and boldly experimental resorted to humanitarian, utilitarian, pragmatic and medical arguments, and appealed to the quest for social perfection, which would take its inception from the internalization of the rules of proper conduct. These sentiments generated a constellation of habits of thought, norms and ideals, a secularized version of the Protestant ethic, emphasizing self-sufficiency, purity, conscientiousness, self-discipline and social planning, which obliterated the separation of the legal and the moral spheres.

This ostensibly progressive civic religion was seen by many as essentially fair and morally unassailable. Resistance to ethical self-scrutiny was particularly strong. Various representatives of the judicial branch became self-appointed guardians of the public morality and urged state governments to intrude in people’s private lives “for their own good”. Wayward citizens, namely those who could not be converted to an acceptable lifestyle, and whose behavior remained unpredictable, were liable to be sterilized or institutionalized. This kind of society, at once ready to embrace an abstract notion of humankind and reluctant to put up with certain categories of human beings, was so insecure, apprehensive, and self-doubting, that it was willing to carry out self-mutilation in order to become risk-free, while refusing to consider the motives of the offenders and “miscreants”.

By 1914, marriage restriction laws targeting “feeble-minded” citizens had been enacted in more than half the states and, by 1917, 15 states had passed sterilization laws. But “only” a few thousand sterilizations had actually been performed, mainly because nearly half of such laws had been struck down on the ground that they violated due process, freedom from cruel and unusual punishment, and the equal protection clause. A second wave of eugenics laws coincided with the Immigration Restriction Act (1924) and Virginia’s Act to Preserve Racial Integrity (1924). In 1924, Virginia also passed a law authorizing the involuntary sterilization of alleged mental defectives. This law was upheld, 8-1, by the Supreme Court in Buck v. Bell 274 U.S. 200 (1927). Justice Oliver Wendell Holmes, who was joined by Louis D. Brandeis and William Howard Taft, influenced by a crude scientific naturalism, a pessimistic anthropology, and a corrosive skepticism, wrote in the now infamous opinion for the Court that “the principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes…Three generations of imbeciles are enough”. As a result of this decision, taken in a country that prided itself on its commitment to individual freedom but chose to substitute scientifically unverifiable notions of social progress for clear constitutional principle, nearly half the U.S. states passed eugenics laws authorizing compulsory and non-voluntary sterilization. Sterilization rates dramatically increased, especially during the Depression, when few families were prepared to put up with the social protection of what was perceived to be a disproportionate number of dependent people, and did not protest against the systematic infringement of their fundamental rights.

The theoretical foundations of constitutional rights were undermined by prominent legal scholars in North America and Europe who favored the notion of “ethical state”, one whose “intrinsic morality” and thorough grasp of the laws of historical and biological necessity made its arbitration and dictates on public matters virtually indisputable. As a source of a morality more in line with the demands of modernity, the state was not necessarily bound by constitutional principles and norms. Radically realist and functionalist jurists argued that personal rights were not inalienable, for they really were culturally and historically relative legal fictions or superstitions, their existence being, to a large extent, contingent on the majority’s willingness to uphold them, that is, on considerations of general welfare and public utility. This Machiavellian interpretation of public law made ethics the handmaid of politics: rights could only be granted by law, and social utility overruled the “untenable notion” of human rights. Therefore virtues, rather than the rights prescribed by the American Constitution, were the defining attribute of citizenship. Enlightened governments were expected to foster virtues and restrict personal rights for the sake of communal rights and civic responsibility. Instead of protecting the citizens, law legitimized the persecution of certain categories of people, supposedly unable to enjoy freedom and to pursue happiness, by gradually stripping them of their rights and legal protections. Such policies were described as politically necessary and ethically indisputable. In a tragic reversal of roles, according to the dominant “discourse of truth” those who violated the physical integrity of other citizens were fulfilling a constitutionally sanctioned civic duty, while the victims of involuntary sterilization and confinement were a social threat and, as such, subject to legally mandated sterilization or confinement “for the good of society.”

Most such laws were only repealed in the late 1960s and 1970s, even though the Supreme Court ruling in Skinner v. Oklahoma 316 U.S. 535 (1942) had defined procreation as “one of the basic civil rights of man” and sterilization as an invasion of fundamental interests which, according to Justice William O. Douglas, “in evil or reckless hands,” could have genocidal consequences.


The logo of the Third International Congress of Eugenics, held in New York in 1932, defined eugenics as ‘the self direction of human evolution’. Negative eugenics was concerned with the elimination of inheritable diseases and malformations and involved prenuptial certificates, birth control, selective abortion, sterilization, castration, immigration restriction and, in Nazi-occupied Europe, involuntary ‘euthanasia’. Positive eugenics would instead encourage the propagation of desirable characteristics via tax incentives for ‘fit parents’, assortative mating and, in the years to come, cloning and germline engineering.

The term ‘eugenics’ was coined in 1883 by Sir Francis Galton (1822–1911), after the Greek εύγενής, meaning ‘wellborn’. Galton mistakenly assumed that all traits are passed down unaffected from our ancestors (‘law of ancestral heredity’) and envisioned eugenics as a naturalistic religion antagonistic to Christianity. An extreme version of this theory, called Ahnenerbe (‘ancestral inheritance’), which described individual life as the epiphenomenon of perpetual bloodlines, was deployed by Heinrich Himmler to justify his plans for a New European Order.

Traces of this erroneous understanding of genealogies in terms of genetic continuity were evident in the writings of Nietzsche, Ernst Haeckel, the most influential German popularizer of evolutionary theory, American biologist Charles Davenport, who held that predispositions to social deviance were inherited from ‘ape-like ancestors’, and British biostatistician R. A. Fisher, who once (1929) remarked that, if King Solomon’s line was not extinct, he was ‘in the ancestry of all of us, and in nearly equal proportions, however unequally his wisdom may be distributed’. A similar combination of Eternal Recurrence – human beings as expressions of the immortal germplasm – and natural teleology of history – biology as destiny – stamped their positions.

Similar convictions informed Cesare Lombroso’s theory of atavism and those of various Social and Racial Darwinists, animal breeders and pedigree researchers. Among them was the American psychologist Henry H. Goddard, who authored a study of hereditary feeble-mindedness, The Kallikak Family (1912). Based on patently manufactured data, it proved so influential on both sides of the Atlantic that the German translation (1914) was reprinted in 1933. Other genealogical studies stressed the linkage between folk hereditarian beliefs about the transmission of patrimonial and biological inheritance and the religious notion of the inheritability of sins, combining to foster notions of evolutionary throwbacks and of populations as bundles of lineages, together with the equation of genealogical perpetuation with social distinction.

Most early eugenicists were raised in deeply religious families and, between 1907 and 1940, eugenics laws were only promulgated in those countries where Puritan, Pietistic and Calvinist denominations were stronger. In fact, religious affiliations and the heterogeneity of Western cultures led to different styles of eugenics. The rediscovery of Mendelian laws, August Weismann’s experiments on inheritability, T. H. Morgan’s gene theory and the publication of Wilhelm Johannsen’s seminal scientific review ‘The Genotype conception of heredity’ (1911), established the new orthodoxy of genetic research. ‘Germplasm’ (reproductive tissues) and ‘somatoplasm’ (non-reproductive tissues) were distinct and separate. In Romance and Far Eastern countries, the prevalent interpretation was that the distinction between soma and germplasm did not imply that between germplasm and environment. Accordingly, medical researchers focused on external, physiochemical mutagenic factors, rather than on differential genetic predispositions and hereditary transmission, which had strong eugenic implications. Given these conflicting allegiances, it was inevitable that the large Italian and French delegations attending the first and second international eugenics congresses (London 1912, New York 1921) would criticize those scholars who posited the existence of biologically distinct human groups with differential disease susceptibility.

The tenor of most of the papers presented at the first International Congress of Social Eugenics, held in Milan in 1924, made it clear that Latin eugenicists had opted for hygienism, social medicine and pro-natalism. They urged greater caution before drawing conclusions from a limited sample of data on inheritance transmission. On that occasion, Russian biologist N. K. Kol’tsov juxtaposed the unnecessary timidity of Italian eugenicists with the hasty determination of American eugenicists.

In 1929, while E. S. Gosney and Paul Popenoe’s Sterilization for Human Betterment, extolling the achievements of Californian eugenicists, was being translated into German and would soon become an important source of inspiration for European race hygienists, Corrado Gini, a world-famous Italian demographer and eugenicist, and Mussolini’s confidant, presided over the second Italian Congress of Genetics and Eugenics in Rome. This international meeting was attended by race hygienists Jon Alfred Mjøen (Norway), Eugen Fischer and Fritz Lenz (Germany), who hoped to persuade Mussolini to become the saviour of the white race’s genetic pool. However, Mussolini’s conception of totalitarian demography privileged quantity over quality, and did not regard the genetic enhancement of the Italian stock as a priority. Gini himself decried the determinism, racialism, and coerciveness of the American and ‘Germanic’ eugenics proposals. Lenz’s dispirited comment in the Archiv für Rassen- und Gesellschaftsbiologie was that whenever the subject of eugenics was brought up at the Italian congress, it was only to stigmatize it.

This rift, aggravated by comparable contrasts between North American and Latin American eugenicists that had emerged at the 1927 Pan-American Eugenics Conference in Havana, led, in 1934, to the creation of the short-lived Latin Federation of Eugenics. Its first and last congress, held in Paris in 1937, restated the strong commitment to ameliorative social reforms, public hygiene, and the Hippocratic/Galenic ethos of compassionate care.

Fragmented and with its credibility increasingly eroded, the international eugenics movement was already on the wane in the late 1920s, so that mainline eugenics gave way to ‘reform eugenics’, family planning and population control, characterized by a greater emphasis on environmental factors, voluntary sterilizations, birth control, the rational management of human resources, and the repudiation of overtly racist language. This tactic made eugenics far more palatable and effective: if the impact of nurture was so important, then children should only be raised in healthy home environments. In order to redress nature’s essential randomness and synchronize biological and socioeconomic processes, irresponsible citizens unable to meet the challenges of modern society could be forced, blackmailed, or cajoled into accepting sterilization or castration. Neo-Malthusianism replaced biological determinism, and the Hardy-Weinberg theorem (1908), which demonstrated that sterilizing or segregating the ‘mentally unfit’ would not appreciably reduce the incidence of ‘feeble-mindedness’, was now deemed irrelevant.
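The Hardy-Weinberg point can be made concrete with a little arithmetic. Under the theorem’s assumptions (random mating, no mutation or migration), a recessive allele at frequency q declines under even total selection against affected homozygotes only to q/(1+q) per generation, hence to q₀/(1+n·q₀) after n generations. A minimal numerical sketch (the starting frequency and the `allele_freq_after` helper are hypothetical illustrations, not from the original text):

```python
# Hardy-Weinberg sketch: why sterilizing carriers of a rare recessive
# trait barely reduces its incidence, even over many generations.

def allele_freq_after(q0, generations):
    """Recessive allele frequency after n generations of complete
    selection against aa homozygotes: q_n = q0 / (1 + n * q0)."""
    return q0 / (1 + generations * q0)

# Hypothetical trait appearing in 1 of 10,000 births (q0 = 0.01).
q0 = 0.01
for n in (0, 10, 50, 100):
    q = allele_freq_after(q0, n)
    # Incidence of affected births per million (q squared).
    print(n, round(q ** 2 * 1e6, 2))
# Even after 100 generations (roughly 2,500 years) of total selection,
# incidence only falls from 100 to 25 per million births.
```

Most copies of a rare recessive allele hide in unaffected heterozygous carriers, which is why the decline is so slow; this is the quantitative argument the ‘reform eugenicists’ chose to dismiss.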

Consequently, by the early 1930s, eugenic sterilization programmes were in full swing. Following the moral panic generated by the Great Depression, few families were prepared to put up with the social protection of what was perceived to be a disproportionate number of dependent people. It was argued that, under exceptional circumstances, basic rights could be withheld and that social services should only be granted to those whose social usefulness and biological capability were certain. The international political climate thus proved very receptive to the arguments of those prominent American, German, and Swedish jurists who agreed that personal and social rights were culturally and historically relative legal fictions or superstitions, their existence depending, to a large extent, on the majority’s willingness to uphold them. Enlightened governments were expected to foster virtues and restrict personal rights for the sake of communal rights and civic responsibility.

This led to the paradoxical result that involuntary sterilizations and confinements were almost exclusively carried out in the most advanced and progressive democracies, the only exception being Nazi Germany. Most such laws would only be repealed in the late 1960s and 1970s. By contrast, in those same years, legislators and jurists in Latin, Eastern European, Far Eastern, and developing countries, as well as in Holland and Britain, objected to selective breeding, involuntary sterilization, the assault on the notion of free will, and the linear extension of natural laws into the social sphere. Eugenics and the marriage between bureaucratic rationality and scientism did not resonate with every local repertoire of values and symbols.

After World War 2, the transnational eugenics movement was forced underground by the popularity of the human rights discourse, and pro-eugenics journals and organizations adopted new names. But eugenics never died out. While some eugenicists espoused the aims of Planned Parenthood and addressed questions of population quantity, celebrated scientists at the 1962 CIBA-sponsored London conference and 1998 UCLA ‘Engineering the human germline’ symposium advocated programmes of involuntary sterilization, human cloning, and genetic enhancement. Meanwhile, Singapore adopted incentives to support assortative mating among college graduates and voluntary sterilization among the poor and, in 1995, the Chinese ‘Maternal and Infant Health Law’ urged premarital certification of a healthy constitution, mostly targeting ethnic minorities and the rural population.

Western societies themselves are on the verge of a eugenics revival in the form of reprogenetics, germline engineering, and cloning, a trend which is indirectly reinforced by the courts’ recognition of wrongful birth and wrongful life claims, by the commodification of healthcare, by the diffusion of testing for genetic predispositions, and by the rhetoric of genetic responsibility, involving new forms of discrimination and exclusion.

Great Depression

The most dramatic economic shock that the world has ever known began on October 24, 1929, also known as “Black Thursday”. After years of large-scale speculation, with millions of investors borrowing money to chase the dream of easy riches and hundreds of banks willing to accept stocks as collateral, stock prices far exceeded the companies’ actual productivity, and the bubble eventually burst. The collapse of the New York stock exchange continued through October 29 (“Black Tuesday”) and, during the following days and weeks, countless investors found themselves suddenly broke, while hundreds of banks failed as their loans went unpaid.

By the early 1940s, the Dow Jones stock prices were still approximately 75 percent below the 1929 peak, a level which was only reached again in 1954, and the number of banks in operation almost halved over the next four years, driving thousands of business owners to the wall as the banks called in loans to stay afloat. Furthermore, because the banking system could no longer supply the necessary liquidity, new business enterprises could not be undertaken and millions of workers lost their jobs with little hope of regaining them in the near future. In 1933 and 1934, one half of the total US workforce was jobless or underemployed. To make things worse, home mortgages and loans had produced a huge amount of consumer debt and, while incomes decreased, debts did not. Predictably, consumer spending declined dramatically: between 1929 and 1933, spending on food and housing fell by more than 40 percent, with crop prices following the same downward path. The crisis of the financial markets had set off a domino effect.

Unfortunately, the American president Herbert Hoover was a steadfast advocate of laissez-faire principles and believed that the “invisible hand” of the market and the moral fiber of the American people would ensure that everything would eventually work out fine. A rugged individualist and a champion of the values of frugality, hard work, plain living and self-reliance, he was persuaded that the economic Depression could be treated like any clinical depression, because the whole problem came down, to a large degree, to “a question of fear”. In keeping with the contemporary tendency to manage economic problems by trade measures, Hoover adopted austerity measures and, on June 17, 1930, signed the Hawley-Smoot Tariff Act, which doubled import duties on selected goods. Other Western countries, already burdened by war debts and reparations dating back to the Versailles Treaty of 1919, which had ended World War 1, reacted by raising their own import tariffs. This provoked a major disruption of world trade, with an overall reduction of about forty percent and widespread unemployment and hardship, especially in those countries whose economies were heavily dependent on international trade. Worse still, the Federal Reserve refused to provide emergency lending to help key banks at least partially recover from their losses.

Internationally, a combination of high external debt, falling export prices, government fiscal difficulties, and internal banking crises spelt disaster for the world economy. Latin American countries, the most dependent on selling raw materials to U.S. industries and therefore the most sensitive to the fluctuations of the American economy, were the first to default on their debts. Bolivia defaulted in January 1931, Peru in March of the same year, Chile in July, and Brazil and Colombia in October. Europe was hit in 1931, when several banking crises translated into foreign-exchange and fiscal crises. Hungary, Yugoslavia, and Greece were forced to default in 1932, followed by Austria and Germany in 1933. Austria’s largest bank, Vienna’s Kreditanstalt, which financed two-thirds of Austrian industrial production and controlled 70 percent of Austrian bank assets, had failed in May 1931, an event which sent shockwaves across Europe. Depositors rushed to take out their money from banks that were perceived to be in weak financial condition and, in so doing, they compromised the stability of the entire banking system of several countries. By mid-June many German banks had collapsed. Major Italian banks were rescued by the fascist regime, which corporatized the national economy and started to drift away from international rules and agreements and to shift towards racist and imperialist policies.

One of the main consequences of this chain reaction was that trust in sovereign loans was shattered. The social and political repercussions were catastrophic. Industrial unemployment in the United States averaged 37.6 percent in 1933, while Germany reached its highest rate at 43.8 percent, the United Kingdom at 22.1, Sweden at 23.7, Denmark at 28.8, and Belgium at 20.4 percent. In Western Canada more than one fifth of the labour force remained unemployed throughout the 1930s, and even the introduction of important measures of welfare assistance did not help boost the economy. Farmers publicly protested against new immigrants, calling them “the scum of Continental Europe”, “men and women who have behind them a thousand years of ignorance, superstition, anarchy, filth and immorality”. Meanwhile, in the United States, the penal system became increasingly punitive. More executions were carried out than in any other decade in American history – an average of 167 per year – and imprisonment rates also rose sharply. In point of fact, crime rates did not significantly rise, but the mass media popularized the idea that the social order was on the verge of collapse, and generated a “crime wave” frenzy among the public.

By the early 1930s, the economic slump had destabilized the international political order, the erosion of liberal values was at an advanced stage, and welfarist cost-benefit analysis had gained appeal and credibility. Prompted by the need to cut down on public spending and the moral panic generated by the Great Depression, several governments of the most advanced democratic countries lost confidence in the effectiveness of social reforms and undertook programmes for the involuntary sterilization of thousands of citizens. It was argued that, under exceptional circumstances, basic rights could be withheld and that, in order to reduce the burden on the public purse, social services should only be granted to those whose social usefulness and biological capability were past doubt. In the Weimar Republic, the hardest hit by the Depression, this ideological shift produced a radicalisation of medical views on racial hygiene and “euthanasia”.

Trade protectionism, nationalism, and fascism were among the most tragic results of the Depression, which had persuaded international observers that conventional economic policies should be reformed and that national economies needed increased state control. Earlier enthusiasm for internationalism, cosmopolitan law, and international institutions completely disappeared, replaced by the feeling that large-scale conflicts between powers were once again inevitable.

Nationalism, imperialism, militarism, social Darwinism, autarchic protectionism, a centralized economy, corporative statism, a meticulous division of labour, large industrial cartels, and the coordination of transnational economic blocks under the leadership of a few hegemonic powers were the central tenets of fascism and grew in opposition to the pre-Depression world order. Impermeable national and ethnic frontiers were described as a defense mechanism against the evils of globalization and would defuse the threat of global market competition, speculative mobile international financial flows of capital, Bolshevik class struggle and intra-national ethnic conflicts. They would also provide individuals with organic identities, rooted in history and nature, and therefore far more solid (and less free) than the contingent and impalpable identities generated by the dialectics of democracy and by mass consumerism.

A comprehensive scheme of socio-demographic, economic and geo-political reconfiguration of the planet, in open violation of international norms and treaties, was envisioned, which would bring about a hierarchical and standardized systematization of the relationships between peoples, countries, and territories on a relativistic and particularist basis. National sovereignty and ethnic specificities would justify all sorts of abuses. Within this perspective, because no common rule of law could exist, international statutes ceased to be applicable from the moment fascist powers decided not to comply with them. This spelled the end of the international order of the League of Nations and of a model of cosmopolitan law founded on the principles of liberal universalism.

In the Far East, during the 1920s, hundreds of villages in the Chinese hinterland had seen their consumption patterns change dramatically as a consequence of the marketing campaigns of transnational corporations, which employed hundreds of thousands of Chinese peasants. However, the progressive internationalization and connectedness of the Chinese economy meant that it became increasingly vulnerable to trade fluctuations and, when the Depression took place, the entire structure of Chinese agricultural production was hit with unprecedented force and the process of pauperization of the countryside population seemed unstoppable. There followed two major consequences: the strengthening of the Communist Party and a major diaspora of Chinese emigrants seeking a better future abroad.

In Japan, a country which was heavily dependent on foreign trade, unemployment soared, labor disputes became more and more frequent and violent and so did anti-Japanese insurgent movements in Korea and Taiwan. Rural debt forced poor tenant farmers to sell their daughters as prostitutes and thousands of small businesses were gradually absorbed by the zaibatsu, that is, huge financial combines which pushed for more authoritarian and imperialistic policies.

On a global scale, the Depression proved far-reaching because of the degree of interdependency of the world economy and because colonial empires acted as relays for the transmission of its effects from the centre to the periphery. But how did governments handle this massive crisis?

In the United States, Hoover’s seeming idleness was interpreted by millions of American voters as callousness, and the Democratic presidential candidate, Franklin D. Roosevelt, who evoked a more interventionist and caring state, won a landslide victory in 1932. His presidency would be forever identified with the New Deal, comprising a series of “Keynesian” relief, recovery and reform measures, in the understanding that “deficits in time of recession help alleviate the downturn.” This program revitalized the economy by reinvigorating mass consumption through deficit spending, and restored psychological confidence and people’s trust in American institutions and in the future by effectively reshaping their expectations.

Sorokin once rightly noticed that calamities tend to expand the scope of governmental regulation of social relationships. Indeed, deficit spending on government-funded public works programmes was successfully used for economic recovery in Social-Democratic Sweden, but also in Nazi Germany, fascist Italy and imperialist Japan. These countries were among the first to overcome the crisis. By contrast, in Britain and France, two countries whose currencies remained pegged to the Gold Standard, mostly for reasons of national pride, a genuine recovery only began when large-scale rearmament was undertaken in reaction to the National Socialist threat. It is noteworthy that those countries which remained on the Gold Standard fared far worse than those which did not.

In the final analysis, the Depression lasted for about a decade and was aggravated by a steadfast and self-defeating loyalty to the Gold Standard, as well as by increased wealth inequality and financial speculation. It was not brought to an end by the concerted effort of fair-minded and judicious leaders committed to the cause of world prosperity and peace, but by a vast military build-up leading straight into World War 2.

The final question we need to ask is whether a Great Depression could happen again. If a crisis of that magnitude were ever to occur, it is unlikely that it would be triggered by the same factors. The Depression had such a lasting impact on the lives of those who experienced it that policy makers and analysts have thoroughly canvassed the subject, so as to define those steps that governments should or should not take to prevent a repeat. These include tolerating higher inflation and lower productivity, promoting greater social justice, placing more emphasis on employment targets than on exchange rates, and protecting farmers.

Juan Ginés de Sepúlveda

Sixteenth-century Spanish humanist theologian. He pursued theological, philosophical and juridical studies in Córdoba, Alcalá de Henares and Bologna, where he developed a keen interest in the philosophy of Aristotle.
Appointed royal chaplain, court historiographer, and tutor of Philip II by the Spanish emperor Charles V in the mid-1530s, he held reactionary views that drew him into numerous disputations, in which he sought to safeguard orthodoxy and stifle ecclesiastic reforms.
Besides attacking Erasmus and Luther, he most famously targeted the progressive and humanitarian views of the Dominican friar Bartolomé de las Casas (1474–1566), the most outspoken advocate of indigenous rights in the Americas. Opposed to the so-called New Laws (1542), which banned slavery and regulated the encomienda, a neo-feudal institution that granted free Indian labour to Spanish landowners, Sepúlveda persuaded the emperor to revoke them. Las Casas, one of the inspirers of the New Laws, immediately sailed back to Spain to repel the assault of those, among the Spanish intelligentsia, who sided with the Conquistadors and justified the killing and oppression of the Indians.
Sepúlveda was one of them. A self-appointed champion of the interests of slavers and landowners, he had authored a treatise entitled “Concerning the Just Cause of the War Against the Indians” (1547) to provide solid philosophical underpinnings for Spanish imperialism and just war theory. In doing so, he trod dangerously close to heresy. His heterodox outlook, tinged with naturalistic paganism and militaristic chauvinism, alienated him from the most significant academic circles of Spain. Even so, thanks to his impressive scholarship and to the support of economic potentates, he retained much of his influence.
These two intellectual giants were thus set on a collision course. In 1550, Charles V called a halt to military operations in the New World, until the status of Native Americans, together with the morality and legality of the Spanish conquest, had been thoroughly debated. A group of theologians and jurists (junta) was convoked in Valladolid, to listen to the arguments of Las Casas and Sepúlveda and settle this question once and for all. This dispute is of paramount importance because it constituted the first major articulate attempt on the part of Europeans to understand and define human variability and cultural diversity, and marked the crucial universalist/racialist bifurcation of anthropological philosophy at the dawn of modernity.
The bull “Sublimis Deus”, issued in 1537 by Pope Paul III, had already clarified the Holy See’s official position on the subject. The Pope condemned slavery and the portrayal of Indians as “dumb brutes created for our service”, incapable of exercising self-government, free will, and rational thinking, and therefore of receiving the message of Christ.
Las Casas, elaborating on this bull and on the writings of Francisco de Vitoria, a Dominican professor at the prestigious university of Salamanca, as well as one of the precursors of international law and human rights theory, decried the barbarity of Spaniards by contrasting it with the meekness, humbleness and good-heartedness of the Indians. Sustained by an unswerving faith in the essential unity of humankind and by his conviction that a commitment to global justice was a moral imperative, he argued that Indians were fully capable of governing themselves and were entitled to certain basic rights, regardless of the nature of their practices and beliefs, which should anyhow be understood from an indigenous point of view.
While Las Casas, who had spent most of his life in the colonies, sided with the poor and disenfranchised, Sepúlveda, who knew very little of the Spanish colonial subjects, drew on the doctrine of natural law and on pragmatic realism to marshal most of the arguments which would later be deployed by anti-abolitionists, segregationists and imperialists. He explained that, for all intents and purposes, given their innate physical and intellectual inferiority, Indians should be assimilated to Aristotle’s “natural slaves.” For Sepúlveda, Christian blood was the only vessel of reason, therefore Indians were naturally impervious to conversion. In consequence of their being ruled by passions rather than reason, Indians were born to be slaves and should actually be grateful that, in spite of their sinfulness, barbarism, licentiousness, and relative indifference to the institute of private property, their new masters acted as God’s instrument of redemption and regeneration. Finally, as men ruled over women, and adults ruled over children, so inferior races should be subordinated to the will of superior races. This line of reasoning clearly allowed for the virtual enslavement of indigenous people, and authorised violent reprisals whenever the Indians refused to accept Spanish rules.
Officially, neither Las Casas nor Sepúlveda won the dispute, but the monarchy made common cause with the Church against the encomenderos, for there was a growing concern that the power of colonial landowners was rising disproportionately, and that their unwillingness to re-invest their considerable revenues was harming the Spanish economy. It is also fair to say that the Crown was motivated by sincere moral qualms.
With the benefit of hindsight, we now know that Sepúlveda’s theses were both modern – as when he implied that the spheres of politics and religion should be kept separate and that law should reflect the reality of actual human relationships (legal realism) – and anachronistic, given that he relied on a notion of natural causation of society and politics that was already obsolete at the time. Consequently, his propositions could not be reconciled with Spanish legal thinking, which had already taken a clear anti-slavery position and consistently refused to sanction the exploitation of American natives under the guise of outmoded and undignified medieval contracts.
Nevertheless, exploitation and abuse continued, in Potosí as in Mexico, because the cold logic of pragmatism and greed prevailed. Only those natives who learned to avail themselves of colonial laws and acted as their own attorneys could successfully fight their exploiters.

Germanic Invasions

In the fourth century, Roman society was not on the brink of collapse. Otherwise, it would be hard to explain why so many Germanic tribes and families aspired to the status of foederati (relatively autonomous vassals) and laeti (immigrants).
Contemporary German scholars are therefore entirely correct in disputing the label “barbaric invasions” to designate what, for centuries, had been a flow of immigrants that enriched and re-invigorated Roman society. African and Arabic nomadic populations were allowed to cross the borders, Bedouins were sometimes hired to garrison the frontier, and their chiefs could adopt Roman names and have their villas built next to Roman ones. Most of the time, frontier garrisons only controlled immigration and prevented violent confrontations between communities living along the border.
Along the Rhine frontier, Germanic peoples had gradually become sedentary and were less interested in waging wars than in establishing profitable commercial exchanges with the Mediterranean basin. Thousands of them served in the Roman army and died for the Roman emperors as mercenaries or soldiers and, more often than not, their skills and valour enabled them to reach the highest ranks of the military elite.
In the West, leaving aside the occasional forays of more aggressive Germanic populations like the Alamanni and the Franks, the situation was substantially under control and the rhetoric of the “Germanic menace” was generally regarded as outmoded. The Romans knew that they could strike hostile tribes and subdue them through ferocious retaliations, enslavement, and the forcible recruitment of their offspring into the army. Furthermore, they had realised that the so-called “Barbarians” represented a limitless source of cheap labour and valiant recruits.
These “Barbarians” were often poor people looking for a better life and, although the Roman/Greek elites never used terms such as “immigrants” or “refugees”, as we do now, their conceptualization of these foreigners living on Roman soil was reminiscent of today’s portrayal of refugees, asylum seekers, and legal or illegal immigrants in affluent countries.
The very same agencies that had been established to relocate Roman families who had lost their farms and shops during Germanic incursions, were also entrusted with the task of ensuring that these families and groups of “immigrants” could find a place to live, a job – as farmers, soldiers, craftsmen, miners, merchants, etc. – and education. In exchange, they were expected to “romanize” their manners, pay taxes, and have their children join the army. It was mainly thanks to the contribution of these new Roman citizens that the imperial economy recovered from the disastrous third century, and their absorption does not seem to have had dramatically destabilizing effects.
But things were different along the Danube frontier. The Eastern plains were ruled by nomadic tribes that had for centuries troubled all the classical civilizations: the Chinese, the Indians, the Persians and the Romans. Such peoples were ill at ease with the Roman sedentary lifestyle. Their restlessness was altogether incompatible with Roman civilization and incomprehensible to its chroniclers and amateur ethnographers, who identified civilization with sedentism and urban life. For the Romans, human beings were defined less by their lineage than by the place where they were born, and they were truly disconcerted by the nomadic way of life. For them, as for most Mediterranean peoples, including the Greeks, the only life worth living was the one led by city- and town-dwellers. True virtues, like Roman justice and “liberality”, could only be learned and practiced in a space enclosed by moenia (walls), that is, by adhering to the lifestyle of “civil society” (societas civilis). In Roman opinion, cities had the power to transform all those who entered them, from “savage” to “civilized”.
We can now understand why, at first, the Romans were persuaded that the Asiatic Huns were unassimilable. The iconography of Attila, the greatest of all Hunnish kings, bears testimony to this confrontation, for it borrowed from the Christian iconography of the devil.
The Greek historian Ammianus Marcellinus sensed the novelty of the Huns’ appearance. Germanic peoples were partly uncivilized, but nobody seriously questioned their membership in the human species. By contrast, at least initially, the Huns were portrayed as anthropoids, shorter and chubbier than the average human being, and their description tended to be more vivid and sensationalistic than rigorous and inquisitive. The Huns never settled down and seldom dismounted from their horses: they slept on them, negotiated on them, and tenderized raw meat beneath their saddles. They were irreligious and craved only gold.
Nevertheless, Romans had little or no interest in racial taxonomies, and preferred to deploy sociological categories in their classification of “the Other”. On this count, they were more anthropologically cognizant than many nineteenth-century Euro-American anthropologists and historians. What they quickly discerned was that peoples were not self-contained, impermeable wholes, and that Barbarian tribes were as ethnically heterogeneous as the Romans, and alternated migrations with long periods of respite, assimilation of Roman cultural traits and “Romanophobia”.
That their analysis was substantially correct is attested by the Gothic roots of the name of the first known Hunnish king, “Uldin”, which corresponded to the Gothic Wölflein (young wolf), and by the Gothic ending of the name of the most famous Hunnish king, Attila. This is a clear indication that these diverse ethnic groupings were generally brought together by charismatic and successful leaders, and rapidly dissolved after their death.
On the other hand, because of this sociological-geographical bias, the Romans, who could not rely on modern philological comparisons, believed that all Germanic “Barbarians” lived in Western Europe. They therefore classified the Eastern Goths, who really were Germanic, as nomadic people from the steppes, because they lived in the Balkans and were skilful horsemen, and could not easily distinguish them from the Huns, even though the latter were short and somatically Asian.
Nevertheless, because they admired their fighting skills, the Romans were willing to periodically hire Goths as mercenaries. They would transport them by boat, down the Danube, across the Black Sea and to the Eastern front, to die in Armenia, Syria, or Mesopotamia. In return, Gothic chiefs would receive money, subsidies, gifts and food provisions for their tribes, and gradually assume the functions, the attire, and the mentality of princes.
Without either people becoming aware of the long-term repercussions of this relationship, they grew mutually dependent, almost symbiotic: the Romans needed Gothic warriors, the Goths needed Roman supplies and converted to Christianity in the thousands, especially after Wulfila (311–383), a Gothic bishop, devised a special Gothic alphabet to translate the Bible from Greek into Gothic. This was an astonishing accomplishment, given that it occurred prior to the completion of the Latin translation, but it also triggered a series of persecutions of Gothic Christian preachers and believers on the part of those Goths who saw adherence to Christian values and principles as a betrayal of “Gothness” itself.
Even so, mutual suspicions and xenophobic sentiments could at times run deep, not least because the development of an “ethnic consciousness” best served the political aims and vested interests of the respective elites. The rhetorical device of an eternalized Roman/Germanic antagonism was so serviceable that, in AD 962, approximately five centuries after the demise of the Western Roman Empire, Liutprand, the bishop of Cremona, in Northern Italy, thought it necessary to lambaste the Romans with remarkable vehemence: “we Lombards, Saxons, Franks, Lotharingians, Bavarians, Swabians, and Burgundians have such utter contempt for the Romans that when we try to express our indignation we can find no term with which to insult our enemies more damaging than that of Romans. This single word means for us all that is ignoble, cowardly, sordid, obscene”. Several prominent Romans held Germanic and Hunnish peoples in much the same contempt. Take, for instance, Saint Ambrose, bishop of Milan and one of the founders of Roman Catholicism, who argued that promoting the abuse of alcohol among barbarians was a sensible way to sap their strength and spirit.
Yet, there is no proof that ideological constructs of ethnicity were very popular among ordinary people. Most contemporary scholars agree that the Byzantine ruling classes “created” the Slavs, while the Slavs themselves were unaware of being members of an allegedly cohesive and self-assertive ethnic group. In all likelihood, Germanic and Roman lay-people learnt to co-exist and accommodate the presence of strangers, as their ancestors had done for centuries, from the time of the Roman invasion of Gaul, and became increasingly open to intermarriage and cultural syncretism.
After all, Barbarian resettlements in Roman provinces were carefully planned operations, executed with the collaboration of Barbarian chiefs, who had no interest in shedding their people’s blood when they could reach a mutually advantageous agreement with the Romans and move to areas of the empire that were comparatively underpopulated. At least up to the end of the fourth century, no large-scale arbitrary expropriation took place.
But what about the Huns? When the Huns migrated into the Balkans from the Asian steppes, they signed a pact with the Romans: they would live in Hungary and protect the empire as foederati against the common enemy, the Goths. Therefore, initially, the relationship between Romans and Huns was one of full collaboration. In the year 400, a Gothic contingent of the Roman army under the command of Gainas attempted to occupy Constantinople. Most of them had to flee due to a popular uprising, while hundreds of them were slain by civilians inside an Orthodox church. Eventually, Gainas was defeated by the Huns, and Uldin had his head delivered to the emperor.
This was not an isolated instance of Hunnish aid to the Roman Empire. Thousands of Huns served the Roman emperors, and groups of them even fought against African Bedouins to protect the southern Roman border. In 406, a Barbarian coalition that was heading south was defeated by Stilicho at Fiesole, near Florence, again with the help of the Huns. Aetius himself, the “Last of the Romans”, lived three years among the Huns, learnt their language and customs, and became the commander of a large contingent of Hunnish mercenaries who participated in the periodical confrontations between emperors and usurpers.
In the 420s the Huns federated under the leadership of Attila’s uncle, Rua (Ruga), and gradually extended their hegemony from the Black Sea to the Rhine by means of a “carrot-and-stick strategy”. Attila (AD 406–453) ascended the throne in 434 and negotiated the Roman cession of Pannonia to the Hunnish empire. Then, in 437, together with the Romans led by Aetius, now commander-in-chief of the Western army, the Huns crushed the Burgundians, an unruly population of federati residing in South-West Germany.
This was to be the last alliance between Huns and Romans. Later on, Aetius struck several agreements with the Burgundians, the Franks, the Visigoths and other Germanic populations, who were allowed to peacefully resettle within the confines of the Western Empire. These separate agreements demonstrated Aetius’s foresight, for when Attila ordered his vassals – the Alans, the Burgundians, the Gepids, the Ostrogoths, the Thuringians, etc. – to march westwards, across Gaul, with the Hunnish army, the Romans could rely on the Frankish and Visigothic armies to stop his advance at Châlons-en-Champagne in 451.
Ironically, Attila’s death sealed the fate of both the Roman and the Hunnish empires. Afterwards, the peoples nominally subject to the two empires broke away from their former overlords and took the initiative, bringing the international balance of power to an abrupt end, together with the residual political authority of Rome. These peoples finally settled in various western provinces, and Germanic kings obtained from the Eastern emperors the authorization to rule over Western Romans on behalf of Constantinople. This marked the end of Late Antiquity and of Rome as a secular, imperial power.