During the 1800s, Christian missionaries from Great Britain and the United States followed traders to the Hawaiian Islands. Over the long term, the Anglo-Saxon presence eroded the regard Hawaiian royal women held for their own indigenous looks. For centuries prior to the arrival of Christians, indigenous Hawaiian aesthetics, such as dark skin and ample bodies, had been considered signs of nobility. No matter how much the women adapted their mannerisms to Western standards, some of the Anglo-Saxon missionaries were relentless in referring to them as "Hawaiian squaws." In the latter half of the 19th century, some Hawaiian women began marrying European men who found them exotic. The men, however, selected Hawaiian women who were thinner and paler in complexion.
Racial discrimination continued to be enacted in new laws in the 20th century; for instance, the one-drop rule was adopted in Virginia's 1924 Racial Integrity Act and in other southern states, in part influenced by the popularity of eugenics and ideas of racial purity. People buried fading memories that many whites had multiracial ancestry; many families were in fact multiracial. Similar laws had been proposed but not passed in the late nineteenth century in South Carolina and Virginia, for instance. After regaining political power in Southern states by disenfranchising blacks, white Democrats passed laws imposing Jim Crow and racial segregation to restore white supremacy. They maintained these until the 1960s and after, when they were forced to change by enforcement of federal legislation authorizing oversight of practices to protect the constitutional rights of African Americans and other minority citizens.
The phenomenon known as "passing as white" is difficult to explain in other countries or to foreign students. Typical questions are: "Shouldn't Americans say that a person who is passing as white is white, or nearly all white, and has previously been passing as black?" or "To be consistent, shouldn't you say that someone who is one-eighth white is passing as black?" ... A person who is one-fourth or less American Indian or Korean or Filipino is not regarded as passing if he or she intermarries with and joins fully the life of the dominant community, so the minority ancestry need not be hidden. ... It is often suggested that the key reason for this is that the physical differences between these other groups and whites are less pronounced than the physical differences between African blacks and whites, and therefore are less threatening to whites. ... [W]hen ancestry in one of these racial minority groups does not exceed one-fourth, a person is not defined solely as a member of that group.
Population testing is still being done. Some Native American groups that have been sampled may not have shared the pattern of markers being searched for. Geneticists acknowledge that DNA testing cannot yet distinguish among members of differing cultural Native American nations. There is genetic evidence for three major migrations into North America, but not for more recent historic differentiation. In addition, not all Native Americans have been tested, so scientists do not know for sure that Native Americans have only the genetic markers they have identified.
Some multiracial individuals feel marginalized by U.S. society. For example, when applying to schools or for a job, or when taking standardized tests, Americans are sometimes asked to check boxes corresponding to race or ethnicity. Typically, about five race choices are given, with the instruction to "check only one." While some surveys offer an "other" box, this choice groups together individuals of many different multiracial types (for example, people of mixed European and African descent are grouped with people of mixed Asian and Native American descent).
Prior to the one-drop rule, different states had different laws regarding color. More importantly, social acceptance often played a bigger role in how a person was perceived and how identity was construed than any law. In frontier areas, there were fewer questions about origins. The community looked at how people performed, whether they served in the militia and voted, which were the responsibilities and signs of free citizens. When questions about racial identity arose because of inheritance issues, for instance, litigation outcomes often were based on how people were accepted by neighbors.
Since the late twentieth century, the number of ethnic African immigrants from Africa and the Caribbean has increased in the United States. Together with publicity about the ancestry of President Barack Obama, whose father was from Kenya, this has led some black writers to argue that new terms are needed for recent immigrants. They suggest that the term "African-American" should refer strictly to the descendants of African slaves and free people of color who survived the slavery era in the United States. They argue that grouping together all ethnic Africans regardless of their unique ancestral circumstances would deny the lingering effects of slavery within the American slave-descendant community. They say recent ethnic African immigrants need to recognize their own unique ancestral backgrounds.
In the 1980s, parents of mixed-race children began to organize and lobby for the addition of a more inclusive term of racial designation that would reflect the heritage of their children. When the U.S. government proposed the addition of the category of "bi-racial" or "multiracial" in 1988, the response from the public was mostly negative. Some African-American organizations, and African-American political leaders, such as Congresswoman Diane Watson and Congressman Augustus Hawkins, were particularly vocal in their rejection of the category, as they feared the loss of political and economic power if African Americans reduced their numbers by self-identification.
The social identity of the children was strongly determined by the tribe's kinship system. Among the matrilineal tribes of the Southeast, the mixed-race children generally were accepted as and identified as Indian, as they gained their social status from their mother's clans and tribes, and often grew up with their mothers and their male relatives. By contrast, among the patrilineal Omaha, for example, the child of a white man and Omaha woman was considered "white"; such mixed-race children and their mothers would be protected, but the children could formally belong to the tribe as members only if adopted by a man.
In the late 19th century, three European-American middle-class female teachers married Indigenous American men they had met at Hampton Institute during the years when it ran its Indian program. In the late nineteenth century, Charles Eastman, a physician of European and Sioux ancestry who trained at Boston University, married Elaine Goodale, a European-American woman from New England. They met and worked together in Dakota Territory when she was Superintendent of Indian Education and he was a doctor for the reservations. His maternal grandfather was Seth Eastman, an artist and Army officer from New England, who had married a Sioux woman and had a daughter with her while stationed at Fort Snelling in Minnesota.
The writer Sherrel W. Stewart's assertion that "most" African Americans have significant Native American heritage is not supported by genetic researchers who have done extensive population-mapping studies. The TV series on African-American ancestry hosted by the scholar Henry Louis Gates, Jr. featured genetics scholars who discussed in detail the variety of ancestries among African Americans. They noted that there is a popular belief in a high rate of Native American admixture that is not supported by the data collected.[citation needed]
Interracial relationships have had a long history in North America and the United States, beginning with the intermixing of European explorers and soldiers, who took native women as companions. After European settlement increased, traders and fur trappers often married or had unions with women of native tribes. In the 17th century, faced with a continuing, critical labor shortage, colonists, primarily in the Chesapeake Bay Colony, imported Africans as laborers, sometimes as indentured servants and, increasingly, as slaves. African slaves were also imported into New York and other northern ports by the Dutch and later the English. Some African slaves were freed by their masters during these early years.
Of numerous relationships between male slaveholders, overseers, or master's sons and women slaves, the most notable is likely that of President Thomas Jefferson with his slave Sally Hemings. As noted in the 2012 collaborative Smithsonian-Monticello exhibit, Slavery at Monticello: The Paradox of Liberty, Jefferson, then a widower, took Hemings as his concubine for nearly 40 years. They had six children of record; four Hemings children survived into adulthood, and he freed them all, among the very few slaves he freed. Two were allowed to "escape" to the North in 1822, and two were granted freedom by his will upon his death in 1826. Seven-eighths white by ancestry, all four of his Hemings children moved to northern states as adults; three of the four entered the white community, and all their descendants identified as white. Of the descendants of Madison Hemings, who continued to identify as black, some in future generations eventually identified as white and "married out", while others continued to identify as African American. It was socially advantageous for the Hemings children to identify as white, in keeping with their appearance and the majority proportion of their ancestry. Although born into slavery, the Hemings children were legally white under Virginia law of the time.
After the Civil War, racial segregation forced African Americans to share more of a common lot in society than they otherwise might have, given their widely varying ancestry and educational and economic levels. The binary division altered the separate status of the traditionally free people of color in Louisiana, for instance, although they maintained a strong Louisiana Créole culture related to French culture and language, and the practice of Catholicism. African Americans began to make common cause regardless of their multiracial admixture or social and economic stratification. In the 20th century, during the rise of the Civil Rights and Black Power movements, the African-American community increased its own pressure for people with any portion of African descent to be claimed by the black community in order to add to its power.
Chinese men entered the United States as laborers, primarily on the West Coast and in western territories. Following the Reconstruction era, as blacks set up independent farms, white planters imported Chinese laborers to satisfy their need for labor. In 1882, the Chinese Exclusion Act was passed, and Chinese workers who chose to stay in the U.S. were unable to have their wives join them. In the South, some Chinese married into the black and mulatto communities, as generally discrimination meant they did not take white spouses. They rapidly left working as laborers, and set up groceries in small towns throughout the South. They worked to get their children educated and socially mobile.
Multiracial people who wanted to acknowledge their full heritage won a victory of sorts in 1997, when the Office of Management and Budget (OMB) changed the federal regulation of racial categories to permit multiple responses. This resulted in a change to the 2000 United States Census, which allowed participants to select more than one of the six available categories, which were, in brief: "White," "Black or African American," "Asian," "American Indian or Alaskan Native," "Native Hawaiian or other Pacific Islander," and "Other." Further details are given in the article: Race (U.S. census). The OMB made its directive mandatory for all government forms by 2003.
Laws dating from 17th-century colonial America defined children of African slave mothers as taking the status of their mothers and thus born into slavery, regardless of the race or status of the father, under partus sequitur ventrem. The association of slavery with a "race" led to slavery as a racial caste. But most families of free people of color formed in Virginia before the American Revolution were the descendants of unions between white women and African men, who frequently worked and lived together in the looser conditions of the early colonial period. While interracial marriage was later prohibited, white men frequently took sexual advantage of slave women, and numerous generations of multiracial children were born. By the late 1800s it had become common among African Americans to use passing to gain educational opportunities, as did Anita Florence Hemmings, the first African-American graduate of Vassar College. Some 19th-century categorization schemes defined people by proportion of African ancestry: a person whose parents were black and white was classified as mulatto, a person with one black grandparent and three white grandparents as quadroon, and a person with one black great-grandparent and the remainder white as octoroon. The latter categories remained within an overall black or colored category, but before the Civil War, in Virginia and some other states, a person of one-eighth or less black ancestry was legally white. Some members of these categories passed temporarily or permanently as white.
Reacting to media criticism of Michelle Obama during the 2008 presidential election, Charles Kenzie Steele, Jr., CEO of the Southern Christian Leadership Conference, said, "Why are they attacking Michelle Obama, and not really attacking, to that degree, her husband? Because he has no slave blood in him." He later claimed his comment was intended to be "provocative" but declined to expand on the subject. Former Secretary of State Condoleezza Rice (who was famously mistaken for a "recent American immigrant" by French President Nicolas Sarkozy) said, "descendants of slaves did not get much of a head start, and I think you continue to see some of the effects of that." She has also rejected an immigrant designation for African Americans and instead prefers the term "black" or "white".
Some early male settlers married Indigenous American women and had informal unions with them. Early contact between Indigenous Americans and Europeans was often charged with tension, but also had moments of friendship, cooperation, and intimacy. Marriages took place in both English and Latin colonies between European men and Native women. For instance, on April 5, 1614, Pocahontas, a Powhatan woman in present-day Virginia, married the Englishman John Rolfe of Jamestown. Their son Thomas Rolfe was an ancestor to many descendants in First Families of Virginia. As a result, English laws did not exclude people with some Indigenous American ancestry from being considered English or white.
Colonial records of French and Spanish slave ships and sales, and plantation records in all the former colonies, often have much more information about slaves, from which researchers are reconstructing slave family histories. Genealogists have begun to find plantation records, court records, land deeds and other sources to trace African-American families and individuals before 1870. As slaves were generally forbidden to learn to read and write, black families passed along oral histories, which have had great persistence. Similarly, Native Americans did not generally learn to read and write English, although some did in the nineteenth century. Until 1930, census enumerators used the terms free people of color and mulatto to classify people of apparent mixed race. When those terms were dropped, as a result of the lobbying by the Southern Congressional bloc, the Census Bureau used only the binary classifications of black or white, as was typical in segregated southern states.
European colonists created treaties with Indigenous American tribes requesting the return of any runaway slaves. For example, in 1726, the British governor of New York exacted a promise from the Iroquois to return all runaway slaves who had joined them. This same promise was extracted from the Huron Nation in 1764, and from the Delaware Nation in 1765, though there is no record of slaves ever being returned. Numerous advertisements requested the return of African Americans who had married Indigenous Americans or who spoke an Indigenous American language. The primary exposure that Africans and Indigenous Americans had to each other came through the institution of slavery. Indigenous Americans learned that Africans had what Indigenous Americans considered 'Great Medicine' in their bodies because Africans were virtually immune to the Old-World diseases that were decimating most native populations. Because of this, many tribes encouraged marriage between the two groups, to create stronger, healthier children from the unions.
Interracial relationships, common-law marriages, and marriages occurred since the earliest colonial years, especially before slavery hardened as a racial caste associated with people of African descent in the British colonies. Virginia and other English colonies passed laws in the 17th century that gave children the social status of their mother, according to the principle of partus sequitur ventrem, regardless of the father's race or citizenship. This overturned the principle in English common law by which a man gave his status to his children – this had enabled communities to demand that fathers support their children, whether legitimate or not. The change increased white men's ability to use slave women sexually, as they had no responsibility for the children. As master as well as father of mixed-race children born into slavery, the men could use these people as servants or laborers or sell them as slaves. In some cases, white fathers provided for their multiracial children, paying or arranging for education or apprenticeships and freeing them, particularly during the two decades following the American Revolution. (The practice of providing for the children was more common in French and Spanish colonies, where a class of free people of color developed who became educated and property owners.) Many other white fathers abandoned the mixed-race children and their mothers to slavery.
Many Latin American migrants have been mestizo, Amerindian, or of other mixed race. Multiracial Latinos have limited media presence; critics have accused the U.S. Hispanic media of overlooking the brown-skinned indigenous, multiracial, and black Hispanic populations by over-representing blond, blue- or green-eyed white Hispanic and Latino Americans (who resemble Scandinavians and other Northern Europeans more than they resemble the majority of white Hispanic and Latino Americans, who have typical Southern European features), as well as light-skinned mulatto and mestizo Hispanic and Latino Americans (often regarded as white within U.S. Hispanic and Latino communities if they attain middle-class or higher social status), especially among the actors on telenovelas.
In Virginia prior to 1920, for example, a person was legally white if having seven-eighths or more white ancestry. The one-drop rule originated in some Southern states in the late 19th century, likely in response to whites' attempts to maintain white supremacy and limit black political power following the Democrats' regaining control of state legislatures in the late 1870s. The first year in which the U.S. Census dropped the mulatto category was 1920; that year enumerators were instructed to classify people in a binary way as white or black. This was a result of the Southern-dominated Congress convincing the Census Bureau to change its rules.
Stanley Crouch wrote in a New York Daily News piece, "Obama's mother is of white U.S. stock. His father is a black Kenyan," in a column entitled "What Obama Isn't: Black Like Me." During the 2008 campaign, the African-American columnist David Ehrenstein of the LA Times accused white liberals of flocking to Obama because he was a "Magic Negro", a term that refers to a black person with no past who simply appears to assist the agenda of mainstream whites, who remain the cultural protagonists and drivers. Ehrenstein went on to say, "He's there to assuage white 'guilt' they feel over the role of slavery and racial segregation in American history."
For the write-in response category, the 2000 U.S. Census had a code listing that standardized the placement of various write-in responses within the framework of the Census's enumerated races. Whereas most responses can be distinguished as falling into one of the five enumerated races, there remain some write-in responses which fall under the "Mixture" heading and cannot be racially categorized. These include "Bi Racial, Combination, Everything, Many, Mixed, Multi National, Multiple, Several and Various".
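As a rough illustration of how such a code listing might map free-text write-ins either to one of the enumerated races or to the "Mixture" heading, the short Python sketch below uses a small, hypothetical lookup table; the specific write-in strings and their assignments are invented for illustration and are not the Census Bureau's actual code list.

    # Hypothetical mapping of write-in responses to 2000 Census race categories.
    # The assignments below are illustrative only, not the official code listing.
    WRITE_IN_CODES = {
        "german": "White",
        "nigerian": "Black or African American",
        "korean": "Asian",
        "navajo": "American Indian or Alaska Native",
        "samoan": "Native Hawaiian or Other Pacific Islander",
        # Responses that cannot be assigned to a single race fall under "Mixture".
        "bi racial": "Mixture",
        "mixed": "Mixture",
        "multiple": "Mixture",
        "everything": "Mixture",
    }

    def classify_write_in(response: str) -> str:
        """Return the coded category for a write-in, or 'Uncoded' if unknown."""
        return WRITE_IN_CODES.get(response.strip().lower(), "Uncoded")

    if __name__ == "__main__":
        for resp in ["Korean", "Bi Racial", "Everything"]:
            print(resp, "->", classify_write_in(resp))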
Interracial relations between Indigenous Americans and African Americans are a part of American history that has been neglected. The earliest record of African and Indigenous American relations in the Americas occurred in April 1502, when the first kidnapped Africans were brought to Hispaniola to serve as slaves. Some escaped, and somewhere inland on Santo Domingo the first Black Indians were born. An example of African slaves escaping from European colonists and being absorbed by Indigenous Americans occurred as far back as 1526. In June of that year, Lucas Vasquez de Ayllon established a Spanish colony near the mouth of the Pee Dee River in what is now eastern South Carolina. The Spanish settlement was named San Miguel de Gualdape. Among the settlers were 100 enslaved Africans. Later that year, the first African slaves fled the colony and took refuge with local Indigenous Americans.
Some biographical accounts include the autobiography Life on the Color Line: The True Story of a White Boy Who Discovered He Was Black by Gregory Howard Williams; One Drop: My Father's Hidden Life—A Story of Race and Family Secrets written by Bliss Broyard about her father Anatole Broyard; the documentary Colored White Boy about a white man in North Carolina who discovers that he is the descendant of a white plantation owner and a raped African slave; and the documentary on The Sanders Women of Shreveport, Louisiana.
By the 1980s, parents of mixed-race children (and adults of mixed-race ancestry) began to organize and lobby for the ability to show more than one ethnic category on Census and other legal forms. They refused to be put into just one category. When the U.S. government proposed the addition of the category of "bi-racial" or "multiracial" in 1988, the response from the general public was mostly negative. Some African-American organizations and political leaders, such as Senator Diane Watson and Representative Augustus Hawkins, were particularly vocal in their rejection of the category. They feared a loss in political and economic power if African Americans abandoned their one category.
In the early 19th century, the Indigenous American woman Sacagawea, who would help translate for and guide the Lewis and Clark Expedition in the West, married the French trapper Toussaint Charbonneau. Most marriages between Europeans and Indigenous Americans were between European men and Indigenous American women. Depending on the kinship system of the woman's tribe, their children would be more or less easily assimilated into the tribe. Nations that had matrilineal systems, such as the Creek and Cherokee in the Southeast, gave the mixed-race children status in their mother's clans and tribes. If the tribe had a patrilineal system, like the Omaha, the children of white fathers were considered white. Unless they were specifically adopted into the tribe by an adult male, they could have no social status in it.
For African Americans, the one-drop rule was a significant factor in ethnic solidarity. African Americans generally shared a common cause in society regardless of their multiracial admixture or social and economic stratification. Additionally, African Americans found it nearly impossible to learn about their Indigenous American heritage, as many family elders withheld pertinent genealogical information. Tracing the genealogy of African Americans can be a very difficult process, especially for descendants of Indigenous Americans, because African Americans who were slaves were forbidden to learn to read and write, and a majority of Indigenous Americans neither spoke English nor read or wrote it.
The figure of the "tragic octoroon" was a stock character of abolitionist literature: a mixed-race woman raised as if a white woman in her white father's household, until his bankruptcy or death reduces her to a menial position. She may even be unaware of her status before being reduced to victimization. The first character of this type was the heroine of Lydia Maria Child's short story "The Quadroons" (1842). This character allowed abolitionists to draw attention to the sexual exploitation in slavery and, unlike portrayals of the suffering of the field hands, did not allow slaveholders to retort that the sufferings of Northern mill hands were no easier: the Northern mill owner would not sell his own children into slavery.
By the 1890s the profound effect of adrenal extracts on many different tissue types had been discovered, setting off a search both for the mechanism of chemical signalling and efforts to exploit these observations for the development of new drugs. The blood pressure raising and vasoconstrictive effects of adrenal extracts were of particular interest to surgeons as hemostatic agents and as treatment for shock, and a number of companies developed products based on adrenal extracts containing varying purities of the active substance. In 1897 John Abel of Johns Hopkins University identified the active principle as epinephrine, which he isolated in an impure state as the sulfate salt. Industrial chemist Jokichi Takamine later developed a method for obtaining epinephrine in a pure state, and licensed the technology to Parke Davis. Parke Davis marketed epinephrine under the trade name Adrenalin. Injected epinephrine proved to be especially efficacious for the acute treatment of asthma attacks, and an inhaled version was sold in the United States until 2011 (Primatene Mist). By 1929 epinephrine had been formulated into an inhaler for use in the treatment of nasal congestion.
While highly effective, the requirement for injection limited the use of norepinephrine[clarification needed] and orally active derivatives were sought. A structurally similar compound, ephedrine, was identified by Japanese chemists in the Ma Huang plant and marketed by Eli Lilly as an oral treatment for asthma. Following the work of Henry Dale and George Barger at Burroughs-Wellcome, academic chemist Gordon Alles synthesized amphetamine and tested it in asthma patients in 1929. The drug proved to have only modest anti-asthma effects, but produced sensations of exhilaration and palpitations. Amphetamine was developed by Smith, Kline and French as a nasal decongestant under the trade name Benzedrine Inhaler. Amphetamine was eventually developed for the treatment of narcolepsy, post-encephalitic parkinsonism, and mood elevation in depression and other psychiatric indications. It received approval as a New and Nonofficial Remedy from the American Medical Association for these uses in 1937 and remained in common use for depression until the development of tricyclic antidepressants in the 1960s.
A series of experiments performed from the late 1800s to the early 1900s revealed that diabetes is caused by the absence of a substance normally produced by the pancreas. In 1889, Oskar Minkowski and Joseph von Mering found that diabetes could be induced in dogs by surgical removal of the pancreas. In 1921, Canadian professor Frederick Banting and his student Charles Best repeated this study and found that injections of pancreatic extract reversed the symptoms produced by pancreas removal. Soon, the extract was demonstrated to work in people, but development of insulin therapy as a routine medical procedure was delayed by difficulties in producing the material in sufficient quantity and with reproducible purity. The researchers sought assistance from industrial collaborators at Eli Lilly and Co. based on the company's experience with large-scale purification of biological materials. Chemist George Walden of Eli Lilly and Company found that careful adjustment of the pH of the extract allowed a relatively pure grade of insulin to be produced. Under pressure from the University of Toronto and a potential patent challenge by academic scientists who had independently developed a similar purification method, an agreement was reached for non-exclusive production of insulin by multiple companies. Prior to the discovery and widespread availability of insulin therapy, the life expectancy of diabetics was only a few months.
In 1903 Hermann Emil Fischer and Joseph von Mering disclosed their discovery that diethylbarbituric acid, formed from the reaction of diethylmalonic acid, phosphorus oxychloride and urea, induces sleep in dogs. The discovery was patented and licensed to Bayer pharmaceuticals, which marketed the compound under the trade name Veronal as a sleep aid beginning in 1904. Systematic investigations of the effect of structural changes on potency and duration of action led to the discovery of phenobarbital at Bayer in 1911 and the discovery of its potent anti-epileptic activity in 1912. Phenobarbital was among the most widely used drugs for the treatment of epilepsy through the 1970s and, as of 2014, remains on the World Health Organization's list of essential medications. The 1950s and 1960s saw increased awareness of the addictive properties and abuse potential of barbiturates and amphetamines, which led to increasing restrictions on their use and growing government oversight of prescribers. Today, amphetamine is largely restricted to use in the treatment of attention deficit disorder, and phenobarbital to the treatment of epilepsy.
In 1911 arsphenamine, the first synthetic anti-infective drug, was developed by Paul Ehrlich and chemist Alfred Bertheim of the Institute of Experimental Therapy in Berlin. The drug was given the commercial name Salvarsan. Ehrlich, noting both the general toxicity of arsenic and the selective absorption of certain dyes by bacteria, hypothesized that an arsenic-containing dye with similar selective absorption properties could be used to treat bacterial infections. Arsphenamine was prepared as part of a campaign to synthesize a series of such compounds, and found to exhibit partially selective toxicity. Arsphenamine proved to be the first effective treatment for syphilis, a disease which prior to that time was incurable and led inexorably to severe skin ulceration, neurological damage, and death.[citation needed]
The modern pharmaceutical industry traces its roots to two sources. The first of these were local apothecaries that expanded from their traditional role of distributing botanical drugs such as morphine and quinine to wholesale manufacture in the mid-1800s. Rational drug discovery from plants started particularly with the isolation of morphine, an analgesic and sleep-inducing agent in opium, by the German apothecary assistant Friedrich Sertürner, who named the compound after the Greek god of dreams, Morpheus. Multinational corporations including Merck, Hoffmann-La Roche, Burroughs-Wellcome (now part of GlaxoSmithKline), Abbott Laboratories, Eli Lilly and Upjohn (now part of Pfizer) began as local apothecary shops in the mid-1800s. By the late 1880s, German dye manufacturers had perfected the purification of individual organic compounds from coal tar and other mineral sources and had also established rudimentary methods in organic chemical synthesis. The development of synthetic chemical methods allowed scientists to systematically vary the structure of chemical substances, and growth in the emerging science of pharmacology expanded their ability to evaluate the biological effects of these structural changes.
Ehrlich's approach of systematically varying the chemical structure of synthetic compounds and measuring the effects of these changes on biological activity was pursued broadly by industrial scientists, including Bayer scientists Josef Klarer, Fritz Mietzsch, and Gerhard Domagk. This work, also based on the testing of compounds available from the German dye industry, led to the development of Prontosil, the first representative of the sulfonamide class of antibiotics. Compared to arsphenamine, the sulfonamides had a broader spectrum of activity and were far less toxic, rendering them useful for infections caused by pathogens such as streptococci. In 1939, Domagk received the Nobel Prize in Medicine for this discovery. Nonetheless, the dramatic decrease in deaths from infectious diseases that occurred prior to World War II was primarily the result of improved public health measures such as clean water and less crowded housing, and the impact of anti-infective drugs and vaccines was significant mainly after World War II.
Early progress toward the development of vaccines occurred throughout this period, primarily in the form of academic and government-funded basic research directed toward the identification of the pathogens responsible for common communicable diseases. In 1885 Louis Pasteur and Pierre Paul Émile Roux created the first rabies vaccine. The first diphtheria vaccines were produced in 1914 from a mixture of diphtheria toxin and antitoxin (produced from the serum of an inoculated animal), but the safety of the inoculation was marginal and it was not widely used. The United States recorded 206,000 cases of diphtheria in 1921, resulting in 15,520 deaths. In 1923 parallel efforts by Gaston Ramon at the Pasteur Institute and Alexander Glenny at the Wellcome Research Laboratories (later part of GlaxoSmithKline) led to the discovery that a safer vaccine could be produced by treating diphtheria toxin with formaldehyde. In 1944, Maurice Hilleman of Squibb Pharmaceuticals developed the first vaccine against Japanese encephalitis. Hilleman would later move to Merck, where he would play a key role in the development of vaccines against measles, mumps, chickenpox, rubella, hepatitis A, hepatitis B, and meningitis.
In 1937 over 100 people died after ingesting "Elixir Sulfanilamide" manufactured by S.E. Massengill Company of Tennessee. The product was formulated in diethylene glycol, a highly toxic solvent that is now widely used as antifreeze. Under the laws extant at that time, prosecution of the manufacturer was possible only under the technicality that the product had been called an "elixir", which literally implied a solution in ethanol. In response to this episode, the U.S. Congress passed the Federal Food, Drug, and Cosmetic Act of 1938, which for the first time required pre-market demonstration of safety before a drug could be sold, and explicitly prohibited false therapeutic claims.
The aftermath of World War II saw an explosion in the discovery of new classes of antibacterial drugs, including the cephalosporins (developed by Eli Lilly based on the seminal work of Giuseppe Brotzu and Edward Abraham), streptomycin, the tetracyclines (discovered at Lederle Laboratories, now a part of Pfizer), and erythromycin (discovered at Eli Lilly and Co.), and their extension to an increasingly wide range of bacterial pathogens. Streptomycin, discovered during a Merck-funded research program in Selman Waksman's laboratory at Rutgers in 1943, became the first effective treatment for tuberculosis. At the time of its discovery, sanatoriums for the isolation of tuberculosis-infected people were a ubiquitous feature of cities in developed countries, with 50% of those admitted dying within 5 years of admission.
During the years 1940-1955, the rate of decline in the U.S. death rate accelerated from 2% per year to 8% per year, then returned to the historical rate of 2% per year. The dramatic decline in the immediate post-war years has been attributed to the rapid development of new treatments and vaccines for infectious disease that occurred during these years. Vaccine development continued to accelerate, with the most notable achievement of the period being Jonas Salk's 1954 development of the polio vaccine, funded by the non-profit National Foundation for Infantile Paralysis. The vaccine process was never patented but was instead given to pharmaceutical companies to manufacture as a low-cost generic. In 1960 Maurice Hilleman of Merck Sharp & Dohme identified the SV40 virus, which was later shown to cause tumors in many mammalian species. It was later determined that SV40 was present as a contaminant in polio vaccine lots that had been administered to 90% of the children in the United States. The contamination appears to have originated both in the original cell stock and in monkey tissue used for production. In 2004 the United States National Cancer Institute announced that it had concluded that SV40 is not associated with cancer in people.
On 2 July 2012, GlaxoSmithKline pleaded guilty to criminal charges and agreed to a $3 billion settlement of the largest health-care fraud case in the U.S. and the largest payment by a drug company. The settlement is related to the company's illegal promotion of prescription drugs, its failure to report safety data, bribing doctors, and promoting medicines for uses for which they were not licensed. The drugs involved were Paxil, Wellbutrin, Advair, Lamictal, and Zofran for off-label, non-covered uses. Those and the drugs Imitrex, Lotronex, Flovent, and Valtrex were involved in the kickback scheme.
In the US, starting in 2013, under the Physician Financial Transparency Reports (part of the Sunshine Act), the Centers for Medicare & Medicaid Services has to collect information from applicable manufacturers and group purchasing organizations in order to report information about their financial relationships with physicians and hospitals. Data are made public on the Centers for Medicare & Medicaid Services website. The expectation is that the relationship between doctors and the pharmaceutical industry will become fully transparent.
A Federal Trade Commission report issued in 1958 attempted to quantify the effect of antibiotic development on American public health. The report found that over the period 1946-1955, there was a 42% drop in the incidence of diseases for which antibiotics were effective and only a 20% drop in those for which antibiotics were not effective. The report concluded that "it appears that the use of antibiotics, early diagnosis, and other factors have limited the epidemic spread and thus the number of these diseases which have occurred". The study further examined mortality rates for eight common diseases for which antibiotics offered effective therapy (syphilis, tuberculosis, dysentery, scarlet fever, whooping cough, meningococcal infections, and pneumonia), and found a 56% decline over the same period. Notable among these was a 75% decline in deaths due to tuberculosis.
In March 2001, 40 multi-national pharmaceutical companies brought litigation against South Africa for its Medicines Act, which allowed the generic production of antiretroviral drugs (ARVs) for treating HIV, despite the fact that these drugs were on-patent. HIV was and is an epidemic in South Africa, and ARVs at the time cost between 10,000 and 15,000 USD per patient per year. This was unaffordable for most South African citizens, and so the South African government committed to providing ARVs at prices closer to what people could afford. To do so, they would need to ignore the patents on drugs and produce generics within the country (using a compulsory license), or import them from abroad. After international protest in favour of public health rights (including the collection of 250,000 signatures by MSF), the governments of several developed countries (including The Netherlands, Germany, France, and later the US) backed the South African government, and the case was dropped in April of that year.
Prior to the 20th century, drugs were generally produced by small-scale manufacturers with little regulatory control over manufacturing or claims of safety and efficacy. To the extent that such laws did exist, enforcement was lax. In the United States, increased regulation of vaccines and other biological drugs was spurred by tetanus outbreaks and deaths caused by the distribution of contaminated smallpox vaccine and diphtheria antitoxin. The Biologics Control Act of 1902 required that the federal government grant premarket approval for every biological drug and for the process and facility producing such drugs. This was followed in 1906 by the Pure Food and Drugs Act, which forbade the interstate distribution of adulterated or misbranded foods and drugs. A drug was considered misbranded if it contained alcohol, morphine, opium, cocaine, or any of several other potentially dangerous or addictive drugs, and if its label failed to indicate the quantity or proportion of such drugs. The government's attempts to use the law to prosecute manufacturers for making unsupported claims of efficacy were undercut by a Supreme Court ruling restricting the federal government's enforcement powers to cases of incorrect specification of the drug's ingredients.
Patents have been criticized in the developing world, as they are thought to reduce access to existing medicines. Reconciling patents and universal access to medicine would require an efficient international policy of price discrimination. Moreover, under the TRIPS agreement of the World Trade Organization, countries must allow pharmaceutical products to be patented. In 2001, the WTO adopted the Doha Declaration, which indicates that the TRIPS agreement should be read with the goals of public health in mind, and allows some methods for circumventing pharmaceutical monopolies: via compulsory licensing or parallel imports, even before patent expiration.
Pharmaceutical fraud involves deceptions that bring financial gain to a pharmaceutical company. It affects individuals and public and private insurers. There are several different schemes used to defraud the health care system which are particular to the pharmaceutical industry. These include: Good Manufacturing Practice (GMP) violations, off-label marketing, best price fraud, CME fraud, Medicaid price reporting, and manufactured compound drugs. In FY 2010, $2.5 billion was recovered through False Claims Act cases. Examples of fraud cases include the GlaxoSmithKline $3 billion settlement, the Pfizer $2.3 billion settlement, and the Merck & Co. $650 million settlement. Damages from fraud can be recovered by use of the False Claims Act, most commonly under the qui tam provisions, which reward an individual for being a "whistleblower", or relator.
A 2009 Cochrane review concluded that thiazide antihypertensive drugs reduce the risk of death (RR 0.89), stroke (RR 0.63), coronary heart disease (RR 0.84), and cardiovascular events (RR 0.70) in people with high blood pressure. In the ensuing years other classes of antihypertensive drugs were developed and found wide acceptance in combination therapy, including loop diuretics (Lasix/furosemide, Hoechst Pharmaceuticals, 1963), beta blockers (ICI Pharmaceuticals, 1964), ACE inhibitors, and angiotensin receptor blockers. ACE inhibitors reduce the risk of new-onset kidney disease (RR 0.71) and death (RR 0.84) in diabetic patients, irrespective of whether they have hypertension.
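To make the cited relative risks concrete, the short Python sketch below converts a relative risk into an absolute risk reduction and a number needed to treat; the baseline (control-group) event rate used here is an assumed illustrative figure, not one taken from the Cochrane review.

    # Convert a relative risk (RR) into absolute risk reduction (ARR) and
    # number needed to treat (NNT), given an assumed control-group event rate.
    def arr_and_nnt(relative_risk: float, control_risk: float):
        treated_risk = relative_risk * control_risk
        arr = control_risk - treated_risk   # absolute risk reduction
        nnt = 1.0 / arr                     # patients treated to prevent one event
        return arr, nnt

    # Example: RR 0.70 for cardiovascular events (from the review), with an
    # assumed 10% event rate in untreated patients over the study period.
    arr, nnt = arr_and_nnt(0.70, 0.10)
    print(f"ARR = {arr:.3f}, NNT = {nnt:.0f}")   # ARR = 0.030, NNT = 33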
Others have argued that excessive regulation suppresses therapeutic innovation, and that the current cost of regulator-required clinical trials prevents the full exploitation of new genetic and biological knowledge for the treatment of human disease. A 2012 report by the President's Council of Advisors on Science and Technology made several key recommendations to reduce regulatory burdens to new drug development, including 1) expanding the FDA's use of accelerated approval processes, 2) creating an expedited approval pathway for drugs intended for use in narrowly defined populations, and 3) undertaking pilot projects designed to evaluate the feasibility of a new, adaptive drug approval process.
In 1952 researchers at Ciba discovered the first orally available vasodilator, hydralazine. A major shortcoming of hydralazine monotherapy was that it lost its effectiveness over time (tachyphylaxis). In the mid-1950s Karl H. Beyer, James M. Sprague, John E. Baer, and Frederick C. Novello of Merck and Co. discovered and developed chlorothiazide, which remains the most widely used antihypertensive drug today. This development was associated with a substantial decline in the mortality rate among people with hypertension. The inventors were recognized by a Public Health Lasker Award in 1975 for "the saving of untold thousands of lives and the alleviation of the suffering of millions of victims of hypertension".
In the U.S., a push for revisions of the FD&C Act emerged from Congressional hearings led by Senator Estes Kefauver of Tennessee in 1959. The hearings covered a wide range of policy issues, including advertising abuses, questionable efficacy of drugs, and the need for greater regulation of the industry. While momentum for new legislation temporarily flagged under extended debate, a new tragedy emerged that underscored the need for more comprehensive regulation and provided the driving force for the passage of new laws.
Other notable new vaccines of the period include those for measles (1962, John Franklin Enders of Children's Medical Center Boston, later refined by Maurice Hilleman at Merck), rubella (1969, Hilleman, Merck) and mumps (1967, Hilleman, Merck). The United States incidences of rubella, congenital rubella syndrome, measles, and mumps all fell by >95% in the immediate aftermath of widespread vaccination. The first 20 years of licensed measles vaccination in the U.S. prevented an estimated 52 million cases of the disease, 17,400 cases of mental retardation, and 5,200 deaths.
The thalidomide tragedy resurrected Kefauver's bill to enhance drug regulation that had stalled in Congress, and the Kefauver-Harris Amendment became law on 10 October 1962. Manufacturers henceforth had to prove to the FDA that their drugs were effective as well as safe before they could go on the US market. The FDA received authority to regulate advertising of prescription drugs and to establish good manufacturing practices. The law required that all drugs introduced between 1938 and 1962 had to be shown to be effective. An FDA-National Academy of Sciences collaborative study showed that nearly 40 percent of these products were not effective. A similarly comprehensive study of over-the-counter products began ten years later.
The firm continued to pressure Kelsey and the agency to approve the application—until November 1961, when the drug was pulled off the German market because of its association with grave congenital abnormalities. Several thousand newborns in Europe and elsewhere suffered the teratogenic effects of thalidomide. Though the drug was never approved in the USA, the firm distributed Kevadon to over 1,000 physicians there under the guise of investigational use. Over 20,000 Americans received thalidomide in this "study," including 624 pregnant patients, and about 17 known newborns suffered the effects of the drug.[citation needed]
Prior to the Second World War, birth control was prohibited in many countries, and in the United States even the discussion of contraceptive methods sometimes led to prosecution under Comstock laws. The history of the development of oral contraceptives is thus closely tied to the birth control movement and the efforts of activists Margaret Sanger, Mary Dennett, and Emma Goldman. Based on fundamental research performed by Gregory Pincus and synthetic methods for progesterone developed by Carl Djerassi at Syntex and by Frank Colton at G.D. Searle & Co., the first oral contraceptive, Enovid, was developed by G.D. Searle and Co. and approved by the FDA in 1960. The original formulation incorporated vastly excessive doses of hormones and caused severe side effects. Nonetheless, by 1962, 1.2 million American women were on the pill, and by 1965 the number had increased to 6.5 million. The availability of a convenient form of temporary contraceptive led to dramatic changes in social mores, including expanding the range of lifestyle options available to women, reducing the reliance of women on men for contraceptive practice, encouraging the delay of marriage, and increasing pre-marital co-habitation.
In April 1994, the results of a Merck-sponsored study, the Scandinavian Simvastatin Survival Study, were announced. Researchers tested simvastatin, later sold by Merck as Zocor, on 4,444 patients with high cholesterol and heart disease. After five years, the study concluded that the patients saw a 35% reduction in their cholesterol, and their chances of dying of a heart attack were reduced by 42%. In 1995, Zocor and Mevacor both made Merck over US$1 billion. Akira Endo was awarded the 2006 Japan Prize and the 2008 Lasker-DeBakey Clinical Medical Research Award for his "pioneering research into a new class of molecules" for "lowering cholesterol."
Drug discovery is the process by which potential drugs are discovered or designed. In the past most drugs have been discovered either by isolating the active ingredient from traditional remedies or by serendipitous discovery. Modern biotechnology often focuses on understanding the metabolic pathways related to a disease state or pathogen, and manipulating these pathways using molecular biology or biochemistry. A great deal of early-stage drug discovery has traditionally been carried out by universities and research institutions.
Drug discovery and development is very expensive; of all compounds investigated for use in humans, only a small fraction are eventually approved in most nations by government-appointed medical institutions or boards, which have to approve new drugs before they can be marketed in those countries. In 2010, the FDA approved 18 NMEs (new molecular entities) and three biologics, 21 in total, down from 26 in 2009 and 24 in 2008. On the other hand, there were only 18 approvals in total in 2007 and 22 in 2006. Since 2001, the Center for Drug Evaluation and Research has averaged 22.9 approvals a year. This approval comes only after heavy investment in pre-clinical development and clinical trials, as well as a commitment to ongoing safety monitoring. Drugs which fail part-way through this process often incur large costs while generating no revenue in return. If the cost of these failed drugs is taken into account, the cost of developing a successful new drug (new chemical entity, or NCE) has been estimated at about 1.3 billion USD (not including marketing expenses). Professors Light and Lexchin reported in 2012, however, that the rate of approval for new drugs has been a relatively stable average of 15 to 25 per year for decades.
Some of these estimates also take into account the opportunity cost of investing capital many years before revenues are realized (see Time-value of money). Because of the very long time needed for discovery, development, and approval of pharmaceuticals, these costs can accumulate to nearly half the total expense. A direct consequence within the pharmaceutical industry value chain is that major pharmaceutical multinationals tend to increasingly outsource risks related to fundamental research, which somewhat reshapes the industry ecosystem with biotechnology companies playing an increasingly important role, and overall strategies being redefined accordingly. Some approved drugs, such as those based on re-formulation of an existing active ingredient (also referred to as Line-extensions) are much less expensive to develop.
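As a rough illustration of how the opportunity cost of capital can come to represent nearly half of the total expense, the Python sketch below compounds yearly R&D outlays forward to an assumed launch date at an assumed cost of capital; the spending profile and discount rate are hypothetical, not figures from the estimates cited above.

    # Compound yearly R&D outlays forward to launch at an assumed cost of
    # capital, showing how capitalized cost can far exceed out-of-pocket cost.
    def capitalized_cost(yearly_outlays_musd, annual_rate):
        n = len(yearly_outlays_musd)
        total = 0.0
        for year, outlay in enumerate(yearly_outlays_musd):
            years_to_launch = n - year   # money spent early compounds the longest
            total += outlay * (1 + annual_rate) ** years_to_launch
        return total

    # Hypothetical profile: 60 M USD per year over a 12-year development program,
    # with an assumed 11% annual cost of capital.
    outlays = [60.0] * 12
    print("Out-of-pocket: %.0f M USD" % sum(outlays))                    # 720 M USD
    print("Capitalized:   %.0f M USD" % capitalized_cost(outlays, 0.11)) # ~1513 M USD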
In 1971, Akira Endo, a Japanese biochemist working for the pharmaceutical company Sankyo, identified mevastatin (ML-236B), a molecule produced by the fungus Penicillium citrinum, as an inhibitor of HMG-CoA reductase, a critical enzyme used by the body to produce cholesterol. Animal trials showed very good inhibitory effect, as did clinical trials; however, a long-term study in dogs found toxic effects at higher doses, and as a result mevastatin was believed to be too toxic for human use. Mevastatin was never marketed because of its adverse effects of tumors, muscle deterioration, and sometimes death in laboratory dogs.
Every major company selling the antipsychotics — Bristol-Myers Squibb, Eli Lilly, Pfizer, AstraZeneca and Johnson & Johnson — has either settled recent government cases, under the False Claims Act, for hundreds of millions of dollars or is currently under investigation for possible health care fraud. Following charges of illegal marketing, two of the settlements set records last year for the largest criminal fines ever imposed on corporations. One involved Eli Lilly's antipsychotic Zyprexa, and the other involved Bextra. In the Bextra case, the government also charged Pfizer with illegally marketing another antipsychotic, Geodon; Pfizer settled that part of the claim for $301 million, without admitting any wrongdoing.
In contrast to this viewpoint, an article and associated editorial in the New England Journal of Medicine in May 2015 emphasized the importance of pharmaceutical industry-physician interactions for the development of novel treatments, and argued that moral outrage over industry malfeasance had unjustifiably led many to overemphasize the problems created by financial conflicts of interest. The article noted that major healthcare organizations such as National Center for Advancing Translational Sciences of the National Institutes of Health, the President’s Council of Advisors on Science and Technology, the World Economic Forum, the Gates Foundation, the Wellcome Trust, and the Food and Drug Administration had encouraged greater interactions between physicians and industry in order to bring greater benefits to patients.
An investigation by ProPublica found that at least 21 doctors have been paid more than $500,000 for speeches and consulting by drug manufacturers since 2009, with half of the top earners working in psychiatry, and about $2 billion in total paid to doctors for such services. AstraZeneca, Johnson & Johnson and Eli Lilly have paid billions of dollars in federal settlements over allegations that they paid doctors to promote drugs for unapproved uses. Some prominent medical schools have since tightened rules on faculty acceptance of such payments by drug companies.
Often, large multinational corporations exhibit vertical integration, participating in a broad range of drug discovery and development, manufacturing and quality control, marketing, sales, and distribution. Smaller organizations, on the other hand, often focus on a specific aspect such as discovering drug candidates or developing formulations. Often, collaborative agreements between research organizations and large pharmaceutical companies are formed to explore the potential of new drug substances. More recently, multi-nationals are increasingly relying on contract research organizations to manage drug development.
In the UK, the Medicines and Healthcare products Regulatory Agency approves drugs for use, though the evaluation is done by the European Medicines Agency, an agency of the European Union based in London. Normally an approval in the UK and other European countries comes later than one in the USA. The National Institute for Health and Care Excellence (NICE), for England and Wales, then decides whether and how the National Health Service (NHS) will allow (in the sense of paying for) their use. The British National Formulary is the core guide for pharmacists and clinicians.
There are special rules for certain rare diseases ("orphan diseases") in several major drug regulatory territories. For example, diseases involving fewer than 200,000 patients in the United States, or larger populations in certain circumstances, are subject to the Orphan Drug Act. Because medical research and development of drugs to treat such diseases is financially disadvantageous, companies that do so are rewarded with tax reductions, fee waivers, and market exclusivity on that drug for a limited time (seven years), regardless of whether the drug is protected by patents.
Ben Goldacre has argued that regulators – such as the Medicines and Healthcare products Regulatory Agency (MHRA) in the UK, or the Food and Drug Administration (FDA) in the United States – advance the interests of the drug companies rather than the interests of the public, due to the revolving-door exchange of employees between the regulator and the companies and the friendships that develop between regulator and company employees. He argues that regulators do not require that new drugs offer an improvement over what is already available, or even that they be particularly effective.
In many non-US Western countries, a 'fourth hurdle' of cost-effectiveness analysis has developed before new technologies can be provided. This focuses on the efficiency (in terms of the cost per QALY) of the technologies in question rather than their efficacy. In England and Wales, NICE decides whether and in what circumstances drugs and technologies will be made available by the NHS, whilst similar arrangements exist with the Scottish Medicines Consortium in Scotland and the Pharmaceutical Benefits Advisory Committee in Australia. A product must pass a cost-effectiveness threshold if it is to be approved. Treatments must represent 'value for money' and a net benefit to society.
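As a rough sketch of how such an assessment works, the comparison between a new treatment and standard care is usually expressed as an incremental cost-effectiveness ratio (ICER); the figures below are hypothetical and are not drawn from any actual appraisal:

\[
\text{ICER} = \frac{C_{\text{new}} - C_{\text{standard}}}{E_{\text{new}} - E_{\text{standard}}}
= \frac{\pounds 30{,}000 - \pounds 10{,}000}{2.5\ \text{QALYs} - 1.5\ \text{QALYs}}
= \pounds 20{,}000 \text{ per QALY gained}
\]

Under this framing, a treatment whose ICER falls below the payer's threshold (a range of roughly £20,000 to £30,000 per QALY is commonly cited for NICE) would ordinarily be judged cost-effective.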
The top ten best-selling drugs of 2013 totaled $75.6 billion in sales, with the anti-inflammatory drug Humira being the best-selling drug worldwide at $10.7 billion in sales. The second and third best-selling drugs were Enbrel and Remicade, respectively. The top three best-selling drugs in the United States in 2013 were Abilify ($6.3 billion), Nexium ($6 billion) and Humira ($5.4 billion). The best-selling drug ever, Lipitor, averaged $13 billion annually and netted $141 billion total over its lifetime before Pfizer's patent expired in November 2011.
Depending on a number of considerations, a company may apply for and be granted a patent for the drug, or for the process of producing the drug, granting exclusivity rights typically for about 20 years. However, only after rigorous study and testing, which takes 10 to 15 years on average, will governmental authorities grant permission for the company to market and sell the drug. Patent protection enables the owner of the patent to recover the costs of research and development through high profit margins for the branded drug. When the patent protection for the drug expires, a generic drug is usually developed and sold by a competing company. The development and approval of generics is less expensive, allowing them to be sold at a lower price. Often the owner of the branded drug will introduce a generic version before the patent expires in order to get a head start in the generic market. Restructuring has therefore become routine, driven by the patent expiration of products launched during the industry's "golden era" in the 1990s and companies' failure to develop sufficient new blockbuster products to replace lost revenues.
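Because the patent clock typically starts well before marketing approval, the averages quoted above imply that much of the nominal term is consumed by development; the following is only a rough illustration, not specific to any particular drug:

\[
\text{effective exclusivity} \approx \underbrace{20\ \text{years}}_{\text{patent term}} - \underbrace{10\ \text{to}\ 15\ \text{years}}_{\text{development and approval}} \approx 5\ \text{to}\ 10\ \text{years}
\]

This compressed window of protected sales is part of why the expiration of patents on blockbuster products has driven the restructuring described above.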
In the United States, new pharmaceutical products must be approved by the Food and Drug Administration (FDA) as being both safe and effective. This process generally involves submission of an Investigational New Drug (IND) filing with sufficient pre-clinical data to support proceeding with human trials. Following IND approval, three phases of progressively larger human clinical trials may be conducted. Phase I generally studies toxicity using healthy volunteers. Phase II can include pharmacokinetics and dosing in patients, and Phase III is a very large study of efficacy in the intended patient population. Following the successful completion of Phase III testing, a New Drug Application is submitted to the FDA. The FDA reviews the data and, if the product is seen as having a positive benefit-risk assessment, grants approval to market the product in the US.
Advertising is common in healthcare journals as well as through more mainstream media routes. In some countries, notably the US, pharmaceutical companies are allowed to advertise directly to the general public. They generally employ salespeople (often called 'drug reps' or, an older term, 'detail men') to market directly and personally to physicians and other healthcare providers. In some countries, notably the US, pharmaceutical companies also employ lobbyists to influence politicians. Marketing of prescription drugs in the US is regulated by the federal Prescription Drug Marketing Act of 1987.
There has been increasing controversy surrounding pharmaceutical marketing and influence. There have been accusations and findings of influence on doctors and other health professionals through drug reps, including the constant provision of marketing 'gifts' and biased information to health professionals; highly prevalent advertising in journals and conferences; funding independent healthcare organizations and health promotion campaigns; lobbying physicians and politicians (more than any other industry in the US); sponsorship of medical schools or nurse training; sponsorship of continuing educational events, with influence on the curriculum; and hiring physicians as paid consultants on medical advisory boards.
The rivalries between the Arab tribes had caused unrest in the provinces outside Syria, most notably in the Second Muslim Civil War of 680–692 CE and the Berber Revolt of 740–743 CE. During the Second Civil War, leadership of the Umayyad clan shifted from the Sufyanid branch of the family to the Marwanid branch. As the constant campaigning exhausted the resources and manpower of the state, the Umayyads, weakened by the Third Muslim Civil War of 744–747 CE, were finally toppled by the Abbasid Revolution in 750 CE/132 AH. A branch of the family fled across North Africa to Al-Andalus, where they established the Caliphate of Córdoba, which lasted until 1031 before falling due to the Fitna of al-Andalus.
Ali was assassinated in 661 by a Kharijite partisan. Six months later in the same year, in the interest of peace, Hasan ibn Ali, the grandson of Muhammad and the second Imam of the Shias, highly regarded for his wisdom and as a peacemaker, made a peace treaty with Muawiyah I. In the Hasan-Muawiya treaty, Hasan ibn Ali handed over power to Muawiya on the condition that he be just to the people, keep them safe and secure, and not establish a dynasty after his death. This brought to an end the era of the Rightly Guided Caliphs for the Sunnis, and Hasan ibn Ali was also the last Imam for the Shias to be a Caliph. Following this, Muawiyah broke the conditions of the agreement and began the Umayyad dynasty, with its capital in Damascus.
At the time, Umayyad taxation and administrative practice were perceived as unjust by some Muslims. The Christian and Jewish populations still had autonomy; their judicial matters were dealt with in accordance with their own laws and by their own religious heads or their appointees, although they did pay a poll tax for policing to the central state. Muhammad had stated explicitly during his lifetime that Abrahamic religious groups (still a majority in the time of the Umayyad Caliphate) should be allowed to practice their own religion, provided that they paid the jizya tax. The welfare state for both the Muslim and the non-Muslim poor, started by Umar ibn al-Khattab, also continued. Muawiya's wife Maysum (Yazid's mother) was also a Christian. Relations between the Muslims and the Christians in the state were stable in this time. The Umayyads were engaged in frequent battles with the Christian Byzantines without being concerned with protecting themselves in Syria, which had remained largely Christian like many other parts of the empire. Prominent positions were held by Christians, some of whom belonged to families that had served in Byzantine governments. The employment of Christians was part of a broader policy of religious assimilation that was necessitated by the presence of large Christian populations in the conquered provinces, as in Syria. This policy also boosted Muawiya's popularity and solidified Syria as his power base.
The Umayyad Caliphate (Arabic: الخلافة الأموية‎, trans. Al-Khilāfat al-ʾumawiyya) was the second of the four major Islamic caliphates established after the death of Muhammad. This caliphate was centered on the Umayyad dynasty (Arabic: الأمويون‎, al-ʾUmawiyyūn, or بنو أمية, Banū ʾUmayya, "Sons of Umayya"), hailing from Mecca. The Umayyad family had first come to power under the third caliph, Uthman ibn Affan (r. 644–656), but the Umayyad regime was founded by Muawiya ibn Abi Sufyan, long-time governor of Syria, after the end of the First Muslim Civil War in 661 CE/41 AH. Syria remained the Umayyads' main power base thereafter, and Damascus was their capital. The Umayyads continued the Muslim conquests, incorporating the Caucasus, Transoxiana, Sindh, the Maghreb and the Iberian Peninsula (Al-Andalus) into the Muslim world. At its greatest extent, the Umayyad Caliphate covered 15 million km² (5.79 million square miles), making it the largest empire by area (though not by population) the world had yet seen, and the fifth-largest ever to exist.
Most historians consider Caliph Muawiyah (661–80) to have been the second ruler of the Umayyad dynasty, even though he was the first to assert the Umayyads' right to rule on a dynastic principle. It was really the caliphate of Uthman ibn Affan (644–656), a member of the Umayyad clan himself, that witnessed the revival and then the ascendancy of the Umayyad clan to the corridors of power. Uthman placed some of the trusted members of his clan in prominent and powerful positions throughout the state. Most notable was the appointment of Marwan ibn al-Hakam, Uthman's first cousin, as his top advisor, which created a stir among the Hashimite companions of Muhammad, as Marwan, along with his father al-Hakam ibn Abi al-'As, had been permanently exiled from Medina by Muhammad during his lifetime. Uthman also appointed as governor of Kufa his half-brother, Walid ibn Uqba, who was accused by the Hashimites of leading prayer while under the influence of alcohol. Uthman also consolidated Muawiyah's governorship of Syria by granting him control over a larger area, and he appointed his foster brother Abdullah ibn Saad as the governor of Egypt. However, since Uthman never named an heir, he cannot be considered the founder of a dynasty.
Following the death of Husayn, Ibn al-Zubayr, although remaining in Mecca, was associated with two opposition movements, one centered in Medina and the other around Kharijites in Basra and Arabia. Because Medina had been home to Muhammad and his family, including Husayn, word of his death and the imprisonment of his family led to a large opposition movement. In 683, Yazid dispatched an army to subdue both movements. The army suppressed the Medinese opposition at the Battle of al-Harrah. The Grand Mosque in Medina was severely damaged and widespread pillaging caused deep-seated dissent. Yazid's army continued on and laid siege to Mecca. At some point during the siege, the Kaaba was badly damaged in a fire. The destruction of the Kaaba and Grand Mosque became a major cause for censure of the Umayyads in later histories of the period.
According to tradition, the Umayyad family (also known as the Banu Abd-Shams) and Muhammad both descended from a common ancestor, Abd Manaf ibn Qusai, and they originally came from the city of Mecca. Muhammad descended from Abd Manaf via his son Hashim, while the Umayyads descended from Abd Manaf via a different son, Abd-Shams, whose son was Umayya. The two families are therefore considered to be different clans (those of Hashim and of Umayya, respectively) of the same tribe (that of the Quraish). However, Shia Muslim historians suspect that Umayya was an adopted son of Abd Shams, so he was not a blood relative of Abd Manaf ibn Qusai. Umayya was later cast out of the noble family. Sunni historians disagree with this and view the Shia claims as nothing more than outright polemics born of hostility to the Umayyad family in general. They point to the fact that the grandsons of Uthman, Zaid bin Amr bin Uthman bin Affan and Abdullah bin Amr bin Uthman, married Sukaina and Fatima, the daughters of Hussein, son of Ali, to show the closeness of Banu Hashim and Banu Umayya.
Following this battle, Ali fought a battle against Muawiyah, known as the Battle of Siffin. The battle was stopped before either side had achieved victory, and the two parties agreed to arbitrate their dispute. After the battle, Amr ibn al-As was appointed by Muawiyah as an arbitrator, and Ali appointed Abu Musa Ashaari. Seven months later, in February 658, the two arbitrators met at Adhruh, about 10 miles northwest of Maan in Jordan. Amr ibn al-As convinced Abu Musa Ashaari that both Ali and Muawiyah should step down and a new Caliph be elected. Ali and his supporters were stunned by the decision, which had lowered the Caliph to the status of the rebellious Muawiyah I. Ali was therefore outwitted by Muawiyah and Amr. Ali refused to accept the verdict and found himself technically in breach of his pledge to abide by the arbitration. This put Ali in a weak position even amongst his own supporters. The most vociferous opponents in Ali's camp were the very same people who had forced Ali into the ceasefire. They broke away from Ali's force, rallying under the slogan, "arbitration belongs to God alone." This group came to be known as the Kharijites ("those who leave"). In 659 Ali's forces and the Kharijites met in the Battle of Nahrawan. Although Ali won the battle, the constant conflict had begun to affect his standing, and in the following years some Syrians seem to have acclaimed Muawiyah as a rival caliph.
The Quran and Muhammad talked about racial equality and justice, as in The Farewell Sermon. Tribal and nationalistic differences were discouraged. But after Muhammad's passing, the old tribal differences between the Arabs started to resurface. Following the Roman–Persian Wars and the Byzantine–Sassanid Wars, deep-rooted differences between Iraq, formerly under the Persian Sassanid Empire, and Syria, formerly under the Byzantine Empire, also existed. Each wanted the capital of the newly established Islamic State to be in its area. Previously, the second caliph Umar had been very firm with the governors, and his spies kept an eye on them. If he felt that a governor or a commander was becoming attracted to wealth, he had him removed from his position.
While the Umayyads and the Hashimites may have had bitterness between the two clans before Muhammad, the rivalry turned into a severe case of tribal animosity after the Battle of Badr. The battle saw three top leaders of the Umayyad clan (Utba ibn Rabi'ah, Walid ibn Utbah and Shaybah) killed by Hashimites (Ali, Hamza ibn 'Abd al-Muttalib and Ubaydah ibn al-Harith) in a three-on-three melee. This fueled the opposition of Abu Sufyan ibn Harb, the grandson of Umayya, to Muhammad and to Islam. Abu Sufyan sought to exterminate the adherents of the new religion by waging another battle against the Muslims based in Medina only a year after the Battle of Badr, in order to avenge the defeat at Badr. The Battle of Uhud is generally believed by scholars to be the first defeat for the Muslims, as they had incurred greater losses than the Meccans. After the battle, Abu Sufyan's wife Hind, who was also the daughter of Utba ibn Rabi'ah, is reported to have cut open the corpse of Hamza, taking out his liver, which she then attempted to eat. Within five years after his defeat in the Battle of Uhud, however, Muhammad took control of Mecca and announced a general amnesty for all. Abu Sufyan and his wife Hind embraced Islam on the eve of the conquest of Mecca, as did their son (the future caliph Muawiyah I).
Umar is honored for his attempt to resolve the fiscal problems attendant upon conversion to Islam. During the Umayyad period, the majority of people living within the caliphate were not Muslim, but Christian, Jewish, Zoroastrian, or members of other small groups. These religious communities were not forced to convert to Islam, but were subject to a tax (jizyah) which was not imposed upon Muslims. This situation may actually have made widespread conversion to Islam undesirable from the point of view of state revenue, and there are reports that provincial governors actively discouraged such conversions. It is not clear how Umar attempted to resolve this situation, but the sources portray him as having insisted on like treatment of Arab and non-Arab (mawali) Muslims, and on the removal of obstacles to the conversion of non-Arabs to Islam.
After the assassination of Uthman in 656, Ali, a member of the Quraysh tribe and the cousin and son-in-law of Muhammad, was elected as the caliph. He soon met with resistance from several factions, owing to his relative political inexperience. Ali moved his capital from Medina to Kufa. The resulting conflict, which lasted from 656 until 661, is known as the First Fitna ("civil war"). Muawiyah I, the governor of Syria and a relative of Uthman ibn al-Affan and of Marwan I, wanted the culprits arrested. Aisha, the wife of Muhammad, together with Talhah and Al-Zubayr, two of the companions of Muhammad, went to Basra to demand that Ali arrest the culprits who had murdered Uthman. Marwan I and others who wanted conflict manipulated all sides into fighting. The two sides clashed at the Battle of the Camel in 656, where Ali won a decisive victory.
Early Muslim armies stayed in encampments away from cities because Umar feared that they might become attracted to wealth and luxury, turn away from the worship of God, and start accumulating wealth and establishing dynasties. When Uthman ibn al-Affan became very old, Marwan I, a relative of Muawiyah I, slipped into the vacuum, became his secretary, slowly assumed more control, and relaxed some of these restrictions. Marwan I had previously been excluded from positions of responsibility. In 656, Muhammad ibn Abi Bakr, the son of Abu Bakr, the adopted son of Ali ibn Abi Talib, and the great-grandfather of Ja'far al-Sadiq, showed some Egyptians the house of Uthman ibn al-Affan. The Egyptians later killed Uthman ibn al-Affan.
In 680 Ibn al-Zubayr fled Medina for Mecca. Hearing about Husayn's opposition to Yazid I, the people of Kufa sent word to Husayn asking him to take over with their support. Al-Husayn sent his cousin Muslim bin Agail to verify whether they would rally behind him. When the news reached Yazid I, he sent Ubayd-Allah bin Ziyad, ruler of Basrah, with the instruction to prevent the people of Kufa from rallying behind Al-Husayn. Ubayd-Allah bin Ziyad managed to disperse the crowd that had gathered around Muslim bin Agail and captured him. Realizing that Ubayd-Allah bin Ziyad had been instructed to prevent Husayn from establishing support in Kufa, Muslim bin Agail requested that a message be sent to Husayn warning him against travelling to Kufa. The request was denied, and Ubayd-Allah bin Ziyad killed Muslim bin Agail. While Ibn al-Zubayr would stay in Mecca until his death, Husayn decided to travel on to Kufa with his family, unaware of the lack of support there. Husayn and his family were intercepted by Yazid I's forces led by Amru bin Saad, Shamar bin Thi Al-Joshan, and Hussain bin Tamim, who fought Al-Husayn and his male family members until they were killed. There were 200 people in Husayn's caravan, many of whom were women, including his sisters, wives, daughters and their children. The women and children from Husayn's camp were taken as prisoners of war and led back to Damascus to be presented to Yazid I. They remained imprisoned until public opinion turned against Yazid as word of Husayn's death and his family's capture spread. They were then granted passage back to Medina. The sole adult male survivor from the caravan was Ali ibn Husayn, who was too ill with fever to fight when the caravan was attacked.
In the year 712, Muhammad bin Qasim, an Umayyad general, sailed from the Persian Gulf into Sindh, in present-day Pakistan, and conquered both the Sindh and Punjab regions along the Indus river. The conquest of Sindh and Punjab, although costly, was a major gain for the Umayyad Caliphate. However, further gains were halted by Hindu kingdoms in India at the Battle of Rajasthan. The Arabs tried to invade India but were defeated by the north Indian king Nagabhata of the Pratihara dynasty and by the south Indian emperor Vikramaditya II of the Chalukya dynasty in the early 8th century. After this, the Arab chroniclers admit that the Caliph Mahdi "gave up the project of conquering any part of India."
The second major event of the early reign of Abd al-Malik was the construction of the Dome of the Rock in Jerusalem. Although the chronology remains somewhat uncertain, the building seems to have been completed in 692, which means that it was under construction during the conflict with Ibn al-Zubayr. This has led some historians, both medieval and modern, to suggest that the Dome of the Rock was built as a destination for pilgrimage to rival the Kaaba, which was under the control of Ibn al-Zubayr.
Muawiyah also encouraged peaceful coexistence with the Christian communities of Syria, and his reign was credited with bringing "peace and prosperity for Christians and Arabs alike"; one of his closest advisers was Sarjun, the father of John of Damascus. At the same time, he waged unceasing war against the Byzantine Roman Empire. During his reign, Rhodes and Crete were occupied, and several assaults were launched against Constantinople. After their failure, and faced with a large-scale Christian uprising in the form of the Mardaites, Muawiyah concluded a peace with Byzantium. Muawiyah also oversaw military expansion in North Africa (the foundation of Kairouan) and in Central Asia (the conquest of Kabul, Bukhara, and Samarkand).
Yazid died while the siege was still in progress, and the Umayyad army returned to Damascus, leaving Ibn al-Zubayr in control of Mecca. Yazid's son Muawiya II (683–84) initially succeeded him but seems to have never been recognized as caliph outside of Syria. Two factions developed within Syria: the Confederation of Qays, who supported Ibn al-Zubayr, and the Quda'a, who supported Marwan, a descendant of Umayya via Wa'il ibn Umayyah. The partisans of Marwan triumphed at a battle at Marj Rahit, near Damascus, in 684, and Marwan became caliph shortly thereafter.
Marwan was succeeded by his son, Abd al-Malik (685–705), who reconsolidated Umayyad control of the caliphate. The early reign of Abd al-Malik was marked by the revolt of Al-Mukhtar, which was based in Kufa. Al-Mukhtar hoped to elevate Muhammad ibn al-Hanafiyyah, another son of Ali, to the caliphate, although Ibn al-Hanafiyyah himself may have had no connection to the revolt. The troops of al-Mukhtar engaged in battles both with the Umayyads in 686, defeating them at the river Khazir near Mosul, and with Ibn al-Zubayr in 687, at which time the revolt of al-Mukhtar was crushed. In 691, Umayyad troops reconquered Iraq, and in 692 the same army captured Mecca. Ibn al-Zubayr was killed in the attack.
Geographically, the empire was divided into several provinces, the borders of which changed numerous times during the Umayyad reign. Each province had a governor appointed by the khalifah. The governor was in charge of the religious officials, army leaders, police, and civil administrators in his province. Local expenses were paid for by taxes coming from that province, with the remainder each year being sent to the central government in Damascus. As the central power of the Umayyad rulers waned in the later years of the dynasty, some governors neglected to send the extra tax revenue to Damascus and created great personal fortunes.
Hisham suffered still worse defeats in the east, where his armies attempted to subdue both Tokharistan, with its center at Balkh, and Transoxiana, with its center at Samarkand. Both areas had already been partially conquered, but remained difficult to govern. Once again, a particular difficulty concerned the question of the conversion of non-Arabs, especially the Sogdians of Transoxiana. Following the Umayyad defeat in the "Day of Thirst" in 724, Ashras ibn 'Abd Allah al-Sulami, governor of Khurasan, promised tax relief to those Sogdians who converted to Islam, but went back on his offer when it proved too popular and threatened to reduce tax revenues. Discontent among the Khurasani Arabs rose sharply after the losses suffered in the Battle of the Defile in 731, and in 734, al-Harith ibn Surayj led a revolt that received broad backing from Arabs and natives alike, capturing Balkh but failing to take Merv. After this defeat, al-Harith's movement seems to have been dissolved, but the problem of the rights of non-Arab Muslims would continue to plague the Umayyads.
The Hashimiyya movement (a sub-sect of the Kaysanites Shia), led by the Abbasid family, overthrew the Umayyad caliphate. The Abbasids were members of the Hashim clan, rivals of the Umayyads, but the word "Hashimiyya" seems to refer specifically to Abu Hashim, a grandson of Ali and son of Muhammad ibn al-Hanafiyya. According to certain traditions, Abu Hashim died in 717 in Humeima in the house of Muhammad ibn Ali, the head of the Abbasid family, and before dying named Muhammad ibn Ali as his successor. This tradition allowed the Abbasids to rally the supporters of the failed revolt of Mukhtar, who had represented themselves as the supporters of Muhammad ibn al-Hanafiyya.
From the caliphate's north-western African bases, a series of raids on coastal areas of the Visigothic Kingdom paved the way to the permanent occupation of most of Iberia by the Umayyads (starting in 711), and on into south-eastern Gaul (last stronghold at Narbonne in 759). Hisham's reign witnessed the end of expansion in the west, following the defeat of the Arab army by the Franks at the Battle of Tours in 732. In 739 a major Berber Revolt broke out in North Africa, which was subdued only with difficulty, but it was followed by the collapse of Umayyad authority in al-Andalus. In India, the Arab armies were defeated by the south Indian Chalukya dynasty and by the north Indian Pratihara dynasty in the 8th century, and the Arabs were driven out of India. In the Caucasus, the confrontation with the Khazars peaked under Hisham: the Arabs established Derbent as a major military base and launched several invasions of the northern Caucasus, but failed to subdue the nomadic Khazars. The conflict was arduous and bloody, and the Arab army even suffered a major defeat at the Battle of Marj Ardabil in 730. Marwan ibn Muhammad, the future Marwan II, finally ended the war in 737 with a massive invasion that is reported to have reached as far as the Volga, but the Khazars remained unsubdued.
The final son of Abd al-Malik to become caliph was Hisham (724–43), whose long and eventful reign was above all marked by the curtailment of military expansion. Hisham established his court at Resafa in northern Syria, which was closer to the Byzantine border than Damascus, and resumed hostilities against the Byzantines, which had lapsed following the failure of the last siege of Constantinople. The new campaigns resulted in a number of successful raids into Anatolia, but also in a major defeat (the Battle of Akroinon), and did not lead to any significant territorial expansion.